"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

December 04, 2018

Day #158 - Class Notes - Deep Learning Limitations and New Frontiers

Key Lessons
  • Data as input; feature extraction performed on the data; patterns learnt from the features to drive decisions (prediction / action)
  • Generate new synthetic data (GAN)
  • Universal Approximation Theorem - A feedforward network with a single hidden layer is sufficient to approximate, to arbitrary precision, any continuous function (1989) - Applies to any problem that can be reduced to a set of inputs and outputs (see the sketch after this list)
  • Neural networks are difficult to train because their loss landscape is non-convex
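
A minimal sketch of the theorem in action, assuming NumPy. The setup is illustrative, not from the lecture: hidden weights are drawn at random and only the output layer is fit, in closed form via least squares, yet a single tanh hidden layer already approximates a smooth target, and the error shrinks as the layer widens.

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200)[:, None]
y = np.sin(x)                          # target continuous function

H = 50                                 # hidden units; raise H to tighten the fit
W = rng.normal(size=(1, H))            # random input-to-hidden weights
b = rng.normal(size=H)
phi = np.tanh(x @ W + b)               # single hidden layer of tanh activations

w_out, *_ = np.linalg.lstsq(phi, y, rcond=None)   # fit the output layer only
y_hat = phi @ w_out

print("max |error|:", np.abs(y_hat - y).max())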
Limitations
  • Understanding deep learning requires rethinking generalization (Zhang et al., 2017)
  • A large disparity between training and test performance means the network does not generalize, it only memorizes
  • Modern deep networks can perfectly fit random data (see the sketch after this list)
  • Neural networks are excellent function approximators
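
A minimal sketch, assuming PyTorch, of the random-data point above; the MLP size, optimizer, and step count are illustrative. There is no signal in the labels, yet the over-parameterized network still drives training accuracy toward 1.0, which is why generalization cannot be explained by fitting ability alone.

import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(256, 20)                 # random inputs
y = torch.randint(0, 2, (256,))          # random labels: nothing real to learn

model = nn.Sequential(nn.Linear(20, 256), nn.ReLU(), nn.Linear(256, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(2000):                 # memorize the noise
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

acc = (model(X).argmax(1) == y).float().mean()
print(f"train accuracy on random labels: {acc.item():.2f}")   # climbs toward 1.00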
Breaking Neural Networks
  • Modify pixels at specific locations to decrease accuracy as much as possible (adversarial perturbations; see the sketch below)
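
A minimal sketch, assuming PyTorch, of one standard way to do this, the fast gradient sign method (FGSM): nudge every pixel a small step in the direction that increases the loss. The toy classifier, eps value, and dummy input below are illustrative assumptions.

import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, eps=0.05):
    # Compute the loss gradient with respect to the input pixels.
    x = x.clone().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    # Step every pixel slightly in the direction that increases the loss.
    x_adv = x + eps * x.grad.sign()
    return x_adv.clamp(0, 1).detach()

# Hypothetical usage with a toy classifier and a random "image".
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(28 * 28, 10))
x = torch.rand(1, 1, 28, 28)
y = torch.tensor([3])
x_adv = fgsm_attack(model, x, y)
print("max pixel change:", (x_adv - x).abs().max().item())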
Neural Network Limitations
  • Data Hungry
  • Computationally intensive
  • Fooled by Adversarial examples
  • Poor at representing uncertainty
  • Uninterpretable black boxes
Bayesian Deep Learning
  • The notion of probability here differs from uncertainty: a softmax output is a distribution over classes, not a measure of the model's confidence
  • Neural networks are trained to produce probabilities at the output; they are not trained to produce uncertainty estimates
  • Rewrite the posterior using Bayes' rule; in practice it is intractable to compute directly
  • Approximate through sampling
  • Element-wise dropout for uncertainty
  • Dropout as a way to produce reliable uncertainty estimates for neural networks (see the sketch after this list)
  • The variance over the sampled outputs gives the uncertainty measure
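
A minimal sketch, assuming PyTorch, of the Monte Carlo dropout idea above. The exact posterior p(W | X, Y) ∝ p(Y | X, W) p(W) is intractable, so dropout is left active at test time and the variance over several stochastic forward passes is read as the uncertainty. The layer sizes, dropout rate, and sample count are illustrative.

import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(),
                      nn.Dropout(p=0.5),          # stays active at test time
                      nn.Linear(64, 1))

def mc_dropout_predict(model, x, n_samples=100):
    model.train()                        # train mode keeps dropout stochastic
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    return preds.mean(0), preds.var(0)   # mean prediction, uncertainty

x = torch.randn(1, 10)
mean, var = mc_dropout_predict(model, x)
print(f"prediction {mean.item():.3f}, variance {var.item():.3f}")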
Learning to Learn
  • An AutoML framework that learns to learn
  • Controller (RNN) - samples candidate NN architectures
  • Training data → sampled network → predicted labels; the resulting accuracy is fed back to the controller
  • Design AI algorithms that can build new models capable of solving the task (see the sketch after this list)
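
A minimal sketch, assuming PyTorch, of the learning-to-learn loop on a toy regression task. Plain random search stands in for the RNN controller here, an illustrative simplification: sample a candidate architecture, train it briefly, score it, keep the best.

import torch
import torch.nn as nn

torch.manual_seed(0)
X, y = torch.randn(200, 5), torch.randn(200, 1)   # toy task

def sample_architecture():
    # Controller step (random here): choose depth and width of a candidate.
    depth = torch.randint(1, 3, (1,)).item()
    width = torch.randint(8, 64, (1,)).item()
    layers, d_in = [], 5
    for _ in range(depth):
        layers += [nn.Linear(d_in, width), nn.ReLU()]
        d_in = width
    layers.append(nn.Linear(d_in, 1))
    return nn.Sequential(*layers)

def evaluate(model, steps=200):
    # Briefly train the sampled network; its final loss is the score.
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(X), y)
        loss.backward()
        opt.step()
    return loss.item()

best = min(evaluate(sample_architecture()) for _ in range(5))
print("best candidate loss:", best)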


Happy Mastering DL!!!
