"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

January 13, 2019

Day #187 - Deep Learning Basics: Introduction and Overview - MIT

Well explained; a great introduction covering the fundamentals and state of the art of Deep Learning

Key Lessons
  • DL extracts patterns from data with little human effort (an automated approach)
  • Evolution of Deep Learning through years of new innovations
  • DL Tools 


  • Applying the methodology requires organizing data, labeling it, and selecting aspects of the data that reveal answers to our questions
  • CPU, GPU, TPU - Hardware for efficient execution 
  • Higher abstractions help people solve problems in less time
  • DL - higher and higher levels of abstraction for understanding patterns
  • The history of science is one of finding simpler representations of ideas
  • DL removes the human from the picture, automating feature learning
  • Many problems are not yet formulated as data-driven learning
  • RNN to predict the intent of players in a scene
  • Visual understanding is hard


  • Regression is Continuous, Classification is Categorical
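A minimal sketch of the distinction (toy numbers of my own, not from the lecture): the same linear model can produce a continuous output for regression, or be mapped through softmax + argmax to a categorical label for classification.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])          # one input example
W = np.array([[0.5, -0.2, 0.1],
              [0.3, 0.8, -0.5]])
b = np.array([0.1, -0.1])

logits = W @ x + b                      # raw continuous outputs

# Regression: use the continuous values directly
regression_output = logits

# Classification: map to a categorical label via softmax + argmax
probs = np.exp(logits) / np.exp(logits).sum()
predicted_class = int(np.argmax(probs))
```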


  • Sequence of inputs, Sequence of outputs
  • Forward Pass

  • Many stacked layers build up the necessary levels of abstraction
  • Many of the slides are summaries and self-explanatory; well explained

  • Activation functions, Loss functions
  • Backpropagation to adjust the weights; the error flows backward through the network

  • After the forward pass, compute the error
  • Backward Pass - compute gradients
  • Learning rate - how fast the network learns
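The bullets above can be sketched as one training step on a single linear neuron with squared error (a toy example of mine, assuming gradient descent):

```python
import numpy as np

x = np.array([1.0, 2.0])   # input
y = 1.0                    # target
w = np.array([0.1, -0.3])  # weights
lr = 0.1                   # learning rate: how fast the network learns

# Forward pass
y_hat = w @ x

# After the forward pass, compute the error
loss = 0.5 * (y_hat - y) ** 2

# Backward pass: gradient of the loss w.r.t. the weights
grad = (y_hat - y) * x

# Error flows backward; update is scaled by the learning rate
w_new = w - lr * grad
```

After this single step the prediction moves toward the target, so the loss shrinks.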




  • Generalize without memorizing
  • Normalize input data before feeding it to the network
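A common way to do this (a sketch with made-up data) is to standardize each feature to zero mean and unit variance, so features on very different scales contribute comparably:

```python
import numpy as np

# Toy dataset: two features on very different scales
X = np.array([[1.0, 200.0],
              [2.0, 300.0],
              [3.0, 400.0]])

mean = X.mean(axis=0)
std = X.std(axis=0)
X_norm = (X - mean) / std   # each column now has mean 0, std 1
```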




  • CNN - Enables image classification
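The core operation inside a CNN layer is a small kernel sliding over the image. A naive sketch (my own toy example, not from the lecture) with a hand-made kernel that responds to a vertical edge:

```python
import numpy as np

def conv2d(image, kernel):
    """Naive valid-mode 2D cross-correlation: the core op of a CNN layer."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Dark-to-bright vertical edge between columns 1 and 2
image = np.array([[0., 0., 1., 1.],
                  [0., 0., 1., 1.],
                  [0., 0., 1., 1.],
                  [0., 0., 1., 1.]])
edge_kernel = np.array([[1., -1.],
                        [1., -1.]])

features = conv2d(image, edge_kernel)   # strong response only at the edge
```

In a real CNN the kernel weights are not hand-crafted like this; they are learned by backpropagation, which is exactly the "automate the feature learning" point above.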



  • Find Candidates, bounding boxes
  • Region proposals - Loop over region proposals




  • Single shot - removes the for loop (less accurate for objects that are far away or small)
  • Trade-off between performance and accuracy




  • Semantic segmentation (pixel level boundaries of object)



  • Transfer Learning - commonly applied: take a pretrained model, chop off the fully connected layer, and retrain the network on a new dataset
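A hedged sketch of the idea using NumPy stand-ins (the "pretrained" weights here are random placeholders, and the dataset is synthetic): the pretrained layers stay frozen as a feature extractor, the old fully connected layer is discarded, and only a new head is trained on the new data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for pretrained, frozen feature-extraction layers
W_pretrained = rng.normal(size=(8, 4))

def extract_features(X):
    # Frozen forward pass: W_pretrained is never updated below
    return np.maximum(0.0, X @ W_pretrained.T)   # ReLU features

# Synthetic "new dataset"
X_new = rng.normal(size=(32, 4))
y_new = (X_new[:, 0] > 0).astype(float)

# New head replacing the chopped-off fully connected layer
w_head = np.zeros(8)

feats = extract_features(X_new)
for _ in range(200):                              # retrain only the head
    p = 1.0 / (1.0 + np.exp(-(feats @ w_head)))   # sigmoid output
    grad = feats.T @ (p - y_new) / len(y_new)     # logistic-loss gradient
    w_head -= 0.1 * grad
```

Only `w_head` is updated; the pretrained weights are reused as-is, which is why this works well when the new dataset is small.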




  • Word2vec forms vector representations of words; dissimilar words end up far apart in the Euclidean sense
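To make "far in the Euclidean sense" concrete, here is a toy sketch with hand-made vectors standing in for trained word2vec embeddings (the numbers are invented for illustration):

```python
import numpy as np

# Hypothetical embeddings; real word2vec vectors are learned from a corpus
embeddings = {
    "king":  np.array([0.90, 0.80, 0.10]),
    "queen": np.array([0.85, 0.82, 0.15]),
    "apple": np.array([0.10, 0.00, 0.90]),
}

def distance(a, b):
    """Euclidean distance between two word vectors."""
    return np.linalg.norm(embeddings[a] - embeddings[b])

d_similar = distance("king", "queen")     # semantically related: close
d_dissimilar = distance("king", "apple")  # unrelated: far
```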


  • The challenge is learning long-term context
  • Bidirectional RNN
  • Generalized model architecture for text, audio, video, chat, etc.





    Happy Mastering DL!!!
