"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

January 24, 2019

Day #198 - Semantic deep learning: segmentation and regression - Jorge Cardoso - DeepA2Z

Great Talk and Good Summary

Key Lessons
  • Map Image from one space to another space
  • Vector mapped to output space
  • Regression - Continuous Value
  • All of these can be framed as regression problems
Deep Learning vs ML
  • Most work in ML is creating features
  • Mapping function from feature space to classification
DL
  • DL learns features from Data
  • Key DL Components
  • Architecture - Loss Function - Optimizer (part of learning process)



Building Blocks
  • Activation Functions (ReLU, ELU, SReLU, PReLU)
  • Convolutional Layers (Dimensionality vs complexity)
  • Aggregation Layers (Pooling / Convolutions)
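A minimal NumPy sketch of two of the activations listed above (ReLU and ELU; the alpha default is the common convention, the code itself is illustrative, not the talk's):

```python
import numpy as np

def relu(x):
    # ReLU: zero out negative values
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # ELU: smooth exponential curve for negative inputs
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

x = np.array([-2.0, 0.0, 3.0])
print(relu(x))  # [0. 0. 3.]
print(elu(x))
```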



  • Convolution Example
  • Input X Kernel = Output
  • Kernels are matrices
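The Input × Kernel = Output idea can be sketched as a plain 2D valid convolution (cross-correlation) in NumPy — `conv2d_valid` is a hypothetical helper, not code from the talk:

```python
import numpy as np

def conv2d_valid(image, kernel):
    # Slide the kernel over the image; each output pixel is the
    # elementwise product of the kernel with the patch under it, summed.
    ih, iw = image.shape
    kh, kw = kernel.shape
    oh, ow = ih - kh + 1, iw - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16.0).reshape(4, 4)
kernel = np.ones((3, 3)) / 9.0     # 3x3 mean filter
print(conv2d_valid(image, kernel))  # 2x2 output of local means
```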




  • Backprop applies gradient descent
  • Weighted matrix multiplication is convolution
  • Standard Convolution 3x3 kernel
  • Dilated Convolution - Spacing between elements of kernel (Bring Context and Relationships)
  • Strided Convolution (Apply convolution every X number of Pixels)
  • Unpooling - Upsampling Image
  • Residual Connections (add the input back to the block's output)
  • Dropout, Batch Norm (forms of Regularization)
  • Classification (K Classes)
  • Scale -> Depth 
  • Conv + ReLU + Pooling
  • Extract Features that scale
  • Segmentation (Input -> Output Same cardinality)
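The strided and dilated convolution bullets above follow the standard output-size arithmetic: the effective kernel footprint grows with dilation, and stride divides the output. A pure-Python sketch of that formula (as used by common frameworks, for a single spatial axis):

```python
def conv_out_size(n, k, stride=1, padding=0, dilation=1):
    # Effective kernel footprint grows with dilation: d*(k-1)+1
    effective_k = dilation * (k - 1) + 1
    return (n + 2 * padding - effective_k) // stride + 1

# Standard 3x3 convolution on a 32-pixel axis, no padding:
print(conv_out_size(32, 3))             # 30
# Strided convolution: apply it every 2 pixels
print(conv_out_size(32, 3, stride=2))   # 15
# Dilated convolution: spacing of 2 between kernel elements
print(conv_out_size(32, 3, dilation=2)) # 28
```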



  • U-Net Based Approach 
  • V-Net (Residual Model)
  • DeepMedic (Downsample / Crop) + Merge Later
  • HighResNet (Right Features / Right Scale / Learn Relationships)
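The residual connections used by V-Net simply add the block's input back to its output, so the block only has to learn a correction. A toy NumPy sketch, where the lambda stands in for a real conv + activation path:

```python
import numpy as np

def residual_block(x, f):
    # Output = f(x) + x; f must preserve the shape of x
    return f(x) + x

x = np.array([1.0, -2.0, 3.0])
f = lambda v: 0.1 * v          # stand-in for conv + activation
print(residual_block(x, f))    # [ 1.1 -2.2  3.3]
```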



  • Segmentation Task
  • Tips and Tricks
  • Data Augmentation to avoid overfitting
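Flips are the cheapest label-preserving augmentation; a NumPy sketch (real segmentation pipelines also use rotations, crops, elastic deformations, and intensity shifts):

```python
import numpy as np

def augment_flips(image):
    # Return the original plus horizontal/vertical flips:
    # cheap label-preserving variants that help avoid overfitting.
    return [image, np.fliplr(image), np.flipud(image)]

img = np.arange(9).reshape(3, 3)
batch = augment_flips(img)
print(len(batch))  # 3 augmented views of the same image
```

For segmentation, the same flip must be applied to the label mask so input and target stay aligned.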





  • Hyper Parameter Tuning (Grid Search, Random Search, Bayesian Optimization)
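Of the three strategies, random search is the simplest to sketch. A pure-Python version with a toy objective (the search space and objective here are made up for illustration):

```python
import random

def random_search(objective, space, n_trials=20, seed=0):
    # Sample a value for each hyperparameter uniformly from its list,
    # keep the configuration with the best (lowest) score.
    rng = random.Random(seed)
    best_cfg, best_score = None, float("inf")
    for _ in range(n_trials):
        cfg = {k: rng.choice(v) for k, v in space.items()}
        score = objective(cfg)
        if score < best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

space = {"lr": [1e-1, 1e-2, 1e-3], "batch": [16, 32, 64]}
# Toy objective standing in for validation loss:
toy = lambda c: abs(c["lr"] - 1e-2) + abs(c["batch"] - 32) / 100
print(random_search(toy, space))
```

Grid search would instead enumerate all 9 combinations; Bayesian optimization would use past scores to pick the next configuration.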


  • Abstraction Layers for missing inputs
  • Uncertainty of networks
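One common way to estimate network uncertainty is Monte Carlo dropout: keep dropout active at prediction time and look at the spread of repeated predictions. A NumPy sketch with a hypothetical one-layer linear model (the weights and input are made up):

```python
import numpy as np

def mc_dropout_predict(x, w, n_samples=100, p_drop=0.5, seed=0):
    # Run the same input through a dropout-perturbed linear model
    # many times; the std of the outputs estimates uncertainty.
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_samples):
        mask = rng.random(w.shape) >= p_drop    # drop weights with prob p_drop
        preds.append(x @ (w * mask) / (1.0 - p_drop))
    preds = np.array(preds)
    return preds.mean(), preds.std()

x = np.array([1.0, 2.0, 3.0])
w = np.array([0.5, -0.25, 0.1])
mean, std = mc_dropout_predict(x, w)
print(mean, std)  # prediction and its uncertainty estimate
```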



Happy Mastering DL!!!
