"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

January 21, 2017

Day #52 - Deep Learning Class #1 Notes

AI - Reverse engineering the brain (curated knowledge)
ML - Machine Learning is a subset of AI: teaching machines to learn from data

Deep Learning - Rebirth of Neural Networks
  • Multiple layers of neurons
  • Directed graph structure
  • The first layer is the input layer
  • The last layer is the output layer
  • The intermediate layers are hidden layers
  • Deep learning is inspired by the human brain
  • In deep learning, features are learnt rather than hand-engineered
  • Gradient Descent - the process of making weight updates in a neural network (see the sketch after this list)
  • Neural networks are a discriminative approach
  • Neurons in a neural network end up becoming feature selectors
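A minimal sketch of gradient descent on a toy one-weight problem (the loss function, starting weight and learning rate are illustrative assumptions, not from the class): repeatedly step against the gradient of the loss until the weight settles at the minimum.

#Gradient descent sketch: minimise loss(w) = (w - 3)^2 (assumed toy example)
def loss(w):
    return (w - 3.0) ** 2

def gradient(w):
    return 2.0 * (w - 3.0)

w = 0.0               # starting weight (chosen arbitrarily)
learning_rate = 0.1   # step size (assumed)
for step in range(50):
    w = w - learning_rate * gradient(w)   # the same update rule used to train NN weights
print(w)              # approaches 3.0, the minimiser of the loss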
Discriminative Classifiers - Logistic Regression, SVM (uses kernels for non-linear classification), Decision Trees
Generative Model - Naive Bayes
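A small sketch contrasting a discriminative classifier with a generative one, assuming scikit-learn is installed; the toy data below is made up for illustration.

#Discriminative (Logistic Regression) vs generative (Naive Bayes), assuming scikit-learn
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

X = [[0, 0], [0, 1], [1, 0], [1, 1], [2, 2], [2, 3], [3, 2], [3, 3]]
y = [0, 0, 0, 0, 1, 1, 1, 1]
#Discriminative: models P(y|x) directly via a decision boundary
print(LogisticRegression().fit(X, y).predict([[0.5, 0.5], [2.5, 2.5]]))
#Generative: models P(x|y) and P(y), then applies Bayes' rule
print(GaussianNB().fit(X, y).predict([[0.5, 0.5], [2.5, 2.5]]))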

Types of Neural Networks
  • Autoencoders - used for dimensionality reduction (a small sketch follows this list)
  • CNN - Convolutional Neural Network
  • RNN - Recurrent Neural Network
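A minimal autoencoder sketch for dimensionality reduction, assuming scikit-learn; an MLPRegressor trained to reconstruct its own input through a 2-unit bottleneck acts as a simple autoencoder (the data is random and purely illustrative).

#Autoencoder sketch: reconstruct the input through a 2-unit bottleneck (assumed example)
import numpy as np
from sklearn.neural_network import MLPRegressor

X = np.random.rand(200, 5)                  # 5-dimensional toy data
ae = MLPRegressor(hidden_layer_sizes=(2,),  # 2-dimensional bottleneck
                  activation='tanh', max_iter=5000)
ae.fit(X, X)                                # target = input, so the net learns to compress
#Hidden-layer activations give the compressed 2-D representation
codes = np.tanh(np.dot(X, ae.coefs_[0]) + ae.intercepts_[0])
print(codes.shape)                          # (200, 2)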
Interesting Deep Learning Demo Sites Discussed
imsitu.org
cloudcv.org

Happy Learning!!!

January 18, 2017

Neural Networks - Learning Resources

Happy Learning!!!

Interesting Data Science Projects

Happy Learning!!!

January 13, 2017

Day #51 - Neural Networks

Happy New Year 2017. This post is on Neural Networks.

Neural Networks
  • ANN - inspired by biological neural networks; models a network of neurons
  • Key layers - Input Layer, Hidden Layer, Output Layer
  • Neural networks that can learn - perceptrons, backpropagation networks, Boltzmann machines, recurrent networks
  • The XOR example below uses backpropagation

Implementation overview
  • Initialize the edge weights at random (we do not know the exact weights, so we choose them randomly; training finds suitable values)
  • Calculate the error - we have training inputs and expected outputs (supervised learning); the calculated output differs from the expected output, giving an error term
  • Calculate the changes to the edge weights and update them accordingly (the backpropagation step)
  • The algorithm terminates when the error is small enough (a plain NumPy sketch of these steps follows the PyBrain example below)
#Runs in Python 2.7.12
from pybrain.tools.shortcuts import buildNetwork
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer
#XOR Operation
# 0,0 - 0
# 0,1 - 1
# 1,0 - 1
# 1,1 - 0
#2 neurons in the input layer
#3 neurons in the hidden layer
#1 neuron in the output layer
neuralNetwork = buildNetwork(2,3,1)
#2 Dimension input, 1 Dimension Output
dataSet = SupervisedDataSet(2,1)
#Add Training Data
#x,y - 2D
#Output - 1D
dataSet.addSample((0,0),(0,))
dataSet.addSample((1,0),(1,))
dataSet.addSample((0,1),(1,))
dataSet.addSample((1,1),(0,))
#Trainer Backpropagation
trainer = BackpropTrainer(neuralNetwork, dataSet)
for i in range(1, 10000):
    trainer.train()
    if i % 4000 == 0:
        print('Iteration - ', i)
        print(neuralNetwork.activate([0, 0]))
        print(neuralNetwork.activate([1, 0]))
        print(neuralNetwork.activate([0, 1]))
        print(neuralNetwork.activate([1, 1]))
#Output
#('Iteration - ', 4000)
#[ 0.07299632]
#[ 0.92082515]
#[ 0.95759942]
#[ 0.04306811]
#('Iteration - ', 8000)
#[ 0.00045084]
#[ 0.99952651]
#[ 0.99974768]
#[ 0.00023939]
#Second network with 6 hidden neurons instead of 3
neuralNetwork = buildNetwork(2,6,1)
#Trainer Backpropagation
trainer = BackpropTrainer(neuralNetwork, dataSet)
for i in range(1, 10000):
    trainer.train()
    if i % 4000 == 0:
        print('Iteration - ', i)
        print(neuralNetwork.activate([0, 0]))
        print(neuralNetwork.activate([1, 0]))
        print(neuralNetwork.activate([0, 1]))
        print(neuralNetwork.activate([1, 1]))
#Output
#('Iteration - ', 4000)
#[ 0.07809688]
#[ 0.85013337]
#[ 0.89882611]
#[ 0.15568355]
#('Iteration - ', 8000)
#[ 0.0005494]
#[ 0.9988819]
#[ 0.99922662]
#[ 0.00120211]
#The second run uses a wider hidden layer (6 neurons instead of 3); both networks converge to the XOR outputs after enough iterations
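
For comparison, a plain NumPy sketch of the same steps described in the implementation overview (random weight initialization, forward pass, error calculation, backpropagated weight updates); the hidden-layer size, learning rate and iteration count here are assumptions, and convergence depends on the random initialization.

#Plain NumPy backpropagation sketch for XOR (assumed sizes, learning rate and iterations)
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.RandomState(1)
W1, b1 = rng.randn(2, 4), np.zeros(4)   # input -> hidden edge weights, initialized at random
W2, b2 = rng.randn(4, 1), np.zeros(1)   # hidden -> output edge weights, initialized at random
lr = 1.0                                # learning rate (assumed)

for i in range(20000):
    hidden = sigmoid(np.dot(X, W1) + b1)                 # forward pass
    output = sigmoid(np.dot(hidden, W2) + b2)
    error = y - output                                   # expected minus calculated output
    d_out = error * output * (1 - output)                # backpropagate the error
    d_hid = np.dot(d_out, W2.T) * hidden * (1 - hidden)
    W2 += lr * np.dot(hidden.T, d_out)                   # update the edge weights
    b2 += lr * d_out.sum(axis=0)
    W1 += lr * np.dot(X.T, d_hid)
    b1 += lr * d_hid.sum(axis=0)

print(output.round(3))   # should approach [0, 1, 1, 0]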

Happy Pongal & Happy Learning!!!