Session #1 - Content Marketing
- Distribute relevant, consistent content. Traditional marketing vs content marketing
- Delivering content with speed. Channel proliferation (mobile, computers, tablets)
- Intersection of brands, trends, and community interests (social media posts and metrics)
- Data from social media pages, online aggregators
- Computation of term frequency and inverse document frequency (TF-IDF)
- Using Solr and Lucene for indexes
- Cosine similarity (a TF-IDF + cosine-similarity sketch follows this list)
- Greedy Algorithm
- Prediction vs reasoning problems
- Evolution of prediction problems
- At an advanced level: deep learning, XGBoost, graphical models
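The TF-IDF and cosine-similarity bullets above translate to a few lines of code. Below is a minimal sketch using scikit-learn; the library choice and the toy posts are my assumptions, not from the session:

```python
# Minimal TF-IDF + cosine similarity sketch (scikit-learn assumed available).
# The toy "social media posts" below are made-up examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

posts = [
    "new phone launch trending on social media",
    "brand community reacts to phone launch",
    "weather update for the weekend",
]

# Term frequency * inverse document frequency: terms frequent in a document
# but rare across the corpus get the highest weights.
vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(posts)

# Cosine similarity between document vectors (1.0 = same direction).
sims = cosine_similarity(tfidf)
print(sims.round(2))  # posts 0 and 1 score higher with each other than with 2
```

Lucene (and Solr on top of it) relies on related term statistics when scoring indexed documents.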
Features as input -> Prediction performed (Independent, stateless)
Reasoning - Sequential, Stateful Exploration
Reasoning Problems - Diagnosis, routes, games, crossing roads
Flavours of Reasoning
- Algorithmic (Search)
- Logical reasoning
- Bayesian probabilistic reasoning
- Markovian reasoning
Knowledge represented as {subject, predicate, object} triples
Session #3 - Continuous online learning
- 70% noise in C2B communication
- 100% noise in B2C communication
- Zipfian distribution (a few terms are very frequent, most are rare)
- Apriori for market basket analysis (a sketch follows this list)
- XGBoost as an alternative to deep learning
- Bias-variance tradeoff
- Spectral clustering
- Google DeepMind (used to optimize data-centre cooling/air conditioning)
- Bayesian Probabilistic Learning
- Deep learning - builds a hierarchy of features (OCR-type problems)
- Traditional neural networks (fully connected, many degrees of freedom)
- Structural causality (one subsystem precedes another; comes from domain knowledge)
- Temporal causality - this happened, and then that happened
- CNN - learns the filter weights
- PCA (reduces dense, high-dimensional data to a smaller representation; a sketch follows this list)
- Deep learning - hidden layers obtained through a coarse-grained process
- Neural Networks
- Multiple Layers
- Lots of data
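The Apriori bullet above can be illustrated with a few lines of plain Python: count frequent single items first, then build candidate pairs only from those items. The toy baskets and the support threshold are my own assumptions:

```python
# Minimal Apriori-style frequent-pair mining on toy basket data.
from itertools import combinations
from collections import Counter

transactions = [
    {"milk", "bread", "butter"},
    {"milk", "bread"},
    {"bread", "butter"},
    {"milk", "butter"},
]
min_support = 2  # an itemset must appear in at least 2 baskets

# Pass 1: frequent single items.
item_counts = Counter(item for t in transactions for item in t)
frequent_items = {i for i, c in item_counts.items() if c >= min_support}

# Pass 2 (the Apriori pruning step): candidate pairs are built only from
# frequent items, since no superset of an infrequent itemset can be frequent.
pair_counts = Counter()
for t in transactions:
    for pair in combinations(sorted(t & frequent_items), 2):
        pair_counts[pair] += 1

frequent_pairs = {p: c for p, c in pair_counts.items() if c >= min_support}
print(frequent_pairs)  # e.g. ('bread', 'milk') appears together in 2 baskets
```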
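The PCA bullet ("reduce denser to smaller") can be sketched the same way: project toy 3-D points onto 2 principal components with scikit-learn. The data and dimensions are invented for illustration:

```python
# PCA sketch: reduce 3-dimensional points to a 2-dimensional representation.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))      # toy data with 3 features
X[:, 2] = X[:, 0] + 0.1 * X[:, 1]  # make one feature nearly redundant

pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X)   # denser (3-D) -> smaller (2-D)
print(X_reduced.shape)                # (100, 2)
print(pca.explained_variance_ratio_)  # most variance survives in 2 components
```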
Deep Learning now
- Speech recognition
- Google deep models running on phones
- Google Street View (house-number recognition)
- ImageNet
- Captioning images
- Reinforcement learning
- Simple mathematical units combine into complex functions
- x -> input, W -> weights; the output is a non-linear function of the weighted input
- Multiple hidden layers between input and output
- Training the hidden layers is the challenge
- Define a loss function
- Minimize it by moving along the gradient (gradient descent)
- Move errors back through the network (backpropagation)
- The chain rule is the core concept (a training-loop sketch appears below, after these bullets)
- Caffe - network described in a configuration file
- Torch - describe the network in Lua
- Theano - describe the computation symbolically; it generates CUDA code, runs it, and returns the results
- CNNs are used for images
- Images have organized, spatial structure
- Apply convolutional filters
- For deep learning, GPUs are important
- Convolution (detect useful features and retain them)
- Pooling (shrink the feature map)
- Softmax (turn scores into class probabilities; a convolution -> pooling -> softmax sketch also appears below)
- Other layers
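The loss / gradient / backpropagation bullets above fit into a tiny numpy example. This is only a sketch: the one-hidden-layer architecture, the XOR data, and the learning rate are my assumptions, not the session's code:

```python
# Tiny one-hidden-layer network trained with gradient descent + backpropagation.
# Architecture, XOR data, and learning rate are illustrative assumptions.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)  # input -> hidden weights
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)  # hidden -> output weights
lr = 0.5                                       # learning rate

for step in range(5000):
    # Forward pass: each layer is a non-linear function of its weighted input.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Loss function to minimize: mean squared error.
    loss = np.mean((out - y) ** 2)

    # Backward pass: the chain rule moves the error back through the network.
    d_out = 2 * (out - y) / y.size * out * (1 - out)
    d_W2, d_b2 = h.T @ d_out, d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * h * (1 - h)
    d_W1, d_b1 = X.T @ d_h, d_h.sum(axis=0)

    # Move the weights along the negative gradient.
    W1 -= lr * d_W1; b1 -= lr * d_b1
    W2 -= lr * d_W2; b2 -= lr * d_b2

print(loss)          # should be close to 0
print(out.round(2))  # should approach [[0], [1], [1], [0]]
```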
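Likewise, the convolution -> pooling -> softmax pipeline can be walked through in plain numpy; the 5x5 "image", the edge filter, and treating pooled values as class scores are all invented for illustration:

```python
# Convolution, pooling, and softmax on a toy image, in plain numpy.
import numpy as np

image = np.array([
    [0, 0, 0, 1, 1],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 1, 1],
], dtype=float)

kernel = np.array([[-1, 1],
                   [-1, 1]], dtype=float)  # responds to dark-to-bright vertical edges

# Convolution: slide the 2x2 filter over all valid positions,
# producing a 4x4 map of feature responses (useful features are retained).
conv = np.zeros((4, 4))
for i in range(4):
    for j in range(4):
        conv[i, j] = np.sum(image[i:i+2, j:j+2] * kernel)

# Max pooling: shrink the 4x4 feature map to 2x2 (max of each 2x2 block).
pooled = conv.reshape(2, 2, 2, 2).max(axis=(1, 3))

# Softmax: turn the flattened values (pretend class scores) into probabilities.
scores = pooled.flatten()
exp = np.exp(scores - scores.max())  # subtract max for numerical stability
probs = exp / exp.sum()
print(probs, probs.sum())            # probabilities summing to 1.0
```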
LSTM (Long Short-Term Memory)
Inter-word relationships learned from a corpus (word2vec; a sketch follows)
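A hedged word2vec sketch using gensim (assuming gensim 4.x parameter names; the toy corpus is illustrative, and real training needs far more text):

```python
# word2vec: learn inter-word relationships from a corpus of tokenized sentences.
from gensim.models import Word2Vec

sentences = [
    ["content", "marketing", "builds", "brand", "community"],
    ["brand", "community", "engages", "on", "social", "media"],
    ["social", "media", "posts", "drive", "content", "marketing"],
]

model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, epochs=100)
print(model.wv.most_similar("brand", topn=3))  # nearest words in vector space
```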
Happy Learning!!!