CNN Notes
- In a CNN, lower layers learn generic features (edges, corners, simple shapes) and feed them forward to higher layers
- Earlier layers - generic features
- Later layers - features specific to the problem at hand
- For related problems we can leverage an existing pretrained network (VGG16, VGG19, AlexNet) and retrain only the higher layers for our task (transfer learning)
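A hypothetical toy sketch of the idea (a tiny two-layer NumPy net, not a real VGG): the "pretrained" lower layer is frozen as a feature extractor, and only the new task-specific head is trained.

```python
import numpy as np

# Hypothetical toy sketch of transfer learning (NOT a real VGG): a frozen
# "pretrained" lower layer feeds a new task head, and only the head trains.
rng = np.random.default_rng(0)

W_lower = rng.normal(size=(4, 8))           # pretend these came from pretraining
W_top = rng.normal(scale=0.1, size=(8, 2))  # fresh head for our 2-class task

x = rng.normal(size=(1, 4))
y = np.array([[1.0, 0.0]])

lr = 0.01
for _ in range(500):
    h = np.maximum(0, x @ W_lower)   # frozen generic feature extractor
    pred = h @ W_top                 # trainable task-specific head
    W_top -= lr * h.T @ (pred - y)   # dE/dW_top for squared error
    # W_lower is never updated (frozen)

error = float(np.sum((pred - y) ** 2))
print(error)
```

In a real setup the frozen part would be the convolutional base of VGG16/VGG19/AlexNet loaded with pretrained weights, and the head would be new dense layers sized for our classes.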
- ReLU passes only positive inputs through the activation: f(x) = max(0, x), so inputs > 0 go through unchanged and everything else becomes 0
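A minimal ReLU in NumPy, showing that negatives are zeroed and positives pass through unchanged:

```python
import numpy as np

# ReLU: f(x) = max(0, x) - negatives become 0, positives pass through.
def relu(x):
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
```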
- Vanishing gradient problem - gradients shrink as they are backpropagated through many layers, so early-layer weights stagnate and stop learning
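A quick sketch of why this happens: backprop multiplies layer-local derivatives, and the sigmoid's derivative is at most 0.25, so the product shrinks toward zero across many layers.

```python
import numpy as np

# Sigmoid's derivative peaks at 0.25 (at x = 0). Multiplying 20 such
# factors, as backprop does across 20 layers, drives the gradient to ~0.
def sigmoid_grad(x):
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)

grad = 1.0
for layer in range(20):
    grad *= sigmoid_grad(0.0)   # 0.25, the best case for sigmoid

print(grad)  # 0.25**20, on the order of 1e-12
```

This is one reason ReLU is preferred: its derivative is exactly 1 for positive inputs, so the product does not systematically shrink.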
- ∂E/∂W - gradient of the error with respect to the weights
- ∂E/∂I - gradient of the error with respect to the input image
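A small check of the ∂E/∂W idea for a single linear neuron with squared error, comparing the analytic gradient against a numerical finite-difference estimate:

```python
# E = 0.5 * (w*x - y)**2 for one linear neuron, so dE/dW = (w*x - y) * x.
w, x, y = 0.7, 2.0, 1.0
analytic = (w * x - y) * x

# Finite-difference estimate of the same gradient.
eps = 1e-6
E = lambda w_: 0.5 * (w_ * x - y) ** 2
numeric = (E(w + eps) - E(w - eps)) / (2 * eps)

print(analytic, numeric)  # both ~ 0.8
```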
- Main thing: in an RNN the same weights are reused across the whole sequence
- The weight matrices applied at successive time steps (the unrolled "layers") are shared, not separate per step
- RNNs can be used for document classification, data generation, chatbots, and time-series prediction
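A minimal vanilla-RNN forward pass illustrating the weight sharing above: the same W_xh, W_hh, and b are applied at every time step (sizes here are arbitrary for illustration).

```python
import numpy as np

# Vanilla RNN forward pass: the SAME W_xh, W_hh, b are applied at every
# time step - this weight sharing is what makes the network recurrent.
rng = np.random.default_rng(1)
hidden, features, steps = 5, 3, 4

W_xh = rng.normal(scale=0.1, size=(features, hidden))  # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden, hidden))    # hidden -> hidden
b = np.zeros(hidden)

h = np.zeros(hidden)
xs = rng.normal(size=(steps, features))   # a sequence of 4 inputs
for x in xs:
    h = np.tanh(x @ W_xh + h @ W_hh + b)  # same weights every step

print(h.shape)  # final hidden state, shape (5,)
```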
Topics from Language Modelling class
Happy Learning!!!