It's always good to take a pause, revise, and add a few more learning pointers :)
Key Notes
- Classical ML operates on handcrafted features
- DL learns features directly from the data
- Enablers: abundant data, parallelizable models and hardware, GPU/CUDA, TF/PyTorch
- Activation functions and their derivatives (a small sketch follows this list)
- Text is a sequence of characters/words
- Other sequences: stock prices, DNA sequences
- Sequence models add a temporal dimension
- The same network is applied once per timestep
- Unrolling turns the horizontal view into a vertical, per-timestep view
- The output/state of each timestep feeds into the next timestep
- An internal memory/state is maintained
- An individual loss is computed at each timestep
- Backprop runs across all timesteps
- Forward pass across time
- Backpropagation Through Time (BPTT), sketched after this list
- Gradients of the loss are taken with respect to the internal state
- Attention
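A minimal NumPy sketch of a few common activation functions and their derivatives, as noted above. The function names and the sample array are illustrative, not from the course:

```python
import numpy as np

def sigmoid(x):
    # squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # derivative: sigmoid(x) * (1 - sigmoid(x))
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh_grad(x):
    # derivative of np.tanh: 1 - tanh(x)^2
    return 1.0 - np.tanh(x) ** 2

def relu(x):
    return np.maximum(0.0, x)

def relu_grad(x):
    # subgradient: 1 where x > 0, else 0
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sigmoid(x), sigmoid_grad(x))
print(np.tanh(x), tanh_grad(x))
print(relu(x), relu_grad(x))
```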
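And a minimal PyTorch sketch of the unrolled-RNN ideas in the list: the same cell reused at every timestep, a hidden state carried forward, an individual loss per timestep, and BPTT via autograd. All sizes and names here are toy values chosen for illustration:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Illustrative sizes (not from the course): 10-step sequence, 8-dim inputs, 16-dim hidden state
seq_len, input_size, hidden_size, num_classes = 10, 8, 16, 4

cell = nn.RNNCell(input_size, hidden_size)      # one set of weights reused at every timestep
readout = nn.Linear(hidden_size, num_classes)   # maps the hidden state to a per-timestep output
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(seq_len, 1, input_size)         # toy input sequence (batch size 1)
targets = torch.randint(0, num_classes, (seq_len, 1))

h = torch.zeros(1, hidden_size)                 # internal memory / hidden state
total_loss = 0.0

# Forward pass across time: the same cell is applied once per timestep,
# and the hidden state of one timestep is the input to the next.
for t in range(seq_len):
    h = cell(x[t], h)
    logits = readout(h)
    total_loss = total_loss + loss_fn(logits, targets[t])  # individual loss per timestep

# Backpropagation Through Time: autograd unrolls the graph across all timesteps
# and computes gradients of the summed loss w.r.t. the shared weights and states.
total_loss.backward()
print(total_loss.item(), cell.weight_hh.grad.norm().item())
```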
Ref - Course Link
Keep Thinking!!!