"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

December 11, 2018

Day #161 - Tensorflow RNN APIs

Key Summary
  • Encoder (Original Text)
  • Decoder (Translated Text)
Learnings
  • Read Sequence Data
  • SequenceExample proto to store sequence data + context
  • Efficient Storage, Efficient parser
  • tf.parse_single_sequence_example (parsing sketch below)
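
A minimal parsing sketch (TF 1.x), assuming each TFRecord holds a SequenceExample with a "length" context feature and a "tokens" feature list; the feature names and the serialized_example tensor are illustrative assumptions:

    import tensorflow as tf

    # Context features describe the whole sequence; sequence features
    # hold one value per time step. Names here are assumptions.
    context, sequences = tf.parse_single_sequence_example(
        serialized=serialized_example,  # one serialized SequenceExample
        context_features={
            "length": tf.FixedLenFeature([], dtype=tf.int64)},
        sequence_features={
            "tokens": tf.FixedLenSequenceFeature([], dtype=tf.int64)})

    length = context["length"]    # scalar per-sequence metadata
    tokens = sequences["tokens"]  # [sequence_length] token ids
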
Batch Sequence Data
  • Static Padding - pick a max sequence length and pad every input to that length
  • tf.train.batch(...)
  • Dynamic Padding - tf.train.batch(..., dynamic_pad=True)
  • Bucketing - tf.contrib.training.bucket_by_sequence_length(..., dynamic_pad=True) (batching sketch below)
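
A batching sketch, assuming `length` and `tokens` from the parsing step above. Dynamic padding pads each batch only to its own longest sequence; bucketing goes further by queueing similar lengths together so little padding is wasted:

    import tensorflow as tf

    # Dynamic padding: variable-length `tokens` are padded per batch,
    # not to a global maximum.
    length_batch, token_batch = tf.train.batch(
        [length, tokens], batch_size=32, dynamic_pad=True)

    # Bucketing: group sequences of similar length before batching.
    # bucket_boundaries here are illustrative.
    lengths, outputs = tf.contrib.training.bucket_by_sequence_length(
        input_length=tf.to_int32(length),
        tensors=[tokens],
        batch_size=32,
        bucket_boundaries=[10, 20, 40],
        dynamic_pad=True)
    token_batch = outputs[0]
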
State Saver - Truncated Backpropagation Through Time
  • tf.contrib.training.batch_sequences_with_states(...) (sketch below)
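
A rough sketch of the state saver, assuming `key`, `tokens`, and `length` come from the reading step; it splits long sequences into num_unroll-step chunks and carries RNN state between session runs (the state names are assumptions):

    import tensorflow as tf

    num_units, num_unroll = 128, 20
    batch = tf.contrib.training.batch_sequences_with_states(
        input_key=key,
        input_sequences={"tokens": tokens},
        input_context={},
        input_length=tf.to_int32(length),
        initial_states={"c": tf.zeros([num_units]),
                        "h": tf.zeros([num_units])},
        num_unroll=num_unroll,
        batch_size=32)

    chunk = batch.sequences["tokens"]        # [batch, num_unroll]
    c, h = batch.state("c"), batch.state("h")
    # ...unroll the cell num_unroll steps from (c, h), then persist
    # the final state for the next chunk of the same sequences:
    # save_op = tf.group(batch.save_state("c", new_c),
    #                    batch.save_state("h", new_h))
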
RNN API
  • An RNN is a unit of computation repeated over and over
  • Inputs / Previous State / Intermediate calculation / Updated State
  • LSTM (Long Short Term Memory)
  • RNNCell (provides the knowledge about a specific RNN architecture)
  • Represent a time step as a layer
  • Cell classes: BasicRNNCell, BasicLSTMCell, MultiRNNCell, GRUCell, LSTMCell, GridLSTMCell (stacking sketch below)
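
Putting cells together: a minimal sketch stacking two LSTM layers and unrolling them with dynamic_rnn, assuming `inputs` is a [batch, time, depth] tensor and `length_batch` holds the true lengths from the batching step:

    import tensorflow as tf

    # Each LSTMCell is one layer; MultiRNNCell stacks them so a
    # single "cell" computes one full time step of the deep RNN.
    cells = [tf.nn.rnn_cell.LSTMCell(256) for _ in range(2)]
    stacked = tf.nn.rnn_cell.MultiRNNCell(cells)

    outputs, final_state = tf.nn.dynamic_rnn(
        stacked, inputs,
        sequence_length=length_batch,  # skip steps past each true length
        dtype=tf.float32)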

Fully Dynamic Computation
  • tf.while_loop (dynamic loops + gradients)
  • tf.TensorArray (dynamic Tensor slice access, gradients) - loop sketch below
  • Encoder (8 Stacked LSTM Layers)
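
These two primitives are what dynamic_rnn is built on. A minimal sketch, assuming time-major `inputs_tm` of shape [time, batch, depth] with a statically known depth:

    import tensorflow as tf

    cell = tf.nn.rnn_cell.BasicLSTMCell(128)
    time_steps = tf.shape(inputs_tm)[0]
    batch_size = tf.shape(inputs_tm)[1]
    outputs_ta = tf.TensorArray(tf.float32, size=time_steps)

    def step(t, ta, state):
        # One cell application per iteration; the TensorArray gives
        # differentiable per-time-step writes.
        output, state = cell(inputs_tm[t], state)
        return t + 1, ta.write(t, output), state

    _, outputs_ta, final_state = tf.while_loop(
        cond=lambda t, ta, state: t < time_steps,
        body=step,
        loop_vars=(tf.constant(0), outputs_ta,
                   cell.zero_state(batch_size, tf.float32)))

    outputs = outputs_ta.stack()  # [time, batch, num_units]
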
Fused RNN Cells
  • XLA fused time steps
  • Manually fused time steps
  • Works everywhere, fast with XLA (GPU, Android) - fused sketch below
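
One fused option as a sketch: LSTMBlockFusedCell runs all time steps inside a single fused op instead of one op per step. It takes time-major input; `inputs_tm` and `length_batch` are assumed from the earlier steps:

    import tensorflow as tf

    # One op for the whole unrolled loop instead of one op per step.
    fused_lstm = tf.contrib.rnn.LSTMBlockFusedCell(num_units=256)
    outputs, final_state = fused_lstm(
        inputs_tm,                      # [time, batch, depth]
        sequence_length=length_batch,
        dtype=tf.float32)
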
Dynamic Decoding
  • tf.contrib.seq2seq
Dynamic Decoder Goal
  • Pick Sampling method
  • Pick RNN Cells
  • Pick attention, in-graph beam search (decoding sketch below)
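
A hedged sketch of those pieces in tf.contrib.seq2seq: the Helper is the sampling method, BasicDecoder wires it to the cell, and dynamic_decode runs the loop. `embedding`, `encoder_final_state`, `vocab_size`, `batch_size`, GO_ID, and EOS_ID are assumptions:

    import tensorflow as tf
    from tensorflow.contrib import seq2seq

    cell = tf.nn.rnn_cell.LSTMCell(256)
    # Greedy sampling at inference; swap in TrainingHelper for
    # training, or BeamSearchDecoder for in-graph beam search.
    helper = seq2seq.GreedyEmbeddingHelper(
        embedding,                                 # [vocab, depth]
        start_tokens=tf.fill([batch_size], GO_ID),
        end_token=EOS_ID)
    decoder = seq2seq.BasicDecoder(
        cell, helper,
        initial_state=encoder_final_state,
        output_layer=tf.layers.Dense(vocab_size))  # logits projection
    outputs, final_state, lengths = seq2seq.dynamic_decode(
        decoder, maximum_iterations=100)
    predicted_ids = outputs.sample_id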

Happy Mastering DL!!!
