"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

August 24, 2018

Day #122 - TensorFlow Estimator API

  • Manages data distribution out of the box
  • Data Parallelism - replicates your model across multiple workers
estimator = tf.estimator.LinearRegressor(...)
tf.estimator.train_and_evaluate(estimator,....)

Components needed for running on multiple machines:

#1. Estimator
#2. Run Config
#3. Training Spec
#4. Eval Spec

estimator = tf.estimator.LinearRegressor(feature_columns=featcols,config=run_config)
..
tf.estimator.train_and_evaluate(estimator,train_spec,eval_spec)

#5. Checkpoints, Summary

run_config = tf.estimator.RunConfig(model_dir=output_dir,save_summary_steps=100,save_checkpoints_steps=2000)
estimator = tf.estimator.LinearRegressor(config=run_config,....)

#6. Using Data Sets

train_spec = tf.estimator.TrainSpec(input_fn=train_input_fn,max_steps=5000)
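A TrainSpec needs an input_fn that returns a tf.data.Dataset. A minimal sketch of one (the feature name "x" and the toy data are assumptions for illustration, not from the notes):

```python
import numpy as np
import tensorflow as tf

def train_input_fn():
    # Hypothetical toy data: one numeric feature "x" and target y = 2x.
    features = {"x": np.arange(100, dtype=np.float32)}
    labels = np.arange(100, dtype=np.float32) * 2.0
    dataset = tf.data.Dataset.from_tensor_slices((features, labels))
    # Shuffle, batch, and repeat so training can run for max_steps.
    return dataset.shuffle(100).batch(16).repeat()

train_spec = tf.estimator.TrainSpec(input_fn=train_input_fn, max_steps=5000)
```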

#7. Eval Spec

tf.estimator.train_and_evaluate(estimator,train_spec,eval_spec)

#8. Evaluation Checkpoint

eval_spec = tf.estimator.EvalSpec(input_fn=eval_input_fn,steps=100,throttle_secs=600,exporters=...)
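The notes leave `exporters` elided; one common choice is LatestExporter, which writes a SavedModel after evaluation. A sketch (the feature spec for "x" is an assumption carried over from a toy input_fn):

```python
import tensorflow as tf

def eval_input_fn():
    # Hypothetical held-out data with the same "x" feature as training.
    features = {"x": [[1.0], [2.0], [3.0]]}
    labels = [[2.0], [4.0], [6.0]]
    return tf.data.Dataset.from_tensor_slices((features, labels)).batch(3)

# LatestExporter keeps the most recent SavedModel exports for serving.
feature_spec = {"x": tf.io.FixedLenFeature([1], tf.float32)}
exporter = tf.estimator.LatestExporter(
    name="latest",
    serving_input_receiver_fn=(
        tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)),
)

eval_spec = tf.estimator.EvalSpec(
    input_fn=eval_input_fn,
    steps=100,          # evaluate on 100 batches (or until data is exhausted)
    throttle_secs=600,  # wait at least 10 minutes between evaluations
    exporters=exporter,
)
```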

#9. Measure performance on test data

tf.estimator.train_and_evaluate(estimator,train_spec,eval_spec)
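A minimal end-to-end sketch tying the pieces above together. The feature name "x", the toy data, and the small step counts are assumptions chosen so the demo finishes quickly; a real job would keep values like max_steps=5000 from the notes:

```python
import tempfile
import numpy as np
import tensorflow as tf

def train_input_fn():
    # Hypothetical toy data: one numeric feature "x", target y = 2x.
    features = {"x": np.arange(100, dtype=np.float32)}
    labels = np.arange(100, dtype=np.float32) * 2.0
    return (tf.data.Dataset.from_tensor_slices((features, labels))
            .shuffle(100).batch(16).repeat())

def eval_input_fn():
    features = {"x": np.arange(20, dtype=np.float32)}
    labels = np.arange(20, dtype=np.float32) * 2.0
    return tf.data.Dataset.from_tensor_slices((features, labels)).batch(20)

output_dir = tempfile.mkdtemp()  # throwaway model_dir for the demo

# 1-2: Estimator + RunConfig (checkpoints and summaries go to model_dir)
run_config = tf.estimator.RunConfig(
    model_dir=output_dir,
    save_summary_steps=100,
    save_checkpoints_steps=2000,
)
featcols = [tf.feature_column.numeric_column("x")]
estimator = tf.estimator.LinearRegressor(feature_columns=featcols,
                                         config=run_config)

# 3-4: Train and Eval specs (tiny steps so this runs in seconds)
train_spec = tf.estimator.TrainSpec(input_fn=train_input_fn, max_steps=50)
eval_spec = tf.estimator.EvalSpec(input_fn=eval_input_fn, steps=1,
                                  start_delay_secs=0, throttle_secs=0)

# Returns (eval metrics, export results) after training completes.
metrics, _ = tf.estimator.train_and_evaluate(estimator, train_spec, eval_spec)
```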


Happy Learning!!!
