"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

November 14, 2017

Day #88 - Metrics Optimization

Loss vs Metric
  • Metric - the function we actually want to evaluate the model with, e.g., accuracy in classification
  • Optimization Loss - the function the model optimizes because it is easy to optimize for a given model, e.g., MSE, LogLoss
  • Preprocess the train set and optimize another metric - MSPE, MAPE, RMSLE
  • Optimize another metric, then postprocess the predictions - Accuracy, Kappa
  • Early Stopping - stop training when the model starts to overfit (a minimal sketch follows this list)
  • Custom loss functions - write your own objective when the library does not support the metric directly (see the logregobj example below)
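
Below is a minimal sketch of early stopping with XGBoost on synthetic data; the dataset, parameter values, and the 50-round patience are illustrative assumptions rather than values from this post.

import numpy as np
import xgboost as xgb

# Synthetic data purely for illustration
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
y = (X[:, 0] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

dtrain = xgb.DMatrix(X[:800], label=y[:800])
dvalid = xgb.DMatrix(X[800:], label=y[800:])

params = {"objective": "binary:logistic", "eval_metric": "logloss"}
booster = xgb.train(
    params,
    dtrain,
    num_boost_round=1000,
    evals=[(dvalid, "valid")],
    early_stopping_rounds=50,   # stop once validation logloss has not improved for 50 rounds
    verbose_eval=False,
)
print("best iteration:", booster.best_iteration)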

Accuracy Metrics

import numpy as np

def logregobj(preds, dtrain):
    # Custom XGBoost objective: gradient and hessian of the logistic (log) loss
    labels = dtrain.get_label()
    preds = 1.0 / (1.0 + np.exp(-preds))   # raw margins -> probabilities
    grad = preds - labels                  # first-order gradient
    hess = preds * (1.0 - preds)           # second-order gradient (hessian)
    return grad, hess
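A hedged usage sketch: the custom objective is handed to XGBoost through the obj argument of xgb.train. The dtrain DMatrix and the parameter choices below are assumptions carried over from the early-stopping sketch above.

# Hypothetical usage, reusing the dtrain DMatrix defined above (an assumption):
booster = xgb.train(
    {"eval_metric": "logloss"},   # no built-in objective; logregobj supplies the gradient/hessian
    dtrain,
    num_boost_round=100,
    obj=logregobj,
)
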
Happy Coding and Learning!!!
