"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

January 24, 2018

Day #97 - Hyperparameter tuning

How to tune hyperparameters?
  • Identify which parameters affect the model the most
  • Observe the impact of changing a parameter's value
  • Examine the results and iterate to narrow down the most impactful changes
Automatic Hyper-parameter tuning libraries
  • Hyperopt
  • Scikit-optimize
  • Spearmint
  • GPyOpt
  • RoBO
  • SMAC3
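For a feel of what these libraries do, here is a minimal sketch using Scikit-optimize from the list above; the dataset, model and search ranges are illustrative assumptions, not part of the notes.

from skopt import gp_minimize
from skopt.space import Integer, Real
from sklearn.datasets import load_diabetes
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

X, y = load_diabetes(return_X_y=True)

def objective(params):
    # unpack the sampled hyperparameters and return a loss to minimize
    max_depth, learning_rate = params
    model = GradientBoostingRegressor(max_depth=max_depth,
                                      learning_rate=learning_rate,
                                      random_state=0)
    return -cross_val_score(model, X, y, cv=3,
                            scoring='neg_mean_squared_error').mean()

# Bayesian optimization over the two parameters, 20 evaluations
result = gp_minimize(objective,
                     [Integer(2, 10), Real(0.01, 0.3)],
                     n_calls=20, random_state=0)
print(result.x, result.fun)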
Hyperparameter tuning
  • Tree Based Models (Gradient Boosted Decision Trees - XGBoost, LightGBM, CatBoost)
  • RandomForest / ExtraTrees
Neural Nets
  • PyTorch, TensorFlow, Keras
Linear Models
  • SVM, Logistic Regression
  • Vowpal Wabbit, FTRL
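As a small illustration of tuning a linear model, a grid search over the regularization strength of logistic regression could look like the sketch below; the dataset and grid values are assumptions for illustration.

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = load_breast_cancer(return_X_y=True)

# C (inverse regularization strength) is the main knob for logistic regression
grid = GridSearchCV(LogisticRegression(max_iter=5000),
                    param_grid={'C': [0.01, 0.1, 1, 10, 100]},
                    cv=5, scoring='roc_auc')
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)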
Approach
  • Define a function that runs our model and returns a score (the hyperopt example below follows this pattern)
  • Specify a range for each hyperparameter
  • Keep the search range adequate - wide enough to cover good values, narrow enough to search efficiently
Results
  • Underfitting
  • Overfitting
  • Good Fit and Generalization
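One rough way to tell which of these outcomes a tuned model landed in is to compare train and validation scores. A minimal sketch follows; the data, model and interpretation are illustrative assumptions.

from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_validate

X, y = make_regression(n_samples=1000, n_features=20, noise=10.0, random_state=0)
scores = cross_validate(RandomForestRegressor(max_depth=4, random_state=0),
                        X, y, cv=5, scoring='r2', return_train_score=True)

train_r2 = scores['train_score'].mean()
valid_r2 = scores['test_score'].mean()
# low scores on both -> underfitting; large train/validation gap -> overfitting;
# high and close scores -> good fit and generalization
print(f"train R2 = {train_r2:.3f}, valid R2 = {valid_r2:.3f}")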
Tree based Models
  • GBDT - XGBoost, LightGBM, CatBoost
  • RandomForest, ExtraTrees - Scikit-learn
  • Others - RGF (baidu/fast_rgf)
GBDT
  • XGBoost - max_depth, subsample, colsample_bytree, colsample_bylevel, min_child_weight, lambda, alpha, eta, num_round, seed
  • LightGBM - max_depth / num_leaves, bagging_fraction, feature_fraction, min_data_in_leaf, lambda_l1, lambda_l2, learning_rate, num_iterations, seed
  • sklearn RandomForest/ExtraTrees - n_estimators, max_depth, max_features, min_samples_leaf, n_jobs, random_state

import xgboost as xgb
from hyperopt import fmin, tpe, hp, STATUS_OK

def xgb_score(param):
    # run xgboost with parameters 'param' and return the validation loss
    # (assumes dtrain / dvalid DMatrix objects are already built)
    param = dict(param)
    param['max_depth'] = int(param['max_depth'])
    model = xgb.train(param, dtrain, num_boost_round=1000,
                      evals=[(dvalid, 'valid')],
                      early_stopping_rounds=100, verbose_eval=False)
    return {'loss': model.best_score, 'status': STATUS_OK}

def xgb_hyperopt():
    # search space: fixed learning rate, tree/regularization parameters searched
    space = {
        'eta': 0.01,
        'max_depth': hp.quniform('max_depth', 10, 30, 1),
        'min_child_weight': hp.quniform('min_child_weight', 0, 100, 1),
        'subsample': hp.quniform('subsample', 0.1, 1.0, 0.1),
        'gamma': hp.quniform('gamma', 0.0, 30, 0.5),
        'colsample_bytree': hp.quniform('colsample_bytree', 0.1, 1.0, 0.1),
        'objective': 'reg:linear',
        'nthread': 28,
        'silent': 1,
        'seed': 2441,
    }
    best = fmin(xgb_score, space, algo=tpe.suggest, max_evals=1000)
    return best
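The same search-the-space idea applies to the scikit-learn forest models listed above; here is a rough sketch using RandomizedSearchCV over the RandomForest parameters mentioned earlier (the data and distributions are illustrative assumptions).

from scipy.stats import randint
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

X, y = make_regression(n_samples=1000, n_features=20, random_state=0)

# sample 50 random combinations from the distributions below
search = RandomizedSearchCV(
    RandomForestRegressor(random_state=2441, n_jobs=-1),
    param_distributions={
        'n_estimators': randint(100, 1000),
        'max_depth': randint(3, 30),
        'max_features': [0.3, 0.5, 0.7, 1.0],
        'min_samples_leaf': randint(1, 20),
    },
    n_iter=50, cv=3, scoring='neg_mean_squared_error', random_state=2441)
search.fit(X, y)
print(search.best_params_)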
Happy Learning!!!
