"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

December 17, 2018

Day #165 - AdaBoost

Key Summary
  • Combine all ML methods to make a strong classifier
  • 1NN, KNN, SVM, Neural Nets, Naive Bayes classification, ID Trees - weak or imperfect classifiers
  • Combine them all together or some set of them (Ensemble or Aggregate Classifier)
  • Give a classifier more voting power when it classifies correctly
  • Give it negative voting power when it classifies wrongly
  • Pick the best weak classifier and assign it voting power
  • Best - the classifier that makes the fewest errors
  • Calculate the error rate for each of the weak classifiers
  • Update weights to emphasize the misclassified points
  • Increase the weights of the misclassified points for the next round
  • Renormalize so that the misclassified points together carry one half of the total weight
  • Classifiers may make overlapping errors
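The "becomes one half" bullet above can be checked numerically: with voting power alpha = 0.5 * ln((1 - err) / err), the exponential re-weighting followed by renormalization leaves the misclassified points with exactly half of the total weight. A minimal sketch (the weights and the misclassification pattern below are made up for illustration):

```python
import numpy as np

w = np.array([0.2, 0.2, 0.2, 0.2, 0.2])              # current point weights
correct = np.array([True, True, True, True, False])  # last point misclassified
err = w[~correct].sum()                              # weighted error = 0.2
alpha = 0.5 * np.log((1 - err) / err)                # voting power

# shrink correctly classified points, grow misclassified ones
w = w * np.where(correct, np.exp(-alpha), np.exp(alpha))
w = w / w.sum()                                      # renormalize

print(w[~correct].sum())                             # -> 0.5
```

Here err = 0.2 gives alpha = ln 2, so correct weights halve and the wrong weight doubles; after normalization the misclassified mass is exactly 0.5 regardless of the starting error rate.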
Summary
  • Initialize weights
  • Calculate error for each h
  • Pick the best h, the one with the smallest error rate
  • Calculate voting power for best weak classifier
  • Update weights to emphasize the misclassified points
This 165th post beats my all-time record for posts in a single year :). I hope to cover a lot more useful learning in AI / ML / research papers before the end of the year.

Knowledge is wealth, Patience is the character, Perseverance is Strength. Keep Learning and Keep Going!!!


Happy Mastering DL!!!
