"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

May 20, 2016

Day #23 - Newton Raphson - Gradient Descent

Newton Raphson
  • Optimization Technique
  • Newton's method finds a root of f, i.e., a point x where f(x) = 0; applied to the derivative f', it finds a stationary point where f'(x) = 0
  • Each step moves between two successive approximations x(n) and x(n+1)
  • Stop iterating when the difference between x(n+1) and x(n) is close to zero
Formula
  • x(n+1) = x(n) - f(x(n))/f'(x(n))
  • Choose a suitable starting value x0
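The update rule above can be sketched in a few lines of Python. The test function f(x) = x^2 - 2 and its derivative are illustrative choices, not from the post; the stopping rule is the one described above (difference between successive approximations close to zero).

```python
def newton_raphson(f, df, x0, tol=1e-8, max_iter=100):
    """Iterate x(n+1) = x(n) - f(x(n)) / df(x(n)) starting from x0."""
    x = x0
    for _ in range(max_iter):
        x_next = x - f(x) / df(x)
        # Stop when successive approximations are closer than tol
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Example: find sqrt(2) as the positive root of f(x) = x^2 - 2
root = newton_raphson(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0)
print(root)  # converges to about 1.4142135
```

Convergence is quadratic near the root, which is why only a handful of iterations are needed here.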
Gradient Descent
  • Works for convex functions
  • x(n+1) = x(n) - a·f'(x(n))
  • a - learning rate
  • Gradient descent seeks a minimum of f using information from the first derivative f'
  • Gradient descent and Newton-Raphson are similar in spirit; only the update rule differs
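The gradient descent update can be sketched the same way. The convex function f(x) = (x - 3)^2 with gradient 2(x - 3), and the learning rate a = 0.1, are illustrative choices, not from the post.

```python
def gradient_descent(grad, x0, a=0.1, tol=1e-8, max_iter=1000):
    """Iterate x(n+1) = x(n) - a * f'(x(n)); a is the learning rate."""
    x = x0
    for _ in range(max_iter):
        x_next = x - a * grad(x)
        # Stop when successive approximations are closer than tol
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3)
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(minimum)  # converges toward 3.0
```

Note the contrast with Newton-Raphson: gradient descent scales the first derivative by a fixed learning rate, while Newton's step divides by a derivative, so gradient descent needs no second-order information but converges more slowly.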
More Reads - Link

Happy Learning!!!
