Newton Raphson
- Optimization Technique
- Newton's method tries to find a point x satisfying f'(x) = 0
- Iterates over successive approximations x(n), x(n+1), ...
- Stop iterating when the difference between x(n+1) and x(n) is close to zero
- Update rule (Newton's root-finding step applied to f'): x(n+1) = x(n) - f'(x(n)) / f''(x(n))
- Choose a suitable starting value x0
- Works well for convex functions
- Gradient descent update rule: x(n+1) = x(n) - a * f'(x(n))
- a is the learning rate
- Gradient descent tries to find such a minimum x by using information from the first derivative of f
- Gradient descent and Newton-Raphson are similar; only the update rule differs (see the sketch below)
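Below is a minimal sketch of the two update rules side by side on an illustrative 1-D function f(x) = (x - 3)^2 + 1; the function, starting point, learning rate, and tolerance are example choices, not fixed by either method.

```python
# Illustrative 1-D function f(x) = (x - 3)**2 + 1 (example choice)
def f_prime(x):
    return 2.0 * (x - 3.0)        # first derivative f'(x)

def f_double_prime(x):
    return 2.0                    # second derivative f''(x)

def newton_raphson(x0, tol=1e-8, max_iter=100):
    # Newton-Raphson for f'(x) = 0: x(n+1) = x(n) - f'(x(n)) / f''(x(n))
    x = x0
    for _ in range(max_iter):
        x_next = x - f_prime(x) / f_double_prime(x)
        if abs(x_next - x) < tol:  # stop when successive approximations are close
            break
        x = x_next
    return x_next

def gradient_descent(x0, lr=0.1, tol=1e-8, max_iter=10_000):
    # Gradient descent: x(n+1) = x(n) - a * f'(x(n)), with learning rate a
    x = x0
    for _ in range(max_iter):
        x_next = x - lr * f_prime(x)
        if abs(x_next - x) < tol:
            break
        x = x_next
    return x_next

print(newton_raphson(x0=0.0))     # reaches x = 3 in one step on this quadratic
print(gradient_descent(x0=0.0))   # also approaches x = 3, but via many small steps
```

Both loops share the same structure and stopping rule; only the line computing x_next differs, which is the point of the comparison above.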
Optimal Solutions
- Strategy: to reach the bottom of the valley, go downhill along the steepest slope
- The gradient measures the local slope of the error function with respect to the parameter vector
- Once the gradient is zero, you have reached a minimum
- The learning rate determines the step size, and hence how many steps it takes to converge to a minimum
- How to converge to a global vs. a local minimum?
- Gradient descent only guarantees a local minimum, not the global one
- The contours of the cost function, elongated vs. circular, affect the speed of convergence (see the sketch below)
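A small sketch of the same idea in two dimensions, using an illustrative quadratic cost 0.5 * (a * w1^2 + b * w2^2); the coefficients, learning rate, starting point, and tolerance are example values, not from the notes above. When a and b differ the contours are elongated, the learning rate has to stay small enough for the steep direction, and progress along the shallow direction becomes slow; with circular contours (a == b) the same learning rate converges more directly.

```python
import numpy as np

def gradient(w, a=1.0, b=10.0):
    # Gradient of the illustrative cost 0.5 * (a * w1**2 + b * w2**2)
    # with respect to the parameter vector w = (w1, w2)
    return np.array([a * w[0], b * w[1]])

def gradient_descent(w0, lr=0.05, tol=1e-6, max_iter=10_000):
    w = np.asarray(w0, dtype=float)
    for step in range(max_iter):
        g = gradient(w)
        if np.linalg.norm(g) < tol:   # gradient ~ 0: a minimum has been reached
            return w, step
        w = w - lr * g                # steepest-descent step: w <- w - a * grad
    return w, max_iter

w_min, steps = gradient_descent(w0=[5.0, 5.0])
print(w_min, steps)   # converges near (0, 0), the minimum of this cost
```

Raising b makes the valley more elongated and forces a smaller learning rate (much larger lr values diverge along the steep direction), which is the convergence behaviour the last bullet refers to.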
Happy Learning!!!