After class, the students summarized the concept of backpropagation :)
Perspective #1 - Backpropagation is tuning the weights of a neural network based on the error obtained in the previous iteration
Perspective #2 - It is the process of updating the weights & biases at each layer to minimize the error
Perspective #3 - Forward propagation is moving forward step by step; backward propagation is adjusting the sails to move in one's intended direction...
Perspective #4 - 1. Calculate the output by forward propagation, 2. Calculate the error, 3. Minimize the error by backpropagation, 4. Update the parameters, 5. Repeat until convergence (a code sketch of this cycle follows the list)
Perspective #5 - Backpropagation: a method or algorithm to find the optimal values of the weights and biases that minimize the loss function
Perspective #6 - We feed the cumulative (weighted) input to the neuron and apply an activation function, compare the output to the actual output, and update the weights and biases. Repeat the cycle until the output is correct.
Perspective #7 - Basically, to reduce the loss, we change the weights using forward and backward passes
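
Putting these perspectives together, here is a minimal sketch of the forward-propagate / compute-error / backpropagate / update / repeat cycle, written with NumPy for a tiny one-hidden-layer network on the XOR problem. The layer sizes, learning rate, and epoch count are illustrative assumptions, not prescriptions.

```python
# Minimal backpropagation sketch (assumed toy setup: 2-4-1 network, XOR data,
# sigmoid activations, mean squared error, learning rate 0.5).
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialized weights and biases for one hidden layer of 4 neurons.
W1 = rng.normal(size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5  # learning rate (assumed value)

for epoch in range(10000):
    # 1. Forward propagation: cumulative (weighted) input, then activation.
    z1 = X @ W1 + b1
    a1 = sigmoid(z1)
    z2 = a1 @ W2 + b2
    a2 = sigmoid(z2)

    # 2. Calculate the error: mean squared error between prediction and target.
    loss = np.mean((a2 - y) ** 2)

    # 3. Backward propagation: apply the chain rule layer by layer,
    #    reusing each layer's gradient to compute the one before it.
    d_a2 = 2 * (a2 - y) / len(X)            # dL/da2
    d_z2 = d_a2 * a2 * (1 - a2)             # dL/dz2 (sigmoid derivative)
    d_W2 = a1.T @ d_z2                      # dL/dW2
    d_b2 = d_z2.sum(axis=0, keepdims=True)  # dL/db2
    d_a1 = d_z2 @ W2.T                      # dL/da1
    d_z1 = d_a1 * a1 * (1 - a1)             # dL/dz1
    d_W1 = X.T @ d_z1                       # dL/dW1
    d_b1 = d_z1.sum(axis=0, keepdims=True)  # dL/db1

    # 4. Update the parameters: step against the gradient.
    W2 -= lr * d_W2
    b2 -= lr * d_b2
    W1 -= lr * d_W1
    b1 -= lr * d_b1

    # 5. Repeat until the loss converges.
    if epoch % 2000 == 0:
        print(f"epoch {epoch}: loss = {loss:.4f}")

print("predictions:", a2.round(3).ravel())
```

Note how the backward step reuses the gradient flowing out of each layer to compute the gradients of the layer before it; that reuse is exactly the "backward" part of backpropagation.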
Keep Thinking!!!