I've been trying to learn how back-propagation works with neural networks, but I have yet to find a good explanation from a less technical perspective.

How does back-propagation work?
It is easy to understand if you look at the computation graph. Backpropagation is essentially the chain rule: it computes the gradient of the cost (loss) function with respect to each weight by multiplying the local derivatives along the graph. Gradient descent then uses those gradients to adjust every weight in the network, proportionally to how strongly each weight affects the final cost.

It is too much to explain fully here, but here is the link to the chapter https://alexcpn.github.io/html/NN/ml/4_backpropogation/ from my book in progress, https://alexcpn.github.io/html/NN/, which tries to explain this in a simple way.
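To make the chain-rule view concrete, here is a minimal sketch (my own illustration, not code from the linked book) for a single sigmoid neuron with a squared-error loss. The backward pass multiplies the local derivatives along the computation graph, and gradient descent uses the result to nudge each parameter:

```python
# A minimal sketch: one neuron a = sigmoid(w*x + b) with loss
# L = 0.5*(a - y)^2. Backpropagation is the chain rule:
# dL/dw = dL/da * da/dz * dz/dw.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One training pair and initial parameters (arbitrary illustrative values)
x, y_true = 1.5, 0.0
w, b, lr = 0.8, 0.1, 0.5

for step in range(50):
    # Forward pass: walk the computation graph z -> a -> L
    z = w * x + b
    a = sigmoid(z)
    loss = 0.5 * (a - y_true) ** 2

    # Backward pass: chain rule, applied right to left through the graph
    dL_da = a - y_true        # derivative of 0.5*(a - y)^2 w.r.t. a
    da_dz = a * (1.0 - a)     # derivative of the sigmoid
    dz_dw = x                 # derivative of z = w*x + b w.r.t. w
    dL_dw = dL_da * da_dz * dz_dw
    dL_db = dL_da * da_dz * 1.0

    # Gradient descent: adjust each parameter in proportion to
    # how strongly it affects the final cost
    w -= lr * dL_dw
    b -= lr * dL_db

print(f"final loss: {loss:.6f}, w: {w:.3f}, b: {b:.3f}")
```

Running this, the loss shrinks toward zero step by step; in a real network the same multiply-local-derivatives pattern is just repeated layer by layer, backwards from the loss.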