Neural Network learning rate and batch weight update
I have programmed a neural network in Java and am now working on the back-propagation algorithm. I've read that batch updates of the weights give a more stable gradient search than online weight updates.

As a test I've created a time series function of 100 points, such that x = [0..99] and y = f(x). For testing I've created a neural network with one input, one output, and two hidden layers of 10 neurons each.

What I am struggling with is the learning rate of the back-propagation algorithm when tackling this problem. I have 100 input points, so when I calculate the weight change dw_
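To make clear what I mean by a batch update, here is a minimal sketch (not my actual code; names like `backpropagate` and `trainEpoch` are just placeholders): the per-sample gradients are accumulated over all 100 points and the weights are updated once per epoch. In this sketch I divide the accumulated gradient by the batch size, i.e. average it, which I understand is one common way to keep the step size comparable to the online case, but I'm not sure whether that's the right convention, which is part of my question.

```java
/** Hypothetical sketch of a batch (epoch-wise) weight update with gradient averaging. */
public class BatchUpdateSketch {

    /** Placeholder for the per-sample gradient dE/dw from back-propagation. */
    static double[] backpropagate(double input, double target, double[] weights) {
        // A real implementation would do a forward pass and a backward pass here.
        return new double[weights.length];
    }

    static void trainEpoch(double[] xs, double[] ys, double[] weights, double learningRate) {
        int batchSize = xs.length;                       // e.g. the 100 training points
        double[] gradientSum = new double[weights.length];

        // Accumulate the per-sample gradients over the whole batch.
        for (int p = 0; p < batchSize; p++) {
            double[] grad = backpropagate(xs[p], ys[p], weights);
            for (int w = 0; w < weights.length; w++) {
                gradientSum[w] += grad[w];
            }
        }

        // One update per epoch: averaging (dividing by batchSize) keeps the
        // effective step size from growing with the number of training points.
        for (int w = 0; w < weights.length; w++) {
            weights[w] -= learningRate * gradientSum[w] / batchSize;
        }
    }
}
```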