Looping through training data in Neural Networks Backpropagation Algorithm

Submitted by 99封情书 on 2019-12-05 02:09:04

Training the network

You should use each instance of the training set once per training epoch.

A training epoch is a complete cycle through your dataset.

After you've looped through the dataset and calculated the deltas, you should adjust the weights of the network. Then you may perform a new forward pass on the neural network and do another training epoch, looping through your training dataset.
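The epoch structure described above can be sketched as a short loop. This is a minimal illustration, not a full backpropagation implementation: it assumes a single sigmoid layer and synthetic data, and names like `W`, `lr`, and the dataset shapes are all made up for the example.

```python
import numpy as np

# Illustrative toy setup: 60 instances, 3 features, one sigmoid unit.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))
y = (X.sum(axis=1) > 0).astype(float)  # toy binary targets
W = rng.normal(size=3)
lr = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(10):                  # one epoch = one full pass over the data
    out = sigmoid(X @ W)                 # forward pass over the whole dataset
    delta = (out - y) * out * (1 - out)  # output-layer deltas
    W -= lr * (X.T @ delta) / len(X)     # adjust weights, then start the next epoch
```

Each iteration of the outer loop is one training epoch: a forward pass, delta calculation, and a weight adjustment before the next pass begins.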

Graphical representation
A really great graphical representation of backpropagation may be found at this link.


Single-step training

There are two approaches to training your network to perform classification on a dataset. The easiest method is called single-step or online learning. This is the method you will find in most literature, and it is also the fastest to converge. As you train your network, you calculate the deltas for each layer and adjust the weights for every single instance of your dataset.

Thus if you have a dataset of 60 instances, this means you should have adjusted the weights 60 times before the training epoch is over.
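In code, single-step learning means the weight update sits inside the per-instance loop. A minimal sketch, again assuming a single sigmoid unit and synthetic data (all names illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 3))            # 60 training instances
y = (X.sum(axis=1) > 0).astype(float)
W = rng.normal(size=3)
lr = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

updates = 0
for x_i, y_i in zip(X, y):              # one weight update per instance
    out = sigmoid(x_i @ W)
    delta = (out - y_i) * out * (1 - out)
    W -= lr * delta * x_i               # adjust weights immediately
    updates += 1
# updates == 60: the weights were adjusted 60 times in this one epoch
```

With 60 instances, the weights are adjusted 60 times before the epoch ends, matching the count described above.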

Batch training

The other approach is called batch training or offline learning. This approach often yields a network with a lower residual error. When you train the network you should calculate the deltas for each layer for every instance of the dataset, and then finally average the individual deltas and correct the weights once per epoch.

If you have a dataset of 60 instances, this means you should have adjusted the weights once before the training epoch is over.
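Batch training moves the update outside the loop: deltas are accumulated over every instance, averaged, and applied once per epoch. The same toy setup as before, with all names illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 3))
y = (X.sum(axis=1) > 0).astype(float)
W = rng.normal(size=3)
lr = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

grad = np.zeros_like(W)
for x_i, y_i in zip(X, y):              # accumulate deltas; no updates yet
    out = sigmoid(x_i @ W)
    delta = (out - y_i) * out * (1 - out)
    grad += delta * x_i
W -= lr * grad / len(X)                 # single averaged update per epoch
```

The only structural difference from the single-step sketch is where the `W -=` line lives: here it runs once per epoch instead of once per instance.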
