Meaning of an Epoch in Neural Networks Training

南旧 2020-12-12 14:40

While I'm reading about how to build an ANN in PyBrain, the docs say:

Train the network for some epochs. Usually you would set something like 5 here,

2 Answers
  •  北荒 (OP)
     2020-12-12 14:56

    Sorry for reactivating this thread. I'm new to neural nets and I'm investigating the impact of mini-batch training.

    So far, as I understand it, an epoch (as runDOSrun says) is one complete pass through the TrainingSet (not the DataSet, because DataSet = TrainingSet + ValidationSet). In mini-batch training, you subdivide the TrainingSet into small batches and update the weights several times within a single epoch. Hopefully this makes the network converge faster.

    Some definitions around neural networks are outdated and, I guess, should be redefined.
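    To make the distinction concrete, here is a minimal sketch (plain NumPy, not PyBrain's API) of a training loop where each epoch is one pass over the whole training set, and mini-batches trigger several weight updates inside that epoch. The problem (linear regression via gradient descent), the data, and all hyperparameters are illustrative assumptions, not anything from the thread:

    ```python
    import numpy as np

    # Illustrative setup: 100 training samples, 3 features, a known true weight
    # vector to recover (this is NOT PyBrain code, just a generic sketch).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))              # the TrainingSet
    true_w = np.array([1.0, -2.0, 0.5])
    y = X @ true_w + rng.normal(scale=0.01, size=100)

    w = np.zeros(3)                            # weights to learn
    lr, batch_size, n_epochs = 0.1, 20, 50     # assumed hyperparameters

    for epoch in range(n_epochs):              # one epoch = one full pass over X
        idx = rng.permutation(len(X))          # shuffle once per epoch
        for start in range(0, len(X), batch_size):
            b = idx[start:start + batch_size]  # one mini-batch of 20 samples
            # mean-squared-error gradient on this mini-batch only
            grad = 2 * X[b].T @ (X[b] @ w - y[b]) / len(b)
            w -= lr * grad                     # weight update *inside* the epoch

    print(np.round(w, 2))
    ```

    With batch_size = 20 and 100 samples, the weights are updated five times per epoch instead of once; that is the "update weights inside an epoch" idea above. Setting batch_size = len(X) would recover full-batch training with exactly one update per epoch.
    
    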
