Epochs and Iterations in Deeplearning4j

Submitted by 你说的曾经没有我的故事 on 2020-01-15 03:21:28

Question


I recently started learning Deeplearning4j and I fail to understand how the concepts of epoch and iteration are actually implemented. The online documentation says:

an epoch is a complete pass through a given dataset ...
Not to be confused with an iteration, which is simply one update of the neural net model’s parameters.

I ran training using a MultipleEpochsIterator with 1 epoch, miniBatchSize = 1, and a dataset of 1000 samples, so I expected the training to finish after 1 epoch and 1000 iterations. But after more than 100,000 iterations it was still running.

int nEpochs = 1;
int miniBatchSize = 1;

MyDataSetFetcher fetcher = new MyDataSetFetcher(xDataDir, tDataDir, xSamples, tSamples);
//The same batch size set here was set in the model
BaseDatasetIterator baseIterator = new BaseDatasetIterator(miniBatchSize, sampleSize, fetcher);

MultipleEpochsIterator iterator = new MultipleEpochsIterator(nEpochs, baseIterator);
model.fit(iterator);

Then I did more tests, changing the batch size, but that didn't change the frequency of the log lines printed by the IterationListener. I had thought that if I increased the batch size to 100, then with 1000 samples I would have just 10 parameter updates and therefore just 10 iterations, but the logs and the timestamp intervals are more or less the same.
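The arithmetic the questioner expects can be made explicit. A minimal, self-contained sketch (plain Java, no DL4J) of the relationship the documentation describes: one iteration is one parameter update, so one epoch over N samples with batch size B should take ceil(N / B) iterations.

```java
public class EpochMath {
    // Iterations (parameter updates) in one epoch = ceil(samples / batchSize).
    static int iterationsPerEpoch(int samples, int batchSize) {
        return (samples + batchSize - 1) / batchSize;
    }

    // Total iterations across a training run.
    static int totalIterations(int samples, int batchSize, int epochs) {
        return iterationsPerEpoch(samples, batchSize) * epochs;
    }

    public static void main(String[] args) {
        // The two scenarios from the question:
        System.out.println(totalIterations(1000, 1, 1));   // 1000
        System.out.println(totalIterations(1000, 100, 1)); // 10
    }
}
```

By this definition, raising the batch size from 1 to 100 should cut the updates per epoch from 1000 to 10, which is exactly why the unchanged log frequency is surprising.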

BTW, there is a similar question, but its answer does not actually address mine; I would like to understand the actual details better: Deeplearning4j: Iterations, Epochs, and ScoreIterationListener


Answer 1:


None of this will matter after 1.x (which is already out in alpha); we got rid of iterations long ago.

Originally it was meant to be shortcut syntax so folks wouldn't have to write for loops.

Just focus on for loops over epochs now.



Source: https://stackoverflow.com/questions/50273031/epochs-and-iterations-in-deeplearning4j
