Generating data for training in Keras

Submitted by 做~自己de王妃 on 2021-01-28 06:39:56

Question


My training set is really quite large. The entire thing takes up about 120GB of RAM and so I can't even generate the numpy.zeros() array to store the data.

From what I've seen, generators are usually shown with the entire dataset already loaded into an array, from which batches are incrementally fed to the network and then discarded.

Is it alright for the generator to create the arrays, fill them with data, feed them to the network, and then delete them? Or will that whole process be too slow, so I should be doing something else?

Thanks


Answer 1:


You do not need to load the whole dataset at once; you can load only as much as each batch needs. Check out this answer.
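As a rough illustration of that idea, here is a minimal sketch of a batch generator that memory-maps the arrays on disk and copies only one batch at a time into RAM. The file paths, dtypes, and shapes are placeholders for this example, not anything from the original question; a `keras.utils.Sequence` subclass would work the same way.

```python
import numpy as np

def batch_generator(x_path, y_path, n_samples, n_features, batch_size):
    # Memory-map the arrays so only the slices we index are read into RAM,
    # not the full 120 GB dataset.
    x = np.memmap(x_path, dtype=np.float32, mode="r",
                  shape=(n_samples, n_features))
    y = np.memmap(y_path, dtype=np.float32, mode="r", shape=(n_samples,))
    while True:  # Keras expects a fit generator to loop indefinitely
        order = np.random.permutation(n_samples)  # reshuffle each epoch
        for start in range(0, n_samples, batch_size):
            batch = order[start:start + batch_size]
            # Fancy indexing a memmap copies just this batch into memory;
            # the copy is garbage-collected once the step is done.
            yield np.asarray(x[batch]), np.asarray(y[batch])
```

Such a generator could then be passed to `model.fit(...)` with `steps_per_epoch=n_samples // batch_size`, so the peak memory cost is one batch rather than the whole training set.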



Source: https://stackoverflow.com/questions/44785924/generating-data-for-training-in-keras
