Generating data for training in Keras
Question: My training set is really quite large. The entire thing takes up about 120 GB of RAM, so I can't even allocate the numpy.zeros() array to store the data. From what I've seen, a generator works well when the entire dataset is already loaded into an array and is then fed into the network incrementally, with each batch deleted afterwards. Is it alright for the generator to create the arrays, insert the data, load the data into the network, and then delete the data? Or will that whole process take too long?
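The pattern being asked about can be sketched as a plain Python generator that builds each batch only when it is needed, so nothing larger than one batch ever lives in RAM. This is a minimal sketch, not the asker's actual pipeline: the shapes, the random stand-in data, and the `batch_generator` name are all illustrative assumptions; in a real setup the marked line would load samples from disk instead.

```python
import numpy as np

def batch_generator(n_samples, batch_size, sample_shape, n_classes=2):
    """Infinite generator: materializes one batch at a time, so peak
    memory is one batch, not the full 120 GB dataset."""
    while True:  # Keras expects the generator to loop over epochs
        for start in range(0, n_samples, batch_size):
            stop = min(start + batch_size, n_samples)
            # In a real pipeline, load these samples from disk here
            # (e.g. np.load of per-sample .npy files). Random data
            # stands in purely for illustration.
            x = np.random.rand(stop - start, *sample_shape).astype("float32")
            y = np.random.randint(0, n_classes, size=(stop - start,))
            yield x, y  # the batch is garbage-collected after use
```

With Keras you would then pass it to training as something like `model.fit(batch_generator(100000, 32, (224, 224, 3)), steps_per_epoch=100000 // 32, epochs=5)`; since each batch is created, consumed, and freed in turn, whether this is "too slow" depends almost entirely on how fast the per-batch loading step is, not on the generator mechanism itself.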