Due to RAM limitations, I followed these instructions and built a generator that draws small batches and passes them to Keras's fit_generator. But Keras can't use it: it raises an "object is not an iterator" error.
I have figured it out. My model was built with

from keras.models import Model

but the generator was subclassed from tf.keras.utils.Sequence. Mixing the standalone keras package with tf.keras causes the bug! So just change the generator's base class from

class DataGenerator(tf.keras.utils.Sequence):

to

class DataGenerator(keras.utils.Sequence):
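
As a minimal sketch of the consistent pairing (the data, shapes, and DataGenerator body below are illustrative, not from the question), everything is drawn from tf.keras; the equivalent all-keras version works the same way:

import numpy as np
import tensorflow as tf

# Generator and model both come from tf.keras -- never mixed with the
# standalone keras package.
class DataGenerator(tf.keras.utils.Sequence):
    def __init__(self, x, y, batch_size=32):
        self.x, self.y, self.batch_size = x, y, batch_size

    def __len__(self):
        # Number of batches per epoch; must be an int.
        return int(np.ceil(len(self.x) / self.batch_size))

    def __getitem__(self, idx):
        s = slice(idx * self.batch_size, (idx + 1) * self.batch_size)
        return self.x[s], self.y[s]

inputs = tf.keras.Input(shape=(10,))
outputs = tf.keras.layers.Dense(1)(inputs)
model = tf.keras.Model(inputs, outputs)  # tf.keras Model, matching the Sequence
model.compile(optimizer="adam", loss="mse")

model.fit(DataGenerator(np.random.rand(100, 10), np.random.rand(100, 1)))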
I received the same "object is not an iterator" error from my generator class, which inherited from keras.utils.Sequence. Neither adding a __next__ method nor switching between keras.utils.Sequence and tf.keras.utils.Sequence helped. In my case, __getitem__ was not implemented correctly: while trying to use all the data, the last batch was a partial batch that I was not handling properly. Once I handled it correctly, the "object is not an iterator" error went away. So I suggest you carefully inspect your __getitem__() implementation and make sure it works for every index value it can receive, including the one for the final, possibly partial, batch. A sketch of this follows.
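
To make that concrete, here is a minimal sketch of a partial-batch-safe __getitem__ (FileSequence and its attributes are hypothetical names, not from the question). Python slicing already clamps out-of-range end indices, so the main requirements are that __len__ uses ceiling division and __getitem__ never assumes a full batch:

import numpy as np
from tensorflow.keras.utils import Sequence

class FileSequence(Sequence):  # hypothetical example class
    def __init__(self, filenames, labels, batch_size):
        self.filenames, self.labels = filenames, labels
        self.batch_size = batch_size

    def __len__(self):
        # Ceiling division so the trailing partial batch gets its own index.
        return int(np.ceil(len(self.filenames) / self.batch_size))

    def __getitem__(self, idx):
        start = idx * self.batch_size
        end = min(start + self.batch_size, len(self.filenames))  # clamp the last batch
        batch_x = self.filenames[start:end]  # may contain fewer than batch_size items
        batch_y = self.labels[start:end]
        # Load and preprocess batch_x here; the last batch is simply shorter.
        return np.array(batch_x), np.array(batch_y)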
I was having the same problem; I managed to solve it by defining a __next__ method:
import numpy as np
from skimage.io import imread
from skimage.transform import resize
from tensorflow.keras.utils import Sequence  # or keras.utils, matching your model's package

class My_Generator(Sequence):

    def __init__(self, image_filenames, labels, batch_size):
        self.image_filenames, self.labels = image_filenames, labels
        self.batch_size = batch_size
        self.n = 0                 # current position for the iterator protocol
        self.max = self.__len__()  # total number of batches

    def __len__(self):
        # Must return an int; np.ceil alone returns a float.
        return int(np.ceil(len(self.image_filenames) / float(self.batch_size)))

    def __getitem__(self, idx):
        batch_x = self.image_filenames[idx * self.batch_size:(idx + 1) * self.batch_size]
        batch_y = self.labels[idx * self.batch_size:(idx + 1) * self.batch_size]
        return np.array([
            resize(imread(file_name), (200, 200))
            for file_name in batch_x]), np.array(batch_y)

    def __next__(self):
        # Wrap around once every batch has been served, so the
        # generator can be iterated indefinitely.
        if self.n >= self.max:
            self.n = 0
        result = self.__getitem__(self.n)
        self.n += 1
        return result
Note that I have declared two new variables, self.n and self.max, in the __init__ method.
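
For context, a short usage sketch (model, image_filenames, and labels are assumed to exist and are not from the answer): with __len__, __getitem__, and __next__ all defined, the instance can be consumed both as a Keras Sequence and as a plain iterator, which is what fit_generator expects.

# Assuming a compiled Keras model named model and real image files on
# disk behind image_filenames/labels (both hypothetical here).
gen = My_Generator(image_filenames, labels, batch_size=32)

batch_x, batch_y = gen[0]     # Sequence-style indexed access via __getitem__
batch_x, batch_y = next(gen)  # iterator-style access via __next__
model.fit_generator(gen, epochs=5)  # or model.fit(gen) on newer Keras versions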