I have the following code:
from sklearn.model_selection import train_test_split
from scipy.misc import imresize

def _chunks(l, n):
    """Yield successive n-sized chunks from l."""
    for i in range(0, len(l), n):
        yield l[i:i + n]
A note about this issue in case others come to this page chasing it. The StopIteration bug is a known issue in Keras that can sometimes be fixed by making sure your number of samples is an integer multiple of your batch size (a minimal sketch of that adjustment is at the end of this answer). If that does not fix the issue, one thing I have found is that odd file formats that can't be read by the data generator will also sometimes cause a StopIteration error. To fix this, I run a script on my training folder that converts all of the images to a standard file type (JPG or PNG) prior to training. It looks something like this.
import glob
from PIL import Image

d = 1
for sample in glob.glob(r'C:\Users\Jeremiah\Pictures\training\classLabel_unformatted\*'):
    # Re-save every image as a sequentially numbered PNG in the formatted folder
    im = Image.open(sample)
    im.save(r'C:\Users\Jeremiah\Pictures\training\classLabel_formatted\%s.png' % d)
    d = d + 1
I've found that running this script, or something like it, drastically reduces the frequency of these errors, especially when my training data comes from somewhere like Google Image Search.
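For the batch-size fix mentioned at the top, something like the sketch below is what I mean. It assumes a Keras ImageDataGenerator reading from a directory; the folder path, image size, and batch size are placeholders for your own values, and the final fitting call is left commented out since the model itself isn't shown here.

from keras.preprocessing.image import ImageDataGenerator

# Placeholder values -- swap in your own directory, image size, and batch size
train_dir = r'C:\Users\Jeremiah\Pictures\training'
batch_size = 32

datagen = ImageDataGenerator(rescale=1. / 255)
train_gen = datagen.flow_from_directory(train_dir,
                                        target_size=(150, 150),
                                        batch_size=batch_size,
                                        class_mode='categorical')

# Keep steps_per_epoch consistent with how many full batches the generator
# can actually supply; asking for more batches than exist is a common way
# to end up with a StopIteration mid-epoch.
steps_per_epoch = train_gen.samples // batch_size

# model.fit_generator(train_gen, steps_per_epoch=steps_per_epoch, epochs=10)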