Keras - How are batches and epochs used in fit_generator()?


Question


I have a video of 8000 frames, and I'd like to train a Keras model on batches of 200 frames each. I have a frame generator that loops through the video frame-by-frame and accumulates the (3 x 480 x 640) frames into a numpy matrix X of shape (200, 3, 480, 640) -- (batch size, rgb, frame height, frame width) -- and yields X and Y every 200th frame:

import cv2
...
def _frameGenerator(videoPath, dataPath, batchSize):
    """
    Yield X and Y data when the batch is filled.
    """
    camera = cv2.VideoCapture(videoPath)
    width = int(camera.get(3))       # frame width (get() returns floats)
    height = int(camera.get(4))      # frame height
    frameCount = int(camera.get(7))  # number of frames in the video file

    truthData = _prepData(dataPath, frameCount)

    X = np.zeros((batchSize, 3, height, width))
    Y = np.zeros((batchSize, 1))

    batch = 0
    for frameIdx, truth in enumerate(truthData):
        ret, frame = camera.read()
        if not ret:
            continue

        batchIndex = frameIdx % batchSize

        # cv2 returns frames as (height, width, 3); reorder to (3, height, width).
        X[batchIndex] = frame.transpose(2, 0, 1)
        Y[batchIndex] = truth

        # Yield once the last slot is filled, before slot 0 is overwritten.
        if batchIndex == batchSize - 1:
            batch += 1
            print "now yielding batch", batch
            yield X, Y

Here's how I run fit_generator():

batchSize = 200
print "Starting training..."
model.fit_generator(
    _frameGenerator(videoPath, dataPath, batchSize),
    samples_per_epoch=8000,
    nb_epoch=10,
    verbose=args.verbosity
)

My understanding is that an epoch finishes when samples_per_epoch samples have been seen by the model, where samples_per_epoch = batch size * number of batches = 200 * 40 = 8000. So after training for an epoch on frames 0-7999, the next epoch will start training again from frame 0. Is this correct?

With this setup I expect 40 batches (of 200 frames each) to be passed from the generator to fit_generator, per epoch; this would be 8000 total frames per epoch -- i.e., samples_per_epoch=8000. Then for subsequent epochs, fit_generator would reinitialize the generator such that we begin training again from the start of the video. Yet this is not the case. After the first epoch is complete (after the model logs batches 0-24), the generator picks up where it left off. Shouldn't the new epoch start again from the beginning of the training dataset?

If there is something incorrect in my understanding of fit_generator please explain. I've gone through the documentation, this example, and these related issues. I'm using Keras v1.0.7 with the TensorFlow backend. This issue is also posted in the Keras repo.


Answer 1:


After the first epoch is complete (after the model logs batches 0-24), the generator picks up where it left off

This is an accurate description of what happens. If you want to reset or rewind the generator, you'll have to do that inside the generator yourself. Note that Keras's behavior is quite useful in many situations. For example, you can end an epoch after seeing half of the data and then do an epoch on the other half; this would be impossible if the generator state were reset, and it can be useful for monitoring the validation loss more closely.
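To make the continuation concrete, here is a minimal plain-Python sketch (no Keras; the names and data are made up for illustration) of how a single long-lived generator is consumed across two "epochs" — the generator keeps its own position, so the second epoch resumes where the first stopped rather than rewinding to sample 0:

```python
# fit_generator simply keeps calling next() on one long-lived generator,
# so epoch boundaries do not reset the generator's internal position.
def batch_generator(frames, batch_size):
    for start in range(0, len(frames), batch_size):
        yield frames[start:start + batch_size]

frames = list(range(8))                   # stand-in for 8 video frames
gen = batch_generator(frames, batch_size=2)

epoch_1 = [next(gen) for _ in range(2)]   # one "samples_per_epoch" worth of batches
epoch_2 = [next(gen) for _ in range(2)]   # same generator object, no reset

print(epoch_1)   # [[0, 1], [2, 3]]
print(epoch_2)   # [[4, 5], [6, 7]]  <- picks up where epoch 1 left off
```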




Answer 2:


You can force your generator to reset itself by wrapping its body in a while 1: loop; that's how I proceed. That way the generator keeps yielding batched data indefinitely, starting over from the first sample on each pass through the data.
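As a sketch of that pattern (plain Python stand-in for the asker's frame generator; the names are illustrative, not from the original answer), the outer while 1 is what rewinds the stream — once the inner loop has walked through every frame, control returns to the top and the next pass starts again from frame 0:

```python
def frame_generator(frames, batch_size):
    # Outer loop never ends: after a full pass over the data,
    # the generator starts again from the first frame.
    while 1:
        batch = []
        for frame in frames:              # stand-in for the camera.read() loop
            batch.append(frame)
            if len(batch) == batch_size:
                yield batch
                batch = []

gen = frame_generator(list(range(6)), batch_size=3)
first_pass = [next(gen), next(gen)]       # frames 0-2, then frames 3-5
rewound = next(gen)                       # back to frames 0-2

print(first_pass)   # [[0, 1, 2], [3, 4, 5]]
print(rewound)      # [0, 1, 2]
```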




Answer 3:


Because the generator is a completely separate function, it simply carries on with its infinite loop whenever it is called again.

What I couldn't explain at first is how fit_generator() decides when it has enough samples. There is no batch_size variable here; instead, steps_per_epoch determines how many times the generator is called (i.e., how many batches are drawn) per epoch, regardless of the size of each yielded batch.

I checked this by printing a message on each loop iteration:

def generator():
    while 1:
        for i in range(0, len(x_v) - 1):
            if i != predict_batch_nr:
                print("\n -> using dataset ", i + 1, " of ", len(x_v))
                x = x_v[i]  # x_v holds batches of different lengths
                y = y_v[i]  # y_v holds batches of different lengths
                yield x, y

model.fit_generator(generator(), steps_per_epoch=5000, epochs=20, verbose=1)

Example output is:

4914/5000 [============================>.] - ETA: 13s - loss: 2442.8587
 -> using dataset  77  of  92
4915/5000 [============================>.] - ETA: 12s - loss: 2442.3785
 -> using dataset  78  of  92
 -> using dataset  79  of  92
 -> using dataset  80  of  92
4918/5000 [============================>.] - ETA: 12s - loss: 2442.2111
 -> using dataset  81  of  92
 -> using dataset  82  of  92
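The counting behavior shown in that log can be mimicked without Keras (a sketch with made-up data): each yield advances the progress bar by one step toward steps_per_epoch, no matter how many samples the yielded batch contains.

```python
def generator(batches):
    while 1:                      # loop forever, as in the answer above
        for b in batches:
            yield b

batches = [[1] * 3, [2] * 5, [3] * 2]     # batches of different lengths
gen = generator(batches)

steps_per_epoch = 4
epoch = [next(gen) for _ in range(steps_per_epoch)]

print(len(epoch))                 # 4 -- one step per yield
print([len(b) for b in epoch])    # [3, 5, 2, 3] -- batch sizes can vary
```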


Source: https://stackoverflow.com/questions/38936016/keras-how-are-batches-and-epochs-used-in-fit-generator
