State Farm Full: how to avoid running out of memory with VGG + (da_batches.samples*5)?

Your code tries to augment all of your images at once, and when you multiply the sample count by 5 the resulting array gets too big. I think the augmentation is happening on the CPU, and your 16 GB of RAM can't hold it.
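As a rough sanity check (a sketch only; the ~20,000 image count is an assumption about the State Farm training split, and it assumes VGG's 224x224x3 input stored as float32):

# back-of-the-envelope memory estimate for holding all augmented images at once
n_images = 20000                       # assumption: approximate size of the train set
copies = 5                             # the *5 augmentation factor
bytes_per_image = 224 * 224 * 3 * 4    # float32 = 4 bytes per value
total_gb = n_images * copies * bytes_per_image / 2**30
print('~%.0f GB' % total_gb)           # roughly 56 GB, far more than 16 GB of RAM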

Take a look at https://keras.io/preprocessing/image/ and try doing the augmentation manually in smaller batches that your machine can handle:

from keras.preprocessing.image import ImageDataGenerator

# example augmentation settings; adjust to taste
datagen = ImageDataGenerator(rotation_range=10, width_shift_range=0.1,
                             height_shift_range=0.1, zoom_range=0.1)

batch_size = 32
for e in range(epochs):
    print('Epoch', e)
    batches = 0
    # flow() yields one augmented batch at a time, so only
    # batch_size images are ever held in memory
    for x_batch, y_batch in datagen.flow(x_train, y_train, batch_size=batch_size):
        model.fit(x_batch, y_batch)
        batches += 1
        if batches >= len(x_train) / batch_size:
            # we need to break the loop by hand because
            # the generator loops indefinitely
            break
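If you'd rather not write the loop yourself, Keras can drive the generator for you with fit_generator (a minimal sketch, assuming the same datagen, x_train, y_train and model as above; the argument names shown are the Keras 2 ones):

# fit_generator pulls batches from the generator itself,
# so it never materializes the whole augmented dataset
model.fit_generator(datagen.flow(x_train, y_train, batch_size=32),
                    steps_per_epoch=len(x_train) // 32,
                    epochs=epochs)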

I put all the data into a variable first and then used this, but you should be able to read the batches straight from a directory instead.
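Something like this should work for reading from disk (a sketch; the 'data/train' path and the target size are assumptions, adjust them to your setup):

# flow_from_directory reads and augments images lazily, batch by batch
train_batches = datagen.flow_from_directory('data/train',
                                            target_size=(224, 224),
                                            batch_size=32,
                                            class_mode='categorical')
model.fit_generator(train_batches,
                    steps_per_epoch=train_batches.samples // 32,
                    epochs=epochs)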
