Cannot allocate memory in Statefarm Notebook

In the Statefarm competition notebook, I compile the first model and run it like so:

from keras.models import Sequential
from keras.layers import BatchNormalization, Convolution2D, MaxPooling2D, Flatten, Dense
from keras.optimizers import Adam

def conv1(batches):
    model = Sequential([
        BatchNormalization(axis=1, input_shape=(3,224,224)),
        Convolution2D(32,3,3, activation='relu'),
        BatchNormalization(axis=1),
        MaxPooling2D((3,3)),
        Convolution2D(64,3,3, activation='relu'),
        BatchNormalization(axis=1),
        MaxPooling2D((3,3)),
        Flatten(),
        Dense(200, activation='relu'),
        BatchNormalization(),
        Dense(10, activation='softmax')
    ])

    model.compile(Adam(lr=1e-4), loss='categorical_crossentropy', metrics=['accuracy'])
    model.fit_generator(batches, batches.nb_sample, nb_epoch=2, validation_data=val_batches,
                        nb_val_samples=val_batches.nb_sample)
    model.optimizer.lr = 0.001
    model.fit_generator(batches, batches.nb_sample, nb_epoch=4, validation_data=val_batches,
                        nb_val_samples=val_batches.nb_sample)
    return model

model = conv1(batches)

I get the following error:

ERROR (theano.gof.cmodule): [Errno 12] Cannot allocate memory

This problem persists even if I restart the notebook and do garbage collection.
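To be concrete, this is roughly what I mean by garbage collection (just a sketch; model is whatever variable still references the Keras model):

import gc

del model      # drop the Python reference to the model
gc.collect()   # ask the garbage collector to free what it can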

I can examine memory usage like so:

import os
import psutil

def mem_usage():
    # Resident set size of the current process, in GB
    process = psutil.Process(os.getpid())
    return process.memory_info().rss/1e9

mem_usage()

And I get 48.7248445, which I take to be about 48.7 GB of resident memory, but I have no sense of whether that is a lot or a little. I'm running a standard p2.xlarge instance. Has anybody else had this problem? What is the easiest way to fix it?
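For context, psutil can also report the machine's total and available RAM, which would make it easier to judge that number against the roughly 61 GiB a p2.xlarge has. A quick sketch, reusing the mem_usage helper above:

import psutil

vm = psutil.virtual_memory()
print('total RAM (GB):     ', vm.total / 1e9)
print('available RAM (GB): ', vm.available / 1e9)
print('this process (GB):  ', mem_usage())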

I just restarted Jupyter and it worked fine. Also, Jeremy says we should be using batches in this thread: ERROR (theano.gof.cmodule): [Errno 12] Cannot allocate memory
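In case it helps anyone else, here is roughly what I understand that advice to mean: recreate the generators with a smaller batch_size so less data is held in memory at once. This is just a sketch, assuming the course's utils.get_batches helper and that path points at the statefarm data directory:

from utils import get_batches

# Smaller batches mean less data held in memory per training step.
batch_size = 32
batches = get_batches(path + 'train', batch_size=batch_size)
val_batches = get_batches(path + 'valid', batch_size=batch_size, shuffle=False)

model = conv1(batches)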