VGG16BN MNIST ensemble stack: The shape of the input to "Flatten" is not fully defined

I’m getting the error below when building the VGG16BN layer structure on the MNIST dataset.

ValueError: The shape of the input to "Flatten" is not fully defined (got (512, 0, 0). Make sure to pass a complete "input_shape" or "batch_input_shape" argument to the first layer in your model.

Specifically, when I add only one group of layers (a ConvBlock), the model compiles and runs. If I add a second ConvBlock of layers, the error above results.

# ConvBlock(3, 512) = 3 sets of Convolution2D layers with filters=512
ZeroPadding2D((1, 1)),
Convolution2D(512, 3, 3, activation='relu'),
ZeroPadding2D((1, 1)),
Convolution2D(512, 3, 3, activation='relu'),
ZeroPadding2D((1, 1)),
Convolution2D(512, 3, 3, activation='relu'),
MaxPooling2D((2, 2), strides=(2, 2)),
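
For reference, here is a minimal sketch of how I assume the ConvBlock helper behind that comment is defined (hypothetical signature; the real one may take the model implicitly), using the Keras 1.x API:

from keras.layers import ZeroPadding2D, Convolution2D, MaxPooling2D

# Hypothetical helper: ConvBlock(model, 3, 512) appends three padded 3x3
# convolutions with 512 filters each, then a 2x2/stride-2 max pool.
def ConvBlock(model, layers, filters):
    for _ in range(layers):
        model.add(ZeroPadding2D((1, 1)))
        model.add(Convolution2D(filters, 3, 3, activation='relu'))
    # Each block ends by halving the spatial dimensions of its input.
    model.add(MaxPooling2D((2, 2), strides=(2, 2)))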

The Jupyter notebook page below shows three models:

model 1: working cut-down version of VGG16BN with only one ConvBlock(3, 512) group of layers [works]
model 2: full replication of the VGG16BN layer stack [fails]
model 3: cut-down version of the VGG16BN layer stack, stopped just before the failing Flatten layer, for diagnosis

My debugging has stalled. I’m tempted to port from Keras 1.2.2 to the current Keras 2.1.2 to see whether that solves the problem.

Using this layer structure on the Kaggle Digit Recognizer problem, my work-in-progress results are below; they are close to the best non-cheating solutions posted.
acc : 0.993941638669
loss : 0.0300066995141
val_acc : 0.988809524037
val_loss : 0.106686315198

This happens when your input image is too small. The network halves the spatial size of the input several times, once per max-pooling layer. If it expects a 224x224 image and halves it 5 times, the final output size is 7x7. But if you give it a smaller image, say 28x28, then halving it five times gives an image of size 0x0 (since 28/2**5 = 0.875, which rounds down to 0).
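
To make the halving arithmetic concrete, here is a quick plain-Python sketch (my own illustration, assuming VGG16's five MaxPooling2D((2, 2), strides=(2, 2)) stages, each of which floor-divides the spatial size by two):

# Five 2x2/stride-2 pooling stages, as in VGG16; each floor-divides
# the spatial height and width by two.
for size in (224, 28):
    s = size
    for _ in range(5):
        s //= 2
    print('%dx%d -> %dx%d' % (size, size, s, s))
# Prints: 224x224 -> 7x7, then 28x28 -> 0x0
# (hence the (512, 0, 0) shape in the Flatten error)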

THANKS!!!

Now I’m embarrassed to realise I should have recognised that scenario.

Now I have to go back and relearn how that happens.