Loading weights fails when adding batch normalization

When adding batch normalization to a model, it does not seem possible to apply previously computed weights directly. An exception is thrown about a weight mismatch; apparently batch normalization adds new layers to the model, which changes its structure.

Here is the code I wrote to add batch normalization to the fully-connected layers of VGG:

from keras.models import Sequential
from keras.layers import MaxPooling2D, Flatten, Dense, Dropout, BatchNormalization

# New fully-connected model, identical to the previous one, except with
# dropout disabled (rate 0.) and batch normalization added
fc_model = Sequential([
    MaxPooling2D(input_shape=conv_layers[-1].output_shape[1:]),
    Flatten(),
    Dense(4096, activation='relu'),
    Dropout(0.),
    BatchNormalization(),
    Dense(4096, activation='relu'),
    Dropout(0.),
    BatchNormalization(),
    Dense(2, activation='softmax')
])

# Copy the weights from the pre-trained model.
for l1, l2 in zip(fc_model.layers, fc_layers):
    # Set the weights in the new model without dropout.
    # NB: No need to resize the weights, Keras takes care of it
    l1.set_weights(l2.get_weights())
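
For what it's worth, I suspect the layer-by-layer zip falls out of alignment at the first BatchNormalization layer, since the pre-trained fc_layers contain no such layer. Here is a minimal, untested sketch of a workaround, assuming the old weights only need to go into the Dense layers and the new BatchNormalization layers can keep their fresh initialization:

from keras.layers import Dense

# Copy weights only between Dense layers, leaving the new BatchNormalization
# layers at their default initialization (scale 1, shift 0).
old_dense = [l for l in fc_layers if isinstance(l, Dense)]
new_dense = [l for l in fc_model.layers if isinstance(l, Dense)]

for l_new, l_old in zip(new_dense, old_dense):
    l_new.set_weights(l_old.get_weights())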

Here is the exception I get:

Exception                                 Traceback (most recent call last)
<ipython-input-26-57f2509e8d7b> in <module>()
     16     # Set the weights in the new model without dropout.
     17     # NB: No need to resize the weights, Keras takes care of it
---> 18     l1.set_weights(l2.get_weights())
     19 
     20 # Such a finely tuned model needs to be updated very slowly!

/home/ubuntu/anaconda2/lib/python2.7/site-packages/keras/engine/topology.pyc in set_weights(self, weights)
    877                             '" with a  weight list of length ' + str(len(weights)) +
    878                             ', but the layer was expecting ' + str(len(params)) +
--> 879                             ' weights. Provided weights: ' + str(weights)[:50] + '...')
    880         if not params:
    881             return

Exception: You called `set_weights(weights)` on layer "batchnormalization_7" with a  weight list of length 2, but the layer was expecting 4 weights. Provided weights: [array([[  3.90147837e-03,  -1.80806511e-03,   1.3...
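
If I read the error right, a BatchNormalization layer holds four weight arrays (the learned scale gamma and shift beta, plus the running mean and variance used at inference time), so set_weights fails when it receives the two arrays (kernel and bias) of a Dense layer. A quick sketch to check this, assuming the layer indexing from the model above (the exact ordering of the arrays may differ across Keras versions):

# Inspect the weights of the first BatchNormalization layer in fc_model above.
bn = fc_model.layers[4]
print([w.shape for w in bn.get_weights()])
# Expect four arrays of shape (4096,): gamma, beta, and the running statistics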

Is there a simple way in Keras to transfer weights when adding batch normalization to a model? In the lesson 3 notebook, when batch normalization is added, the weights for the new model are loaded from an .h5 file of unexplained origin (/data/jhoward/ILSVRC2012_img/bn_do3_1.h5).
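
Presumably that file was saved from a model with this same architecture, in which case loading it would just be (the path is the one mentioned in the notebook):

# Load pre-saved weights into a model with a matching architecture.
fc_model.load_weights('/data/jhoward/ILSVRC2012_img/bn_do3_1.h5')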

Sorry, I just realized that the “imagenet_batchnorm” notebook answers precisely this question.