Dogscats-ensemble.ipynb weights error

Question 1:

for i in range(5):
    i = str(i)
    model = train_last_layer(i)
    train_dense_layers(i, model)

When I ran the above code, I got the following error:

ValueError                                Traceback (most recent call last)
<ipython-input> in <module>()
      2     i = str(i)
      3     model = train_last_layer(i)
----> 4     train_dense_layers(i, model)

<ipython-input> in train_dense_layers(i, model)
      5     for l1,l2 in zip(fc_model.layers, fc_layers):
      6         weights = l2.get_weights()
----> 7         l1.set_weights(weights)
      8     fc_model.compile(optimizer=Adam(1e-5), loss='categorical_crossentropy',
      9                      metrics=['accuracy'])

D:\Anaconda2\lib\site-packages\keras\engine\topology.pyc in set_weights(self, weights)
    973                              str(len(params)) +
    974                              ' weights. Provided weights: ' +
--> 975                              str(weights)[:50] + '...')
    976         if not params:
    977             return

ValueError: You called `set_weights(weights)` on layer "batchnormalization_3" with a weight list of length 0, but the layer was expecting 4 weights. Provided weights: []...
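
If I read the error correctly, zip(fc_model.layers, fc_layers) is pairing a BatchNormalization layer, which carries 4 weight arrays (gamma, beta, moving mean, moving std), with a layer that has no weights at all, so the two layer lists must be out of alignment. To see where they diverge, I added this small debugging loop just before the set_weights call (my own snippet, not from the notebook):

# Print each pair of layers that zip() lines up, together with their
# weight counts, to find where the two lists go out of step.
for l1, l2 in zip(fc_model.layers, fc_layers):
    print(type(l1).__name__, len(l1.get_weights()),
          '<-', type(l2).__name__, len(l2.get_weights()))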

I checked the code given on GitHub carefully, and I think something is wrong with the network architecture, so I made two changes:

def train_last_layer(i): this function calls model.pop() three times in a row; I deleted one of the model.pop() calls.
def get_fc_layers(p, in_shape): in this function, I deleted the first BatchNormalization().

Then the code runs successfully. I wonder whether it was right to change the code this way.
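
Concretely, going by the layer list shown under network1 below, my edited get_fc_layers looks roughly like this (a sketch of my change, not the notebook's original code; p and in_shape are the notebook's own arguments):

def get_fc_layers(p, in_shape):
    return [
        MaxPooling2D(input_shape=in_shape),
        Flatten(),
        Dense(4096, activation='relu'),
        # the first BatchNormalization() used to be here; I deleted it
        Dropout(p),
        Dense(4096, activation='relu'),
        BatchNormalization(),
        Dropout(p),
        Dense(2, activation='softmax'),
    ]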

Question 2:

network1:
MaxPooling2D(input_shape=in_shape),
Flatten(),
Dense(4096, activation='relu'),
BatchNormalization(),
Dropout(p),
Dense(4096, activation='relu'),
BatchNormalization(),
Dropout(p),
Dense(2, activation='softmax')

network2:
MaxPooling2D(input_shape=in_shape),
Flatten(),
Dense(4096, activation='relu'),
Dropout(p),
Dense(4096, activation='relu'),
Dropout(p),
Dense(2, activation='softmax')

Can these two networks share the same weights?
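
What I had in mind is something like the following: the Dense layers have identical shapes in both stacks, so their weights could be copied across, while the BatchNormalization layers in network1 would keep their own 4 parameters. (A sketch assuming two Sequential models built from the two layer lists above; the names net1 and net2 are mine.)

from keras.layers import Dense

# Copy the Dense-layer weights from net2 (no BatchNorm) into net1
# (with BatchNorm); the BatchNormalization layers keep their own
# initialization, since net2 has nothing that matches them.
dense1 = [l for l in net1.layers if isinstance(l, Dense)]
dense2 = [l for l in net2.layers if isinstance(l, Dense)]
for d1, d2 in zip(dense1, dense2):
    d1.set_weights(d2.get_weights())

I am not sure whether the Dense weights still make sense once BatchNormalization layers are inserted between them and change the activation statistics, which is really what my question is about.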