Split sequential model question

Hello, I split the convolutional layers out of a model that I had and made a new Sequential model out of them. However, when I add new layers, it turns from a Sequential model into a NoneType, and then I can't do anything else with it. It doesn't matter what kind of layer I try to add.

Has anyone else experienced this problem? Am I doing something wrong? Here are my code and the results I get. The first time I print the model type and summary, the output is correct, but as soon as I add any new layer, everything changes to a NoneType. I tried adding different layer types, and the same thing happens every time.


model = Sequential(conv_layers)
model = model.add(BatchNormalization(axis=3))
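One thing I noticed while debugging: I can reproduce the same NoneType symptom without Keras at all, using a toy class whose add() method mutates the object in place (ToyModel here is just a made-up stand-in I wrote for illustration, not real Keras code). Could my problem be related to this pattern?

```python
# Toy stand-in for a model class: add() mutates the object in place
# and (like list.append) implicitly returns None.
class ToyModel:
    def __init__(self, layers=None):
        self.layers = list(layers or [])

    def add(self, layer):
        self.layers.append(layer)  # modifies self; no return statement

model = ToyModel(["conv1", "conv2"])
print(type(model))               # <class '__main__.ToyModel'>
model = model.add("batchnorm")   # reassignment captures add()'s None return
print(type(model))               # <class 'NoneType'>
```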


<class 'keras.models.Sequential'>

Layer (type) Output Shape Param #

input_44 (InputLayer) (None, None, None, 3) 0

block1_conv1 (Conv2D) (None, None, None, 64) 1792

block1_conv2 (Conv2D) (None, None, None, 64) 36928

block1_pool (MaxPooling2D) (None, None, None, 64) 0

block2_conv1 (Conv2D) (None, None, None, 128) 73856

block2_conv2 (Conv2D) (None, None, None, 128) 147584

block2_pool (MaxPooling2D) (None, None, None, 128) 0

block3_conv1 (Conv2D) (None, None, None, 256) 295168

block3_conv2 (Conv2D) (None, None, None, 256) 590080

block3_conv3 (Conv2D) (None, None, None, 256) 590080

block3_pool (MaxPooling2D) (None, None, None, 256) 0

block4_conv1 (Conv2D) (None, None, None, 512) 1180160

block4_conv2 (Conv2D) (None, None, None, 512) 2359808

block4_conv3 (Conv2D) (None, None, None, 512) 2359808

block4_pool (MaxPooling2D) (None, None, None, 512) 0

Total params: 7,635,264.0
Trainable params: 0.0
Non-trainable params: 7,635,264.0

<class 'NoneType'>

AttributeError Traceback (most recent call last)
in ()
13 model = model.add(BatchNormalization(axis=3))
14 print(type(model))
---> 15 model.summary()
17 #model.compile(optimizer=Adam(), loss='categorical_crossentropy', metrics=['accuracy'])

AttributeError: 'NoneType' object has no attribute 'summary'

What am I doing wrong here?

Thanks, Christina