Lesson 3 - Can I change the p value of a Dropout layer?

Hello,

I have two questions about removing dropout in lesson 3.
Thanks in advance.

Q1:
To remove the dropout, instead of creating a new model, can I just set the p attribute of the dropout layers in the existing model to zero? Will this work?

model.layers[34].p = 0
model.layers[36].p = 0
for layer in model.layers[31:]: layer.trainable = True
for l1, l2 in zip(model.layers[31:], fc_layers): l1.set_weights(proc_wgts(l2))
model.compile(optimizer=opt, loss='categorical_crossentropy', metrics=['accuracy'])
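To explain what I expect to happen: my understanding is that a Dropout layer with p=0 keeps every unit, so at training time it should act as the identity. A minimal NumPy sketch of classical (non-inverted) training-time dropout, just to illustrate the intent (function and variable names are mine, not from the notebook):

```python
import numpy as np

def dropout_forward(x, p, rng):
    """Classical dropout at training time: each unit is dropped with
    probability p; surviving units pass through unscaled."""
    mask = (rng.rand(*x.shape) >= p).astype(x.dtype)
    return x * mask

rng = np.random.RandomState(0)
x = rng.rand(4)

# With p=0 the mask is all ones, so the output equals the input exactly.
assert np.allclose(dropout_forward(x, 0.0, rng), x)
```

So setting p=0 should, at least conceptually, turn the layer into a no-op; my question is whether mutating the attribute on an already-built Keras model actually takes effect.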

Q2:
In Lesson_3_Notes I found this passage:

“Everything in the lesson still applies and the rescaling of weights is still 100% accurate should we be applying classical dropout but through the inner workings of keras this step can be disregarded (if you do the rescaling you will end up with weights that are either too small or to large!)”

Does this mean the proc_wgts function used to halve the weights should be removed?
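To make sure we are talking about the same thing, here is my understanding of what proc_wgts does (a sketch of the classical-dropout rescaling; the actual notebook function may differ):

```python
import numpy as np

def proc_wgts_sketch(weights):
    """Halve every weight array of a layer, compensating for removing
    a Dropout layer that had p=0.5 (classical dropout rescaling)."""
    return [w / 2 for w in weights]

# Example with dummy kernel and bias arrays:
w = [np.ones((2, 2)), np.ones(2)]
halved = proc_wgts_sketch(w)
```

If Keras already handles the scaling internally, as the notes seem to say, then I assume this halving step would double-compensate and should be dropped, but I would like to confirm.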

Thanks,
Sean