BatchNorm when retraining VGG model: change mean?

Just finishing up lesson 3, where Jeremy talks about batch normalization. Looking at the hard-coded mean in the VGG preprocessing, I wondered whether it would be better to recompute those means from the new dataset being applied, rather than keeping the original values. Is this necessary? If so, is it easy to do?
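For concreteness, here is a minimal sketch of what "adjusting the means" could look like. The ImageNet values below are the ones the course's `vgg_preprocess` hard-codes; `channel_mean` and `preprocess` are hypothetical helpers, not part of the course code, and assume images as a `(N, 3, H, W)` NumPy array:

```python
import numpy as np

# Hard-coded ImageNet per-channel mean (the values used in the
# course's vgg_preprocess), shown here for comparison.
imagenet_mean = np.array([123.68, 116.779, 103.939],
                         dtype=np.float32).reshape((3, 1, 1))

def channel_mean(images):
    """Per-channel mean of a batch shaped (N, 3, H, W) -- a hypothetical
    helper for recomputing the mean from the new dataset."""
    return images.mean(axis=(0, 2, 3)).reshape((3, 1, 1))

def preprocess(images, mean):
    """Subtract a per-channel mean, as vgg_preprocess does with
    the ImageNet one."""
    return images - mean

# Example: center a batch using its own mean instead of ImageNet's.
batch = (np.random.rand(8, 3, 224, 224) * 255).astype(np.float32)
new_mean = channel_mean(batch)
centered = preprocess(batch, new_mean)
```

Whether this helps presumably depends on how far the new dataset's statistics are from ImageNet's; for natural images fine-tuned from a VGG checkpoint, the hard-coded mean is often close enough.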


Good question @tastingsilver, I would also like to know what to do about the mean image during transfer learning.

In the meantime, could you try both approaches and see how the loss/accuracy behave?