What to do with batch-normalization layers while fine-tuning the model

I do not know the answer to your question, but some people share their experiences and thoughts on a related question in the following thread: Freezing batch norm
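
For reference, a pattern that often comes up in those discussions is to keep the BatchNorm layers frozen while fine-tuning: put them in eval mode so the stored running statistics are used and not updated, and stop gradients on their affine parameters. Below is a minimal sketch of that idea, assuming a torchvision ResNet-18 as the pretrained model; the `set_bn_eval` helper name is just for illustration, and whether freezing BN actually helps depends on your data and batch size.

```python
import torch.nn as nn
from torchvision import models

# Hypothetical setup: fine-tune a pretrained ResNet-18 while keeping its
# BatchNorm layers frozen (fixed running stats, affine params not trained).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

for module in model.modules():
    if isinstance(module, nn.BatchNorm2d):
        module.eval()                      # use stored running mean/var, do not update them
        for p in module.parameters():      # also freeze the learnable gamma/beta
            p.requires_grad = False

# model.train() resets every submodule to training mode, so the BN layers
# need to be switched back to eval mode after each call to it.
def set_bn_eval(m: nn.Module) -> None:
    if isinstance(m, nn.BatchNorm2d):
        m.eval()

model.train()
model.apply(set_bn_eval)
```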