Why does cnn_learner add BatchNorm1d when doing transfer learning?

Can anyone please explain why the cnn_learner function adds BatchNorm1d layers after the base architecture's last convolutional layer? What is the purpose of adding BatchNorm1d there?
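
For reference, here is a minimal sketch of the head I'm asking about, using fastai's create_head helper (which, as far as I understand, is what cnn_learner uses internally to build the head it attaches after the backbone). This assumes fastai v2; the feature count (512) and class count (10) are just illustrative values:

```python
# Sketch: inspect the head that cnn_learner attaches after the conv backbone.
# Assumes fastai v2 is installed; in v1 the import is `from fastai.vision import create_head`.
from fastai.vision.all import create_head

# 512 = feature channels coming out of the backbone (e.g. a resnet34),
# 10 = number of target classes; both are placeholder values for illustration.
head = create_head(512, 10)
print(head)
# The printout shows a BatchNorm1d right after the AdaptiveConcatPool2d/Flatten
# step, and another BatchNorm1d between the two Linear layers -- these are the
# layers I'm asking about.
```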