Batch norm

In the fastai code base right now, the ReLU activation is applied before batchnorm, not after. There isn't a definitive answer on which ordering is correct, but in practice this order seems to work better. It's a while ago, but here is Jeremy's take on it: Questions about batch normalization
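
For illustration only, here is a minimal plain-PyTorch sketch of the two orderings (this is not fastai's actual layer helper, just an assumption of how the blocks would be composed): the first follows the ordering described above (ReLU before batchnorm), the second is the ordering from the original batchnorm paper (batchnorm before the activation).

```python
import torch.nn as nn

# Ordering described above: activation first, then batchnorm.
def conv_act_bn(in_ch, out_ch, ks=3, stride=1):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, ks, stride=stride, padding=ks // 2),
        nn.ReLU(inplace=True),    # activation first ...
        nn.BatchNorm2d(out_ch),   # ... then batchnorm
    )

# Ordering from the original batchnorm paper, for comparison.
def conv_bn_act(in_ch, out_ch, ks=3, stride=1):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, ks, stride=stride, padding=ks // 2),
        nn.BatchNorm2d(out_ch),   # batchnorm first ...
        nn.ReLU(inplace=True),    # ... then activation
    )
```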
