[Lesson 7] Questions around the ResNet MNIST notebook


I was working through the notebook lesson7-resnet-mnist and observed something that I couldn't understand, and I would really appreciate it if someone could guide me through it.

In the first basic CNN-with-batchnorm model that is created, we don't use a ReLU before flattening the output, but in the subsequent refactorings a final ReLU block is added. This confuses me, as I believe a final ReLU before Flatten() is not required - is this understanding correct?
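To illustrate my concern with a hypothetical sketch (plain Python, not the notebook's actual code): if the last conv layer produces the per-class scores that get flattened into logits, a ReLU there zeroes out every negative score, which seems like it would throw away information.

```python
# Hypothetical sketch: effect of a final ReLU applied to per-class
# scores right before Flatten().
def relu(xs):
    # ReLU clamps each value at zero from below.
    return [max(x, 0.0) for x in xs]

# Made-up per-class scores from the last conv layer (4 classes for brevity).
logits = [2.3, -1.7, 0.5, -0.2]

clamped = relu(logits)
print(clamped)  # the two negative scores both become 0.0,
                # so their relative ordering is lost
```

So my intuition is that the final block should stop at batchnorm and let the raw (possibly negative) values flow into the loss - but the refactored versions keep the ReLU, hence my confusion.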

Thank you

bumping this up!