Using batch normalization and dropout with the fast.ai library

I have worked through Jeremy’s dogs vs. cats notebook, but I can’t find how to apply dropout and batch normalization when using `learn = ConvLearner.pretrained(arch, data, precompute=True)`.

I have used the corresponding functions in TensorFlow and Keras, but I want to use the fast.ai library for a Kaggle competition.

Thank you very much for the help!

You’ll need to disable precomputation. Then you get dropout and batchnorm for free. Search the forums for details on setting the dropout rate, or watch the lessons where we cover that.
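A minimal sketch of what that might look like, assuming the old fastai (0.7) `ConvLearner` API from the dogs vs. cats lessons. The `PATH`, `sz=224`, `ps=0.5`, and learning-rate/epoch values are illustrative placeholders, not values from this thread; `ps` sets the dropout probability for the fully connected head that fastai adds on top of the pretrained model, and that head already includes batchnorm layers.

```python
from fastai.conv_learner import *

PATH = 'data/dogscats/'  # hypothetical data directory
arch = resnet34
sz = 224

data = ImageClassifierData.from_paths(PATH, tfms=tfms_from_model(arch, sz))

# ps sets the dropout probability in the head fastai builds on top of the
# pretrained body; that head also contains batchnorm layers by default.
learn = ConvLearner.pretrained(arch, data, ps=0.5, precompute=True)

# A few quick epochs with precomputed activations, then turn precompute
# off (as suggested above) so training uses the full head with dropout
# and batchnorm on real activations.
learn.fit(1e-2, 2)
learn.precompute = False
learn.fit(1e-2, 3, cycle_len=1)
```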
