Turn off eval mode for JIT exported model

(Austin) #1

I’ve trained a Fastai model and saved/loaded it as a PyTorch JIT model. I’d like to experiment with Monte Carlo Dropout, so I’m curious about how to turn off eval mode so that my prediction outputs are non-deterministic.

I’m using the resnet18 architecture, which has dropout layers, but when I put the model in train mode like this:
model = model.train()

The predictions I get from calling my model like this are still deterministic:
predict_values = model(img_tensor)

Can anyone spot what I’m missing?
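
For reference, this is roughly the MC Dropout loop I’m hoping to run once the outputs stop being deterministic (n_samples is just an illustrative value):

import torch

n_samples = 30  # number of stochastic forward passes
with torch.no_grad():
    preds = torch.stack([model(img_tensor) for _ in range(n_samples)])

mean_pred = preds.mean(dim=0)   # averaged prediction
uncertainty = preds.std(dim=0)  # spread across the stochastic passes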

Edit: Found my answer in the torch.jit.trace docs: “In the returned ScriptModule, operations that have different behaviors in training and eval modes will always behave as if it is in the mode it was in during tracing, no matter which mode the ScriptModule is in.” So a traced model can’t be switched back to train mode after the fact; the dropout behavior is baked in at trace time.
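
That means the mode has to be set before export. Here’s a minimal sketch of how I’d re-export from the original eager fastai model with dropout left stochastic (learn.model and example_input are placeholders, not from my original code):

import torch
import torch.nn as nn

eager_model = learn.model            # original (non-JIT) model
eager_model.eval()                   # put everything (e.g. batchnorm) in eval mode
for m in eager_model.modules():
    if isinstance(m, nn.Dropout):
        m.train()                    # but flip dropout back to train so it stays stochastic

# check_trace=False because stochastic dropout makes the trace consistency check fail
traced = torch.jit.trace(eager_model, example_input, check_trace=False)
traced.save("mc_dropout_model.pt")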
