Loading a saved model in a notebook with no internet

Hi all,
I have the following issue. I created a model by calling:

learn = unet_learner(dls, resnet34, metrics=Dice())

and then I trained it. I saved it via a callback, so now I have a .pth file.
In a completely new session of the notebook, I have to call:

learn = unet_learner(dls, resnet34, metrics=Dice())
learn.load(MODEL_NAME)

so that everything (the data loaders, the model, etc.) is exactly the same.
The problem is that the first line of code downloads ResNet weights from the internet, and the particular Kaggle competition I'm trying to participate in does not allow internet access in the notebook.
Is there a workaround for this pickle? (pun intended :grinning:)

You're not specific, but I presume you created the pickle file using… learn.export()
My understanding is that separating training and inference works like this…

Training notebook:

learn = unet_learner(dls, resnet34, metrics=Dice())
learn.fine_tune(n)
learn.export(fname='export.pkl')

Inference notebook:

learn = load_learner('export.pkl')
pred = learn.predict(data)

Is there anything lacking from that general arrangement for your case?

Note that pickle serialization in Python saves the names of functions, not the code itself. Therefore, any custom code you have for models, data transformations, loss functions, etc. should be put in a module that you import in your training environment before exporting, and import in your inference environment before loading the model.
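For instance, a minimal sketch (my_custom.py and combined_loss are hypothetical names used only for illustration):

# my_custom.py -- a module both notebooks can import, so that pickle
# can resolve the function by name at export and load time
import torch.nn.functional as F

def combined_loss(pred, targ):
    # example custom loss the Learner references
    return F.cross_entropy(pred, targ)

Then run from my_custom import combined_loss in the training notebook before learn.export(), and again in the inference notebook before load_learner(), so the name can be resolved when unpickling.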

If you want an example of using fastai in a Kaggle competition, you may take a look at one of my notebooks here:

It loads the model and weights, and does inference without internet access. But I think @bencoman gave you the right tip, and using the load_learner function should help. Essentially, all I did was copy-paste the required code and use that function there.

Thank you very much, both of you. The export/load_learner combination works like a charm!

I was originally confused because, due to the SaveModelCallback, the parameters were saved in .pth format.
That being said, it is possible to work with a .pth file at inference time, as shown in the notebook [Inference] - FastAI Baseline, but you would have to recreate the architecture manually: the ResNet backbone plus all the rest of the model.
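For reference, a minimal sketch of that .pth route (assuming dls and MODEL_NAME are defined as in the original post, and that the checkpoint was written by SaveModelCallback to models/MODEL_NAME.pth): passing pretrained=False makes fastai build the ResNet backbone without downloading weights, and learn.load() then restores the trained parameters.

learn = unet_learner(dls, resnet34, metrics=Dice(), pretrained=False)  # no download needed
learn.load(MODEL_NAME)  # reads models/MODEL_NAME.pth relative to learn.path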