How to load PyTorch .pth weights?

I am using deep fashion dataset pre-trained weights with resnet 50 for attribute prediction.

But I am not sure how to load it in fastai. I am a newbie to fastai and learning right now. Any type of leads/ notebooks to refer will be deeply appreciated. Thank you.


`learn.load('deep_fashion_model')` should get you there (note that `load` expects the filename without the `.pth` extension), assuming your `learn.model` has the same architecture 🙂

It looks like they use ResNet or VGG, so `cnn_learner` will let you create a learner with a model of the correct architecture to load your pre-trained weights into.


Hi, I tried that with a learner and the VGG-16 pretrained model (`latest.pth`), but got these errors:

learn = cnn_learner(data, models.vgg16_bn, metrics=accuracy)
learn = learn.load('/content/latest')

Error(s) in loading state_dict for Sequential:
Missing key(s) in state_dict: "0.0.0.weight", "0.0.0.bias", "0.0.1.weight", "0.0.1.bias", "0.0.1.running_mean", "0.0.1.running_var", "0.0.3.weight", "0.0.3.bias", "0.0.4.weight", "0.0.4.bias", "0.0.4.running_mean", "0.0.4.running_var", "0.0.7.weight", "0.0.7.bias", "0.0.8.weight", "0.0.8.bias", "0.0.8.running_mean", "0.0.8.running_var", "0.0.10.weight", "0.0.10.bias", "0.0.11.weight", "0.0.11.bias", "0.0.11.running_mean", "0.0.11.running_var", "0.0.14.weight", "0.0.14.bias", "0.0.15.weight", "0.0.15.bias", "0.0.15.running_mean", "0.0.15.running_var", "0.0.17.weight", "0.0.17.bias", "0.0.18.weight", "0.0.18.bias", "0.0.18.running_mean", "0.0.18.running_var", "0.0.20.weight", "0.0.20.bias", "0.0.21.weight", "0.0.21.bias", "0.0.21.running_mean", "0.0.21.running_var", "0.0.24.weight", "0.0.24.bias", "0.0.25.weight", "0.0.25.bias", "0.0.25.running_mean", "0.0.25.running_var", "0.0.27.weight", "0.0.27.bias", "0.0.28.weight", "0.0.28.bias", "0.0.28.running_mean", "0.0.28.running_var", "0.0.30.weight", "0.0.30.bias", "0.0.31.weight", "0.0.31.bias", "0.0.31.running_mean", "0.0.31.running_var", "0.0.34.weight", "0.0.34.bias", "0.0.35.weight", "0.0.35.bias", "0.0.35.running_mean", "0.0.35.running_var", "0.0.37.weight", "0.0.37.bias", "0.0.38.weight", "0.0.38.bias", "0.0.38.running_mean", "0.0.38.running_var", "0.0.40.weight", "0.0.40.bias", "0.0.41.weight", "0.0.41.bias", "0.0.41.running_mean", "0.0.41.running_var", "1.2.weight", "1.2.bias", "1.2.running_mean", "1.2.running_var", "1…
Unexpected key(s) in state_dict: "meta", "state_dict", "optimizer".
Is there any other way to proceed with this?
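For what it's worth, the "Unexpected key(s): meta, state_dict, optimizer" part suggests this checkpoint is a nested dict wrapping the weights, not a bare state_dict. A minimal sketch of the unwrapping step, using a plain dict to stand in for what `torch.load('latest.pth', map_location='cpu')` would return (the keys and toy values here are illustrative, not from the real file):

```python
# Plain-dict stand-in for the loaded checkpoint: the error above shows the
# top-level keys are "meta", "state_dict", and "optimizer", not the raw
# parameter names that load_state_dict expects.
checkpoint = {
    "meta": {"epoch": 12},                                   # illustrative
    "state_dict": {"0.0.0.weight": [0.1, 0.2],               # toy "tensors"
                   "0.0.0.bias": [0.0]},
    "optimizer": {"lr": 1e-3},                               # illustrative
}

# Key into "state_dict" to get the actual name -> parameter mapping; this
# inner dict is what you would pass to model.load_state_dict(...).
weights = checkpoint["state_dict"]
print(sorted(weights.keys()))
```

With a real file, the same idea is `torch.load(path, map_location='cpu')["state_dict"]` before handing the result to the model.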


This could be from how the weights were saved. I.e. fastai expects a fastai checkpoint, which has `opt` and `model` keys. Check out here where I show a snippet of code (look for `def transfer_learn`). And while it's written for v2, it works in v1 as well (a few things need to be changed for v1, specifically around the DataLoaders).

Hi, I tried your ipynb and ran into the error below:
learn = transfer_learn(learn, '/home/jupyter/Deepfashion/fastai_set/models/latest')

KeyError Traceback (most recent call last)
----> 1 learn = transfer_learn(learn, '/home/jupyter/Deepfashion/fastai_set/models/latest')

in transfer_learn(learn, name, device)
5 if (learn.model_dir/name).with_suffix('.pth').exists(): model_path = (learn.model_dir/name).with_suffix('.pth')
6 else: model_path = name
----> 7 new_state_dict = torch.load(model_path, map_location=device)['model']
8 learn_state_dict = learn.model.state_dict()
9 for name, param in learn_state_dict.items():

KeyError: 'model'


So given how it looks here (sorry, I only just saw how this checkpoint was saved), you should key into the `state_dict` of that checkpoint to grab the proper keys. If you're coming from the same type of VGG it should copy over without many headaches (if you get about 5 mismatching keys, disregard them; that's just the custom head fastai adds).
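The copy-what-matches idea can be sketched with plain dicts standing in for state_dicts (real state_dicts map parameter names to tensors, and `torch.load` / `load_state_dict` handle the actual I/O; the key names and helper below are illustrative):

```python
def copy_matching(learn_state, pretrained_state):
    """Copy values from pretrained_state into learn_state wherever the
    parameter name exists in both; return the names that were skipped
    (e.g. the custom fastai head, typically the handful of mismatches)."""
    skipped = []
    for name in learn_state:
        if name in pretrained_state:
            learn_state[name] = pretrained_state[name]
        else:
            skipped.append(name)
    return skipped

# Toy stand-ins: lists play the role of tensors, and "1.8.weight" plays
# the role of a fastai head parameter absent from the checkpoint.
learn_state = {"0.0.0.weight": [0.0], "1.8.weight": [0.0]}
pretrained = {"0.0.0.weight": [0.5], "classifier.weight": [0.9]}

skipped = copy_matching(learn_state, pretrained)
print(skipped)  # names left untouched; these keep their freshly initialized values
```

In real code you would also check that shapes match before copying, then call `model.load_state_dict(learn_state)` on the result.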