I am trying to deploy a fastai-trained resnet34 model to production by exporting it as a PyTorch model and later loading it as a PyTorch model in the production environment. It was suggested that I use

torch.save(learn.model.state_dict(), params_path)

instead of

torch.save(learn.model, path)

to export the model.
When I tried to load it with the following snippet:

model = models.resnet34()
model.load_state_dict(torch.load(params_path))

it failed due to a mismatch in keys, throwing the following error:

RuntimeError: Error(s) in loading state_dict for ResNet: Missing key(s) in state_dict: Unexpected key(s) in state_dict:
Also, the model I trained is not exactly resnet34, since fastai replaces the output layer (head) with one that suits the problem at hand.
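The mismatch can be reproduced in miniature without fastai at all. The sketch below (the layer names and sizes are made up for illustration, not taken from the actual models) shows that two modules whose submodules are named differently cannot exchange state_dicts:

```python
import torch
import torch.nn as nn

# Stand-in for the stock torchvision resnet34 (final layer named "fc")
stock = nn.Sequential()
stock.add_module("fc", nn.Linear(512, 1000))

# Stand-in for a fastai model whose head was replaced (different name/size)
custom = nn.Sequential()
custom.add_module("head", nn.Linear(512, 2))

try:
    # strict=True (the default) requires the key sets to match exactly
    stock.load_state_dict(custom.state_dict())
except RuntimeError as e:
    print("load failed:", e)  # Missing key(s): fc.* / Unexpected key(s): head.*
```

The same thing happens with the real models: the fastai head's parameter names and shapes differ from the stock resnet34 `fc` layer, so `load_state_dict` refuses the weights.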
For these reasons, I decided to export both the model architecture and the parameters from the fastai Learner object:
model = learn.model
torch.save(model, 'fastai-model-arch.pth')  # model architecture
torch.save(model.state_dict(), 'Stage-Resnet34-torch.pth')  # model weights
Loading in the production environment:
model = torch.load('fastai-model-arch.pth')  # load model architecture
model.load_state_dict(torch.load('Stage-Resnet34-torch.pth'))  # load model parameters
This returns <All keys matched successfully>.
My question: is it fine to export the model the way I did, or does this method have any drawbacks? I am asking because I don't see anyone suggesting this solution in the forums.
Looking forward to a positive response.