Using a saved model for reuse

Suppose I'm using a saved model for prediction. What is the correct way to create the learn object? How do I initialize it?

tfmModel = tfms_from_model(resnet34, imSize, aug_tfms=transforms_side_on, max_zoom=1.1)
data = ImageClassifierData.from_paths(PATH, tfms=tfmModel, bs=batchSize)
learn = ConvLearner.pretrained(resnet34, data)

I am doing it like the above, but if I have to take my saved model somewhere else and make predictions with it, do I need to take the training data along too? That is what I gather from

ImageClassifierData.from_paths(PATH, ...)

What I am trying to do is export the model I built in a format that I can use somewhere else, the same way we use ResNet34 and similar ImageNet models to train our own networks.
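Roughly, this is the workflow I have in mind (the names here are made up, just to illustrate what I'm after):

import torch

# somewhere else, with no training data around (hypothetical sketch)
model = build_same_architecture()                   # rebuild the exact architecture I trained
model.load_state_dict(torch.load('my_weights.h5'))  # load only the exported weights
model.eval()                                        # switch to inference mode
preds = model(new_image_batch)                      # make predictions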


If you take a look at save() in learner.py:

def save(self, name):
    save_model(self.model, self.get_model_path(name))
    if hasattr(self, 'swa_model'): save_model(self.swa_model, self.get_model_path(name)[:-3]+'-swa.h5')
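So learn.save() only writes the model's weights (its state_dict) to disk, never the data. For example, assuming the default models directory under your data PATH:

learn.save('resnet34_trained')
# writes {PATH}/models/resnet34_trained.h5, containing just the weights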

and then save_model() / load_model() from torch_imports.py:

def save_model(m, p): torch.save(m.state_dict(), p)
def load_model(m, p): m.load_state_dict(torch.load(p, map_location=lambda storage, loc: storage))
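These are thin wrappers around plain PyTorch state_dict serialization, so the .h5 file can be loaded anywhere PyTorch is installed. A minimal sketch, assuming m is an nn.Module with the same architecture the weights were saved from:

import torch

# saving: exactly what save_model() above does
torch.save(m.state_dict(), 'weights.h5')

# loading elsewhere: exactly what load_model() above does (maps GPU tensors to CPU if needed)
state = torch.load('weights.h5', map_location=lambda storage, loc: storage)
m.load_state_dict(state)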

You should be able to load the state directly into the model without building a Learner first (I haven't tried it yet, but I don't see why it wouldn't work):

from fastai.torch_imports import load_model

model = resnet34()  # instantiate the architecture first; it must match the saved weights
load_model(model, path_to_your_saved_model)

or something along those lines
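One caveat I'd add (speculating a bit, since it depends on how the model was saved): if the weights came from a ConvLearner.pretrained() learner, the state_dict covers the ResNet backbone plus the custom head fastai attaches, so the module you load them into has to have that same structure; I'd expect a bare resnet34() to complain about missing/unexpected keys. A quick way to sanity-check what the file expects before loading:

import torch

state = torch.load(path_to_your_saved_model, map_location=lambda storage, loc: storage)
print(list(state.keys())[:10])  # inspect the parameter names the weights were saved under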
