Fast.ai to pytorch

We are trying to use fast.ai in production. Because of its many dependencies, we want to convert a Learner object into a plain PyTorch model. The Learner object has a `model` attribute, so normally this should be quite straightforward.

import torch
from torchvision import transforms

test_transforms = transforms.Compose([
    transforms.Resize(224),
    transforms.ToTensor(),
])

def predict_image(image):
    image_tensor = test_transforms(image).float()
    image_tensor = image_tensor.unsqueeze_(0)
    # torch.autograd.Variable is deprecated; plain tensors work directly
    input = image_tensor.to('cuda')
    output = learn.model(input)
    return output

However, the output is not the same as the output of learn.predict(). We are using the DynamicUnet model from Lesson 3. Thanks a lot!

Try normalizing the input with imagenet_stats (if you used them during training) before passing it through the model. Also, call learn.model.eval() to put dropout and batchnorm layers into inference mode, and wrap the forward pass in `with torch.no_grad():` to disable gradient tracking and reduce computation during testing.


Thanks a lot! What is the best way to serialize learn.model so that it is completely independent of fast.ai? One way or another, torch.save(learn.model, "pytorch_model", pickle_module=dill) still depends on fast.ai.
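One common way to break that dependency (a sketch, not fastai-specific advice) is to save only the weights with state_dict and rebuild the architecture in plain PyTorch, so unpickling never needs fastai classes:

```python
import torch
import torch.nn as nn

# stand-in architecture; in practice you would re-create the
# DynamicUnet topology in plain PyTorch before loading the weights
def build_model():
    return nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())

model = build_model()
torch.save(model.state_dict(), 'pytorch_model.pth')  # weights only, no class pickling

# later, in a fastai-free environment:
restored = build_model()
restored.load_state_dict(torch.load('pytorch_model.pth'))
restored.eval()
```

Tracing the model with torch.jit.trace is another option worth exploring, since it captures both weights and graph in a format loadable without the original Python classes.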