How to deploy a trained fast.ai model in a Python 3.5 environment?


(Alex Tat-Sang Choy) #1

I need to deploy a fast.ai model I trained in our environment, which is Python 3.5, and I can’t upgrade Python without affecting 10 other people’s previous work. So my options now are:

  1. modify fast.ai to work in Python 3.5 (not sure how much work it is)
  2. save fast.ai model as a pytorch model (*.pth):
    Tried, but learner.models.model is a Sequential, which does not have a save method of its own for torch.save.
  3. save fast.ai model (*.h5) and load it in pytorch:
    Seems that I need to reconstruct the model before loading it.
  4. other methods.

Any suggestions would help greatly.
Thanks for reading!


(Alex Tat-Sang Choy) #2

Just a follow-up: I tried option 1 and it turns out the amount of work is small; changing every f'' format string into the older form seems to work. Hope I didn’t miss any other Python 3.6-only features.
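For example (a hypothetical snippet, not taken from the fastai source, using .format() as the older form), the conversion looks like this:

epoch, loss = 3, 0.1234

# Python 3.6+ f-string: a SyntaxError on Python 3.5
msg = f'epoch {epoch}: loss {loss:.4f}'

# equivalent Python 3.5-compatible form
msg = 'epoch {}: loss {:.4f}'.format(epoch, loss)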


#5

I am interested in the 3rd option. Keep us posted.


(Alex Tat-Sang Choy) #6

Follow-up on option 2: it turns out this is really simple:

import torch

model = learn.model
torch.save(model, 'my_model')

That’s it; then it can be loaded in pytorch with

model = torch.load('my_model')

Just remember to use

model.eval()

for inference. (thx @cedric for correction)
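For completeness, a minimal inference sketch (assuming the file saved above and a 1×3×224×224 input, which is an assumption about the model’s expected shape):

import torch

model = torch.load('my_model')
model.eval()  # disable dropout and use running batch-norm statistics
x = torch.randn(1, 3, 224, 224)  # dummy input; replace with your preprocessed image batch
with torch.no_grad():  # gradients are not needed at inference time
    preds = model(x)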
Cheers!


(Cedric Chee) #7

Do you mean “for inference”?


(Alex Tat-Sang Choy) #8

Oops, typo. Thx.


(Alex Tat-Sang Choy) #9

Oh, I forgot to mention that things like image normalization can differ between models, so that needs to be taken care of. Also, data.classes has to be stored and read back. After all this, one appreciates the convenience of fast.ai even more, since it gets these details right for you behind the scenes.
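A rough sketch of what that bookkeeping might look like (the file name is made up, and the mean/std below are the standard ImageNet statistics, which is an assumption; use whatever your model was actually trained with):

import json
from torchvision import transforms

# persist the class ordering next to the weights so predictions can be mapped back to labels
with open('classes.json', 'w') as f:
    json.dump(data.classes, f)

# at inference time, apply the same resize/normalization used during training
tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])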