Torchserve Anyone?

Anyone try out TorchServe with a fastai2 model? I’ve managed to get TorchServe working with pretrained models, but not with one from fastai2. I’ve tried with both the plain PyTorch model and a TorchScript export, and I suspect a custom handler is the fix (I’ll update this post with those results). It looks very promising as a way of serving fastai2 models.
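For anyone else experimenting: a custom handler is just a Python class TorchServe loads from the `.mar`, calling `initialize()` once per worker and `handle()` per batch. Here is a minimal sketch, assuming a TorchScript export named `model_traced.pt` and JSON request bodies; the class and file names are hypothetical, and a real image handler would also apply fastai's transforms in `preprocess()`.

```python
import json
import torch

class FastaiHandler:
    """Hypothetical minimal custom handler for TorchServe.

    TorchServe instantiates this class, calls initialize() once with the
    worker context, then handle() for each batch of requests.
    """

    def __init__(self):
        self.model = None
        self.initialized = False

    def initialize(self, context):
        # context.system_properties["model_dir"] is where TorchServe unpacks
        # the .mar; "model_traced.pt" is an assumed archive entry name.
        model_dir = context.system_properties.get("model_dir")
        self.model = torch.jit.load(f"{model_dir}/model_traced.pt", map_location="cpu")
        self.model.eval()
        self.initialized = True

    def preprocess(self, data):
        # Assumption: each request body is a JSON list of floats.
        rows = [json.loads(d.get("body") or d.get("data")) for d in data]
        return torch.tensor(rows, dtype=torch.float32)

    def inference(self, batch):
        with torch.no_grad():
            return self.model(batch)

    def postprocess(self, output):
        # Return one predicted class index per request in the batch.
        return output.argmax(dim=1).tolist()

    def handle(self, data, context):
        return self.postprocess(self.inference(self.preprocess(data)))
```

You would then point `torch-model-archiver --handler` at this file when building the `.mar`.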

https://pytorch.org/serve/


Have you gotten TorchServe to work with fastai v1? Any info you can share would be appreciated. ULMFiT is the model I’m trying to serve with TorchServe.


Not yet, but still trying.


Here is an example of using TorchServe to host a fastai model


I’m trying to include fastai as an option at my company, but we are struggling to deploy a fastai model with TorchServe, which our architecture requires.
The main problem, I think, is in the sequence of saving the weights, loading them into a PyTorch model, and then saving the .mar — that last step fails with:
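That ValueError comes from `torch.jit.trace` refusing to compile modules that have backward hooks attached (fastai registers hooks via its callbacks). One workaround is to strip the hooks from every submodule before tracing. A sketch, using a small `nn.Sequential` as a stand-in for `learn.model` (the hook here is added artificially just to mirror the situation):

```python
import torch
from torch import nn

# Stand-in for learn.model; fastai's callbacks attach hooks like this one,
# which is what triggers the "backward hooks ... can't be compiled" error.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
model[0].register_full_backward_hook(lambda mod, grad_in, grad_out: None)

# Strip every hook from every submodule before tracing.
for module in model.modules():
    module._backward_hooks.clear()
    module._forward_hooks.clear()
    module._forward_pre_hooks.clear()

model.eval()
example = torch.randn(1, 4)
traced = torch.jit.trace(model, example)
traced.save("model_traced.pt")  # this file then goes into the .mar via torch-model-archiver
```

Note that `_backward_hooks` and friends are private attributes of `nn.Module`, so this is a workaround rather than a supported API; clearing them also disables whatever the hooks were doing, which is fine for an inference-only export.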

ValueError: Modules that have backward hooks assigned can't be compiled: TracedModule[Sequential](original_name=Sequential)

Any clues?
