We’re using PyTorch Lightning, and so far it’s a good fit for longer research projects. However, I want to speed up building PoCs (proofs of concept), and I think fastai is better suited for that.
Our deployment infrastructure, however, relies on TorchScript (there is also C++ involved, and we just have to hand this file to the deployment team to get the model running).
I know that fastai’s standpoint is that Julia (or was it Swift?) is better suited for this, instead of restructuring Python to remove the GIL, but that would be too big a project to undertake, and I’m not familiar with Julia/Swift.
In PyTorch Lightning, you can get a TorchScript version of the model in one line of code: `torch.jit.save(model.to_torchscript(), "model.pt")`
What would I need to do in fastai?
P.S. I searched this forum; only a few topics touched on TorchScript, and none discussed it in detail.
Since fastai models are torch models, you can just export them to TorchScript. Now, we’re not PyTorch Lightning, so base PyTorch doesn’t have the fancy model.to_torchscript() functionality; instead we need to do it ourselves. Below is a full end-to-end example:
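Here is a minimal sketch of what that can look like. The MNIST_SAMPLE dataset, resnet18 backbone, and 28×28 input size are arbitrary choices for illustration, so swap in your own DataLoaders and architecture; depending on your fastai version you may also need the requires_grad_ patch shown further below before tracing works.

```python
import torch
from fastai.vision.all import *

# Train a small learner on any dataset; MNIST_SAMPLE is just a quick stand-in
path = untar_data(URLs.MNIST_SAMPLE)
dls = ImageDataLoaders.from_folder(path)
learn = vision_learner(dls, resnet18, metrics=accuracy)  # cnn_learner in older fastai versions
learn.fine_tune(1)  # one quick epoch, just to have trained weights

# The underlying torch model lives on learn.model; put it in eval mode on the CPU
model = learn.model.eval().cpu()

# Trace with a dummy input that matches the shape the model was trained on
dummy_input = torch.randn(1, 3, 28, 28)
traced = torch.jit.trace(model, dummy_input)
torch.jit.save(traced, "model.pt")

# Sanity check: reload the TorchScript file and run a forward pass
reloaded = torch.jit.load("model.pt")
with torch.no_grad():
    preds = reloaded(dummy_input)
print(preds.shape)  # (1, 2) for the two MNIST_SAMPLE classes
```

The saved model.pt no longer depends on fastai at inference time, so the deployment team can load it from C++ with torch::jit::load.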
Finally, at the time of writing, fastai has made some modifications to PyTorch’s Tensor class so that it works better with tensor subclasses. As a result, fastai has its own version of requires_grad_, and for it to work with jit it needs to accept a requires_grad param. Below is the monkey-patch to get there:
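A sketch of that kind of patch, assuming fastai’s TensorBase subclass from fastai.torch_core (the exact code may differ between fastai versions, so treat this as illustrative):

```python
from fastai.torch_core import TensorBase

def requires_grad_(self, requires_grad=True):
    # Mirror torch.Tensor.requires_grad_: accept the flag that jit passes and return self
    self.requires_grad = requires_grad
    return self

# Monkey-patch fastai's tensor subclass so torch.jit can call requires_grad_ with an argument
TensorBase.requires_grad_ = requires_grad_
```

Run this patch before calling torch.jit.trace (or torch.jit.script) in the example above.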