PyTorch at the Edge: Deploying Over 964 TIMM Models on Android with TorchScript and Flutter

Hello Fastai friends, I wrote a blog post about how to pick a model from TIMM, train it with Fastai, and deploy it on Android, for free.

Here’s the TLDR on Twitter -

Here’s the full blog post -

And the GitHub repo -


In this blog post:

you export the model with a .pkl extension,
then a few lines later you use a .pt extension :slight_smile:
For example: how do I convert my exported .pkl model to TorchScript, i.e. the .pt extension?

Hello @bahman_apl I used the following code to export to TorchScript. As you can see, the model is taken from the Learner with learn.model

import torch
from torch.utils.mobile_optimizer import optimize_for_mobile

learn.model.eval()  # put the model in eval mode before tracing
example = torch.rand(1, 3, 224, 224)  # dummy input with the training image size
traced_script_module = torch.jit.trace(learn.model, example)
optimized_traced_model = optimize_for_mobile(traced_script_module)
optimized_traced_model.save("model.pt")  # this writes the .pt file

You can get the model from the .pkl file by using the load_learner function.

learner = load_learner('export.pkl')

The load_learner function automatically loads the exported model, data, and other necessary information to create the Learner object. Once loaded, you can use it to export models, make predictions, fine-tune the model, and perform other tasks.
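Putting the two replies together, the whole .pkl → .pt conversion might look like this. This is a minimal sketch: the tiny nn.Sequential stand-in and the filename model.pt are my own placeholders, since in practice the model would come from load_learner('export.pkl').model.

```python
import torch
from torch import nn
from torch.utils.mobile_optimizer import optimize_for_mobile

# In practice the model comes from fastai:
#   learner = load_learner('export.pkl')
#   model = learner.model
# Here a tiny stand-in network is used so the sketch runs without the .pkl file.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 2),
)
model.eval()  # freeze batchnorm/dropout behaviour before tracing

example = torch.rand(1, 3, 224, 224)     # dummy input with the training image size
traced = torch.jit.trace(model, example)  # record the forward pass as TorchScript
optimized = optimize_for_mobile(traced)   # fuse/optimize ops for mobile inference
optimized.save("model.pt")                # this .pt file is what ships in the app
```

The saved file can be loaded back anywhere with torch.jit.load("model.pt") and called like a normal module for predictions.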


For me, first I should load the model

learner = load_learner('export.pkl')

then run the code above through to the end, and in the last line name my model whatever I like and use it for predictions?

Yes that’s right :+1:
