How to deploy your model in production on the Android platform

I have made a model using the fastai library, and now I need to deploy it on the Android platform… any tips?

Export it using ONNX and then handle production with the TensorFlow ecosystem.
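As a rough illustration of that export step, here is a minimal sketch assuming a trained fastai Learner called `learn`; a torchvision ResNet stands in for `learn.model` so the snippet runs on its own, and all file and tensor names are just placeholders:

```python
# Sketch: export the trained PyTorch model inside a fastai Learner to ONNX.
# In practice, replace the torchvision stand-in with `learn.model`.
import torch
import torchvision

model = torchvision.models.resnet34(pretrained=True)  # stand-in for learn.model
model.eval()                                           # inference mode before export

dummy_input = torch.randn(1, 3, 224, 224)  # shape must match your training input size
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",          # output file to feed into the TensorFlow tooling
    input_names=["input"],
    output_names=["output"],
    opset_version=11,
)
```

The resulting `model.onnx` file is what you then convert inside the TensorFlow ecosystem (e.g. to TensorFlow Lite for Android).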

Hi,

There are two ways.

  • You can use ONNX to export your model to TensorFlow and import it into TensorFlow.js or TensorFlow Lite, but I haven’t yet done this successfully for ResNet36 and I didn’t have time to look into why.
    You can go through this post: Fastai to browser pipeline, and check whether it works. A rough sketch of the conversion step is shown after this list.

  • You can build a server (even a CPU-only server) and, using JSON communication, upload the input to the server and get the output back. I describe how to do this with the AWS Free Tier on my webpage (https://alexiej.github.io/checker/). A minimal server sketch is also shown below.
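For the first option, this is roughly what the ONNX → TensorFlow Lite path could look like, assuming you already have a `model.onnx` file and the `onnx-tf` package installed; it may run into the same conversion issues I mentioned above, so treat it as a starting point rather than a recipe:

```python
# Sketch: convert an ONNX model to a TensorFlow SavedModel, then to TensorFlow Lite.
import onnx
import tensorflow as tf
from onnx_tf.backend import prepare

onnx_model = onnx.load("model.onnx")        # the file exported from PyTorch/fastai
tf_rep = prepare(onnx_model)                # build a TensorFlow representation
tf_rep.export_graph("model_savedmodel")     # write a SavedModel directory

converter = tf.lite.TFLiteConverter.from_saved_model("model_savedmodel")
tflite_model = converter.convert()
with open("model.tflite", "wb") as f:       # model.tflite can be bundled in the Android app
    f.write(tflite_model)
```

For the second option, a very small JSON inference server could look like the sketch below. This is just an illustration with Flask; the endpoint name, JSON fields, and `export.pkl` path are all assumptions, and it presumes you exported your Learner with `learn.export()`:

```python
# Sketch: CPU-only JSON inference server for a fastai model.
import base64
import io

from flask import Flask, jsonify, request
from fastai.vision.all import PILImage, load_learner

app = Flask(__name__)
learn = load_learner("export.pkl")  # Learner exported with learn.export()

@app.route("/predict", methods=["POST"])
def predict():
    # Expects JSON like {"image": "<base64-encoded image bytes>"}
    data = request.get_json()
    img_bytes = base64.b64decode(data["image"])
    img = PILImage.create(io.BytesIO(img_bytes))
    pred, pred_idx, probs = learn.predict(img)
    return jsonify({"prediction": str(pred), "confidence": float(probs[pred_idx])})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

The Android app then only needs to POST the image as base64 JSON and read the prediction from the response.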
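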


Right now you can do it by using PyTorch Mobile: Deep Learning: Run Pytorch+FastAI trained model on Android by using Pytorch Mobile (also applicable to iOS) | by Mariano | Jul, 2021 | Medium

It’s pretty straightforward, but I wrote the post because it might help someone who, like me, doesn’t have a very deep understanding of the stack.
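For anyone who just wants the general idea of the Python side of that route, a hedged sketch is below: you trace the trained model to TorchScript and save it in a mobile-friendly format that the PyTorch Mobile Android library can load. The torchvision model and file names are placeholders; in practice you would trace `learn.model`, and the post above covers the Android side:

```python
# Sketch: prepare a trained model for PyTorch Mobile (Android/iOS).
import torch
import torchvision
from torch.utils.mobile_optimizer import optimize_for_mobile

model = torchvision.models.resnet34(pretrained=True)  # stand-in for learn.model
model.eval()

example = torch.rand(1, 3, 224, 224)                  # example input for tracing
traced = torch.jit.trace(model, example)              # convert to TorchScript
traced = optimize_for_mobile(traced)                  # apply mobile-specific optimizations
traced._save_for_lite_interpreter("model.ptl")        # bundle this file in the Android app's assets
```

The `.ptl` file is then loaded on-device with the PyTorch Mobile (lite interpreter) library, so no server is needed.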