Fastai Model Conversion to ONNX, TF, CoreML, Caffe2, TFLite

Here I would like to share a simple notebook as a walkthrough for model conversion.

Some notes:

  • TF to TFLite conversion is not very mature when coming from PyTorch, since some operations can’t be expressed as native TF ops, and TFLite only supports the NHWC data format while PyTorch uses NCHW.
  • A fix is to add a permute() at the beginning of your model that converts NHWC input into the NCHW layout the actual PyTorch model expects.
  • Other than TFLite, the ONNX, TF, CoreML, and Caffe2 conversions seem to work fine.
  • Swift would probably make all the above frameworks unnecessary since it runs natively on C/C++ (we need to wait for the Swift lessons :slight_smile: ), but I thought it would be nice to share in case some people have a use case.
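
The permute fix from the notes above can be sketched as a small wrapper module. This is a minimal sketch: `NHWCWrapper` and the tiny stand-in `Conv2d` backbone are illustrative names, not from the notebook — in practice you would wrap your trained `learner.model`.

```python
import torch
import torch.nn as nn

class NHWCWrapper(nn.Module):
    """Wraps an NCHW PyTorch model so it accepts NHWC input (what TFLite expects)."""

    def __init__(self, model: nn.Module):
        super().__init__()
        self.model = model

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # NHWC (batch, height, width, channels) -> NCHW (batch, channels, height, width)
        x = x.permute(0, 3, 1, 2)
        return self.model(x)

# Tiny stand-in model for illustration; replace with learner.model in practice.
backbone = nn.Conv2d(3, 8, kernel_size=3, padding=1)
wrapped = NHWCWrapper(backbone)

out = wrapped(torch.randn(1, 224, 224, 3))  # NHWC input
print(out.shape)  # NCHW output: torch.Size([1, 8, 224, 224])
```

The wrapped model is then what you export, so the converted TFLite graph takes NHWC tensors directly.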

Notebook showing fastai model conversion to all of the above formats

Here are the needed dependencies:
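
The exact pinned versions live in the notebook; as a rough guide, a typical install for this pipeline (an assumption on my part, not the notebook's exact list) looks like:

```shell
# Assumed package set for the PyTorch -> ONNX -> TF / CoreML / Caffe2 / TFLite pipeline;
# check the notebook for the exact versions used.
pip install fastai onnx onnx-tf coremltools tensorflow
```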

Hope this helps someone!


Thanks, @kcturgutlu, this is very good work!

I recently purchased a USB Coral Edge TPU and was looking into ways to convert my fastai models to .tflite, so this is perfect timing. There are some restrictions as to what kind of architectures your model can use (source), and that might be the reason you have some problems with the conversion. I’ll test with Inception V4 or MobileNet and let you know how that goes.



Hope it will be helpful! In my experience, ONNX to TF conversion is a bit stressful since the converter is still experimental. So good luck!

@mnm403, would you please provide a notebook for converting a fastai model to TFLite?