Here I would like to share a simple notebook as a walkthrough for model conversion.
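For anyone who wants the gist before opening the notebook, here is a rough sketch of the chain it follows (PyTorch → ONNX → TF → TFLite). The tiny placeholder model, the file names, and the assumption that onnx-tf writes a SavedModel directory are mine, not from the notebook; the exact APIs vary with your onnx-tf and TF versions.

```python
import torch
import torch.nn as nn
import onnx
from onnx_tf.backend import prepare
import tensorflow as tf

# Placeholder model; substitute your trained PyTorch/fastai model here.
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU())
model.eval()
dummy_input = torch.randn(1, 3, 224, 224)  # NCHW, as PyTorch expects

# 1. PyTorch -> ONNX
torch.onnx.export(model, dummy_input, "model.onnx")

# 2. ONNX -> TensorFlow (via the onnx-tf package)
tf_rep = prepare(onnx.load("model.onnx"))
tf_rep.export_graph("model_tf")  # newer onnx-tf versions write a SavedModel dir

# 3. TensorFlow -> TFLite
converter = tf.lite.TFLiteConverter.from_saved_model("model_tf")
tflite_model = converter.convert()
open("model.tflite", "wb").write(tflite_model)
```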
Some notes:
The TF → TFLite step is not very mature when starting from PyTorch: sometimes operations can’t be expressed as native TF ops, and TFLite only supports the NHWC data format (PyTorch uses NCHW).
The fix is to just add a permute() at the beginning of your model that converts the NHWC input to the NCHW layout the actual PyTorch model expects (see the sketch after these notes).
Other than TFLite, the ONNX, TF, CoreML, and Caffe2 conversions seem to work fine.
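As a concrete illustration of the permute() fix, here is a minimal sketch: wrap your trained model in a small module that accepts NHWC input and permutes it to NCHW before the forward pass, then export the wrapper instead of the raw model. The wrapper class, names, and placeholder model below are hypothetical, not from the notebook.

```python
import torch
import torch.nn as nn

class NHWCWrapper(nn.Module):
    """Accepts NHWC input (the layout TFLite expects) and permutes it to
    NCHW before calling the wrapped PyTorch model."""
    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, x):          # x: (N, H, W, C)
        x = x.permute(0, 3, 1, 2)  # -> (N, C, H, W), the layout PyTorch uses
        return self.model(x)

# Placeholder model; substitute your trained PyTorch/fastai model here.
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU())
wrapped = NHWCWrapper(model).eval()

# Export the wrapper with an NHWC dummy input instead of the raw model.
dummy_nhwc = torch.randn(1, 224, 224, 3)
torch.onnx.export(wrapped, dummy_nhwc, "model_nhwc.onnx")
```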
Swift will probably make all of the above frameworks redundant since it runs on C/C++ (we’ll need to wait for the Swift lessons), but I thought it would be nice to share this in case some people have a use case.
I recently purchased a USB Coral Edge TPU and was looking into ways to convert my fastai models to .tflite, so this is perfect timing. There are some restrictions on what kinds of architectures your model can use (source), and that might be the reason you’re having some problems with the conversion. I’ll test with Inception V4 or MobileNet and let you know how that goes.