The new Fastai framework requires torch 1.7
coremltools does not work with torch versions above 1.6.
My model was created with the new fastai (Spring 2020) release, and now I cannot convert it to a Core ML model (for use on iOS).
This link describes the problem with coremltools and torch 1.7:
Is it possible to train the network using the latest fastai and then load it with torch 1.6, so that it can be converted to Core ML using coremltools 4.0?
You can save just the model weights using learn.save(). Then you should be able to load them into a PyTorch model that is defined the same way. Just note that you will have to replicate the fastai inference code (transforms, normalization, etc.) in pure PyTorch. If you are only using the model weights, training with the latest version of fastai should work.
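A minimal sketch of that idea, assuming a toy stand-in architecture (your real one must match whatever network fastai built, e.g. the body + head from `cnn_learner`):

```python
import torch
import torch.nn as nn

# Stand-in architecture -- in practice this must match the network fastai
# assembled for you (e.g. the body + head that cnn_learner creates).
def make_model():
    return nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))

# fastai's learn.save('export') writes a .pth checkpoint containing a dict
# with 'model' (the state_dict) and 'opt' (the optimizer state).
trained = make_model()
torch.save({'model': trained.state_dict(), 'opt': None}, 'export.pth')

# In a torch 1.6 environment, load just the weights into plain PyTorch:
checkpoint = torch.load('export.pth', map_location='cpu')
model = make_model()
model.load_state_dict(checkpoint['model'])
model.eval()  # remember to replicate fastai's preprocessing yourself
```

The key point is that only the weights cross the version boundary; the fastai-specific inference pipeline has to be rewritten by hand.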
You should be able to install coremltools==3.4, which lets you convert to Core ML the ‘old-fashioned’ way.
Or you can use the automated solution:
https://course.fast.ai/deployment_seeme_ai, which automatically converts to Core ML, ONNX, and TFLite.
Update (still needs verification): the issue seems to be fixed:
coremltools unified conversion API fails with a traced model from PyTorch v1.7.0; it was fine with PyTorch v1.6.0. (GitHub issue opened 27 Oct 2020, closed 22 Dec 2020.)
Thanks. That’s a third-party service which is going to start charging once their beta is over.
Thanks. I ended up using the head branch of the coremltools repo, which is compatible with PyTorch 1.7:
!pip install -Uqq git+https://github.com/apple/coremltools.git@master
Thanks for the feedback.
I’m aware of the “NB” added to the docs page. I’m the founder of SeeMe.ai, so I can tell you that although there is no official pricing yet, I’m working hard on providing a generous free tier.
Glad that worked for you. I’ll try to test the coremltools 4.0 from pip that I mentioned…