I finally managed to get fastai up and running on a Raspberry Pi Zero - it's not fast, but it seems to work.
I gave up trying to install the spaCy dependency - as best I understand, it doesn't really compile on ARM. That dependency is only needed for the fastai text package, though, so vision should work without it.
If you increase the swap size on the Raspberry Pi, compiling PyTorch on the device itself does work, but the compilation takes many hours (I gave up after ~10 hours).
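For reference, on Raspbian the swap size is managed by the dphys-swapfile service; a minimal sketch of raising it (the 2048 MB value is just an example, not the value I used):

```sh
# Set CONF_SWAPSIZE in /etc/dphys-swapfile (2048 MB is an example value)
sudo sed -i 's/^CONF_SWAPSIZE=.*/CONF_SWAPSIZE=2048/' /etc/dphys-swapfile
# Re-initialize and re-enable swap with the new size
sudo /etc/init.d/dphys-swapfile restart
```

Remember that heavy swapping to an SD card is slow and wears the card, which is part of why the on-device build takes so long.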
In the end I cross-compiled as much as possible on a desktop, and built PyTorch with these flags:
NO_CUDA=1
NO_DISTRIBUTED=1
NO_MKLDNN=1
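These flags are environment variables read by PyTorch's setup script; a sketch of the build invocation (the checkout location and the bdist_wheel step are assumptions, not my exact command line):

```sh
cd pytorch                                # assumed checkout of github.com/pytorch/pytorch
export NO_CUDA=1 NO_DISTRIBUTED=1 NO_MKLDNN=1
python3 setup.py bdist_wheel              # the wheel ends up in dist/
```

Disabling CUDA, distributed support, and MKL-DNN makes sense here: the Pi Zero has no NVIDIA GPU, no use for multi-node training, and an ARMv6 CPU that MKL-DNN doesn't target.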
If you want to try it yourself, get my pre-built Python wheels here:
Make sure you have Python 3.6 on the Raspberry Pi first; I used Berryconda: https://github.com/jjhelmus/berryconda
Then install the wheels:
pip install numpy-1.17.2-cp36-cp36m-linux_armv6l.whl
pip install numexpr-2.7.0-cp36-cp36m-linux_armv6l.whl
pip install Bottleneck-1.2.1-cp36-cp36m-linux_armv6l.whl
pip install Pillow-6.1.0-cp36-cp36m-linux_armv6l.whl
pip install torch-1.2.0a0+8554416-cp36-cp36m-linux_armv6l.whl
pip install torchvision-0.4.0a0+d31eafa-cp36-cp36m-linux_armv6l.whl
And finally:
pip install fastai --no-deps
Inference with a resnet34 model takes around 52 s per image on a Raspberry Pi Zero - not exactly fast, but enough for my use case.