AIoT - FastAI on Edge computing devices?

Over the past few years, several edge (AIoT) devices have been released that support various AI frameworks, but unfortunately I have not found any that support FastAI.

Does anyone know if there is any way of running inference with a FastAI model on an edge device?

I know PyTorch released a mobile version, but that only targets Android and iOS - so not true AIoT devices. Whereas TF Lite, and I think Caffe and YOLO, will run on these low-power embedded devices.
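One possible workaround, since a FastAI `Learner` wraps an ordinary PyTorch `nn.Module` in `learn.model`: export that module via the standard PyTorch paths (TorchScript or ONNX) and run it on the device without Python or FastAI installed. A minimal sketch below - `TinyNet` is a hypothetical stand-in for your trained `learn.model`, not a real FastAI architecture:

```python
# Sketch: exporting the PyTorch module inside a FastAI Learner for edge use.
# Assumption: TinyNet stands in for a trained `learn.model`; swap in yours.
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        return self.fc(x)

model = TinyNet().eval()          # in practice: model = learn.model.eval()
example = torch.randn(1, 4)       # dummy input with your model's input shape

# Trace to TorchScript; the saved file can be loaded from C++ (libtorch)
# on an embedded Linux board, with no Python or FastAI dependency.
traced = torch.jit.trace(model, example)
traced.save("model_traced.pt")

# Reload and verify the traced graph matches the eager model.
reloaded = torch.jit.load("model_traced.pt")
assert torch.allclose(model(example), reloaded(example))
```

From the same module you could also try `torch.onnx.export`, which opens up ONNX Runtime or vendor toolchains that some AIoT boards support - though whether a given board's runtime accepts the exported graph is something you'd have to test per device.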

I was rather hoping that FastAI would support this area, as it has huge practical implications.

Thanks
Tim