Kubeflow fastai library Setup


Kubeflow seems like a great way to develop, serve, and maintain ML models.

I was wondering if anyone could share their experience/best practices for setting up Kubeflow with the fastai library. Is there a Docker image available? Is it simple to use?


I use fastai with Kubeflow and Docker. You have to build your own container. Works like a charm.

Thanks @ptrampert, could you share a Dockerfile as a reference? I tried using the official Kaggle Dockerfile, which includes many libraries including fastai, but it wouldn't load correctly.

Hello @zlapp, here is the Dockerfile I use to deploy fastai apps. Maybe you can use it as a starting point.

To use GPUs you will need nvidia-docker2; otherwise, you can launch without the NVIDIA runtime and use the CPU for inference.
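Since the linked Dockerfile may not render for everyone, here is a minimal sketch of what such an image could look like. This is not the poster's actual file; the base image tag, package versions, and the `serve.py` entrypoint are all hypothetical placeholders:

```dockerfile
# Hypothetical example, not the poster's actual Dockerfile.
# CUDA base image so the container can use the GPU via nvidia-docker2.
FROM nvidia/cuda:11.8.0-cudnn8-runtime-ubuntu22.04

# Install Python; clean apt caches to keep the image small.
RUN apt-get update && \
    apt-get install -y --no-install-recommends python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*

# Install PyTorch and fastai (pin versions in practice).
RUN pip3 install --no-cache-dir torch fastai

WORKDIR /app
COPY . /app

# serve.py is a placeholder for whatever inference script you deploy.
CMD ["python3", "serve.py"]
```

With nvidia-docker2 installed you would run it with the NVIDIA runtime (e.g. `docker run --runtime=nvidia myimage`); without that flag the same container falls back to CPU inference, as described above.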


Try here: