Deploying a fastai model as a REST API with BentoML

Hi guys,

I am really excited to share with you our open source project BentoML, now with fastai support!

BentoML is an open source framework for creating, shipping, and running ML services.

Just as fastai makes deep learning easier to use, BentoML makes ML service deployment easier for data scientists: from a model in a Jupyter notebook to an ML service in production in five minutes.

You can try out the fastai example notebook on Google Colab.
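To give a rough idea of what a service definition looks like, here is a minimal sketch wrapping a fastai Learner with BentoML’s 0.x API (the decorator, artifact, and handler names follow the docs from around that version and may differ slightly in yours):

```python
# Minimal sketch of a BentoML (0.x API) service for a fastai Learner.
# Names here follow the 0.x docs and may differ in newer releases.
from bentoml import BentoService, api, artifacts, env
from bentoml.artifact import FastaiModelArtifact
from bentoml.handlers import ImageHandler

@env(pip_dependencies=['fastai'])
@artifacts([FastaiModelArtifact('learner')])
class PetClassificationService(BentoService):
    @api(ImageHandler)
    def predict(self, image):
        # self.artifacts.learner is the trained fastai Learner packed below
        result = self.artifacts.learner.predict(image)
        return str(result)

# Pack a trained Learner (`learn`) and save a versioned, deployable bundle:
# service = PetClassificationService()
# service.pack('learner', learn)
# service.save()
```

The saved bundle can then be served as a REST API with the `bentoml serve` CLI.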

Let me know what you guys think. I’d love to get your feedback!

Cheers

Bo


I used it to bring a model to production and it worked well!


That’s awesome that it works well for you; I am very happy it helped. Do you mind sharing where you deployed it? Is it your own server setup (like EC2 or Kubernetes) or a vendor service like SageMaker?

We are about to do a major release that should speed up the service 10x-30x and add a nice UI for management.

Feel free to reach out if you have any questions. You can ping me here or find me in the BentoML Slack group; we are pretty active there.

Any plans to build in a fastai2 handler? Great work by the way.


@danjjohns
Thank you for the encouragement!

Fastai 2 has been supported since BentoML 0.7.6 (https://github.com/bentoml/BentoML/blob/master/bentoml/artifact/fastai2_model_artifact.py). I will put up an example notebook for it in our gallery (GitHub.com/bentoml/gallery) soon.
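In case it helps before the notebook is up, here is a rough sketch of using it; the Fastai2ModelArtifact class name is my reading of the linked source file, so double-check against it:

```python
# Rough sketch of a fastai2-based BentoML service; the artifact class
# name is assumed from the linked file, other names mirror the 0.x API.
from bentoml import BentoService, api, artifacts, env
from bentoml.artifact import Fastai2ModelArtifact
from bentoml.handlers import ImageHandler

@env(pip_dependencies=['fastai'])
@artifacts([Fastai2ModelArtifact('learner')])
class Fastai2ImageService(BentoService):
    @api(ImageHandler)
    def predict(self, image):
        # delegate to the packed fastai2 Learner
        return str(self.artifacts.learner.predict(image))
```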


Okay, nice, looking forward to the example notebook. I might deploy a new fastai2 model with BentoML again.

Hi, I really appreciate your work and went to check the GitHub docs, but I wanted to see the Colab notebook … however, I can’t open the link.

Hi Shay,

The link is broken; I will fix it. Meanwhile, you can browse the example notebooks on GitHub.
You can find the fastai 2 example notebook at https://github.com/bentoml/gallery/blob/master/fast-ai/fastai2_medical/medical_imaging.ipynb


The link is not found, @yubozhao.

You may want to check out Gradio + Hugging Face Spaces for deploying your model publicly.
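For example, a minimal Gradio app for an exported fastai learner could look roughly like this (the export path, class labels, and interface settings are just placeholders):

```python
# Illustrative minimal Gradio app for an exported fastai learner;
# "export.pkl" and the interface components are placeholders.
import gradio as gr
from fastai.vision.all import load_learner, PILImage

learn = load_learner("export.pkl")  # model exported with learn.export()

def predict(img):
    # Gradio passes the uploaded image as a numpy array
    pred, _, probs = learn.predict(PILImage.create(img))
    return {str(c): float(p) for c, p in zip(learn.dls.vocab, probs)}

gr.Interface(fn=predict, inputs=gr.Image(), outputs=gr.Label()).launch()
```

Pushed to a Hugging Face Space as app.py (with fastai and gradio in requirements.txt), that is usually enough to get a public demo URL.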

Hello, thanks for pointing it out. We are in the process of releasing a new version; you can find the information about fastai here: fast.ai - BentoML