I am really excited to share with you our open source project BentoML, now with fastai support!
BentoML is an open-source framework for creating, shipping, and running ML services.
Just like fastai makes deep learning easier to use, BentoML makes ML service deployment easier for data scientists: from a model in a Jupyter notebook to an ML service in production in 5 minutes.
You can try out the fastai example notebook on Google Colab.
Let me know what you think. I'd love to get your feedback!
That's awesome that it works well for you. I'm very happy it helped. Do you mind sharing where you deployed it? Is it your own server setup (like EC2 or Kubernetes) or a vendor service like SageMaker?
We are about to do a major release that should speed up the service's performance 10x-30x and add a nice UI for management.
Feel free to reach out if you have any questions. You can ping me here or find me in the BentoML Slack group; we are pretty active there.
Hello, thanks for pointing it out. We are in the process of releasing a new version, and you can find the information about fastai here: fast.ai - BentoML