Hi @jeremy! Since so many people want to do this, I’ve created a simple library that serves a FastAI Learner object and gives a nice web-based UI with a single line of code: https://github.com/aakashns/servefastai
Here’s a demo:
Currently it’s more of a proof of concept for playing around with the model and passing in custom test images, and it only works for single-label image classification, but I’m hoping to make it production-ready by the end of the course.
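Usage looks roughly like this. A minimal sketch: `learn` is assumed to be any trained single-label image classification Learner, and the serve() entry point is assumed from the README (check the repo for the exact call):

```python
# Minimal sketch: assumes servefastai exposes a serve() entry point
# (see the repo README for the actual API).
from servefastai import serve

# `learn` is any trained single-label image-classification Learner
serve(learn)  # launches a web UI where you can upload test images
```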
It would be great to implement something similar as part of a standard pipeline, e.g. inside a Jupyter notebook, though that is probably more difficult. I am thinking of something similar to bokeh, which uses JS to integrate various fancy plots into a notebook.
@aakashns Perhaps we could come up with a notebook extension, or just use the HTML() function from IPython to render right into the notebook? Or use a magic like %show_learner?
Not sure which approach is better. Maybe some CLI tooling as well? Though the fastai team probably already has something in mind.
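To illustrate the HTML() idea, here’s a minimal, fastai-agnostic sketch of rendering arbitrary markup inline via IPython’s display machinery:

```python
# Render arbitrary HTML straight into a notebook output cell.
from IPython.display import HTML, display

display(HTML('<button onclick="alert(\'hello from the notebook\')">Try me</button>'))
```

A %show_learner magic could presumably just wrap the same call.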
Also, I think your solution addresses something a bit different from the original question.
AFAIU, the original question was about how to load a trained fastai model from disk with minimal dependencies, in order to serve it for inference in production pipelines.
Your tool, I think, serves a model/learner that is already available in memory. Needless to say, what you have done is super useful too!
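The disk-based flow I mean looks roughly like this. A sketch only: it assumes a fastai v1 version that has Learner.export() and load_learner() (older versions only had learn.save()/learn.load()), and the paths are placeholders:

```python
# Sketch: load an exported Learner from disk for inference only.
# Assumes a fastai v1 with export()/load_learner(); paths are placeholders.
from fastai.vision import load_learner, open_image

# At training time you'd run: learn.export()  -> writes export.pkl
learn = load_learner('path/to/model_dir')  # reads export.pkl; no training data needed
img = open_image('path/to/test.jpg')
pred_class, pred_idx, probs = learn.predict(img)
print(pred_class)
```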
Also, I started looking at your PyTorch screencasts. They are super awesome: very clear explanations for understanding the PyTorch API. Can’t wait to binge the rest of the videos.
Yo @simonw … do you know of any way to configure things so Zeit spins up the machine faster? I’ve deployed a dockerized Flask app, and the requests usually time out while the server spins up. I’ve also seen that you, and a few others, are using starlette.io, and I was wondering how that compares with Flask (especially w/r/t performance).
Anyway, any tips on making my API spin up quicker, and on choosing a framework to serve the API, would be greatly appreciated.
Zeit’s v2 platform spins up a lot faster, but it comes at a nasty cost: your image must be less than 100MB. If you’re bundling an 85MB model, that’s likely impossible. There’s a discussion about that issue here: https://github.com/zeit/now-cli/issues/1523
The easier option is to tell Zeit to always keep that particular instance running; then you won’t have to pay the startup cost. You can do that using the scale command: https://zeit.co/blog/scale - the free plan lets you keep up to 3 instances running at a time, and you can pay them for more.
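On the Starlette question: a minimal inference endpoint looks something like this. This is a sketch, not a drop-in app; it assumes decorator-style routing, an export.pkl in the working directory, and a multipart field named "file", all of which are placeholders:

```python
# Sketch of a Starlette inference endpoint for an exported fastai model.
# Assumes decorator-style routing, an export.pkl in the working directory,
# and a multipart form field named "file"; all names are placeholders.
from io import BytesIO

import uvicorn
from starlette.applications import Starlette
from starlette.responses import JSONResponse
from fastai.vision import load_learner, open_image

app = Starlette()
learn = load_learner('.')  # loads ./export.pkl

@app.route('/predict', methods=['POST'])
async def predict(request):
    form = await request.form()
    img_bytes = await form['file'].read()
    pred_class, pred_idx, probs = learn.predict(open_image(BytesIO(img_bytes)))
    return JSONResponse({'prediction': str(pred_class)})

if __name__ == '__main__':
    uvicorn.run(app, host='0.0.0.0', port=8000)
```

Starlette is async-native (ASGI), which is where most of the performance difference from Flask shows up under concurrent load.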
Yeah, I’m pretty sure it can be embedded into Jupyter itself using the HTML() function. I haven’t tried it yet, and I don’t know much about Jupyter extensions, though.
Lecture 2 shows a great example of how to build a widget for an IPython notebook. I guess your app could be converted into a widget, then; there are a lot of pre-built HTML controls in the ipywidgets package, as in the sketch below.
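Something like this rough sketch, say. It assumes ipywidgets >= 7.4 (for FileUpload) and a fastai v1 Learner; the helper name and layout are just illustrative:

```python
# Rough sketch of a prediction widget in the spirit of FileDeleter.
# Assumes ipywidgets >= 7.4 (FileUpload) and a fastai v1 Learner.
from io import BytesIO

import ipywidgets as widgets
from IPython.display import display
from fastai.vision import open_image

def show_prediction_widget(learn):
    uploader = widgets.FileUpload(accept='image/*', multiple=False)
    output = widgets.Output()

    def on_upload(change):
        with output:
            output.clear_output()
            upload = next(iter(uploader.value.values()))  # ipywidgets 7 dict format
            img = open_image(BytesIO(upload['content']))
            pred_class, _, _ = learn.predict(img)
            print(f'Prediction: {pred_class}')

    uploader.observe(on_upload, names='value')
    display(uploader, output)
```

Calling show_prediction_widget(learn) in a cell should then give you an upload control with inline predictions.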
Great idea! I had the exact same thought after watching the FileDeleter example. I’ll try and submit a pull request to the fastai library’s widgets section.
Yes, that’s correct, @MicPie, thank you. Now I think I need to learn PyTorch more seriously; there are many basic things that I don’t know. I have just started the Udacity PyTorch course for that.
Hi Jeremy,
In the dev_nb/104c_single_image_pred.ipynb file:

    img = open_image(get_image_files(path)[0])
    learn.predict(img)

The "img" here is the original, untransformed image. Do I need to transform and normalize that image?