Putting the Model Into Production: Web Apps

If there is a good tutorial on how to put your model on the internet using a Dockerfile or anything else, please share the link here. I haven’t found any, and I’m not even sure where I should start.

4 Likes

I don’t know about Dockerfiles, but the most straightforward host for me to work with has been pythonanywhere.com. You’d have to upgrade to the $12/month plan to be able to store 5 GB of files.

Thanks Mauro. That seems like a cool site and maybe I’ll use it. Can someone with knowledge of Dockerfiles tell us why they might be a better option? I don’t like the price of this service, and I’m also worried about how the site would handle it if my software got millions of users.

Unfortunately the approach that @simonw demonstrated only works on v1. AFAIK there are no examples yet of a way to deploy on v2 without spending a lot of money.

2 Likes

I’ll try to create a blog post.

v1 is the only supported way for large images like this.

The free plan supports 3 concurrent instances.
So it’s possible to configure a deployment to always be running. (It’ll never sleep.)

2 Likes

Looking forward to learning how to do this!

Will the v1 API stick around, or will it be removed at some point?

But on the free plan we can’t upload more than 5 MB, and models are around 80 MB.
Maybe SqueezeNet would be below 5 MB, but it isn’t supported at the moment.

Don’t get me wrong, I just bought the premium version and I think it’s worth it, but it’s misleading for someone looking for the cheapest option.

1 Like

That’s a good point, but I think we can host the model somewhere else and download it when the app starts.
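Something along these lines, for example (just a sketch; the URL, paths, and helper name are placeholders, not from any of the deployment examples above):

# Sketch: fetch the exported model at startup instead of bundling it
# with the deployment, so it doesn't count against the upload limit.
from pathlib import Path
from urllib.request import urlretrieve

MODEL_URL = "https://example.com/export.pkl"   # wherever the model is hosted (placeholder)
MODEL_PATH = Path("models/export.pkl")

def ensure_model() -> Path:
    # Download once and reuse the cached copy on later restarts.
    if not MODEL_PATH.exists():
        MODEL_PATH.parent.mkdir(parents=True, exist_ok=True)
        urlretrieve(MODEL_URL, MODEL_PATH)
    return MODEL_PATH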

Yeah. It will stick around.

1 Like

Nice! I had fun playing with the digit drawing one (and looking through the code).

Hi, I have used @simonw’s Starlette code as a reference and modified it to deploy on Heroku.
Along with that, I have made the input and output pages more user-friendly, and I’m applying softmax to the output to display the probabilities.
GitHub repo: https://github.com/nikhilno1/healthy-or-not
My app, which identifies whether a food is healthy or junk, is running here:
https://healthy-or-not.herokuapp.com/
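The softmax part is roughly like this (a sketch of the idea, not the exact code in the repo; the class names here are placeholders):

import torch
import torch.nn.functional as F

# Sketch: convert the model's raw outputs (logits) into probabilities.
logits = torch.tensor([[2.1, -0.3]])           # example scores for two classes
probs = F.softmax(logits, dim=1)[0]            # normalize so they sum to 1
classes = ["healthy", "junk"]                  # placeholder class names
print({c: round(float(p), 2) for c, p in zip(classes, probs)})
# -> {'healthy': 0.92, 'junk': 0.08}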

I’ll share a detailed README for deploying on Heroku shortly.

22 Likes

Here’s a detailed write-up for deploying on Heroku. Let me know if you face any issues. Thanks.

15 Likes

I got a 1-core CPU, 1 GB RAM EC2 instance from AWS because I could get it for free. Now I’m having some trouble. I ran the following commands in the terminal.

sudo apt update
sudo apt upgrade
sudo apt install python3
sudo apt install python3-pip
pip3 install starlette

Everything seemed to work just fine until I ran this line.

pip3 install fastai

At first I got an out-of-memory error. I searched the internet for what to do and found this trick.

pip3 install fastai --no-cache-dir

This solved the memory error, but now I’m getting a different kind of error message.

I didn’t find anything on the internet. I think this is again caused by low memory. If that’s the case, could fastai be changed? It would be good if fastai could also be installed on these low-memory machines, because not everyone is planning on having hundreds of users at the same time. So is the only solution to get more memory? Can I somehow buy it for a short time, so I can install these packages and then switch back to the free tier?

Yes, it’s due to a memory problem. You can try GCP, which will give you $300 of credit that will easily last you the course. I got OOM on a 1.7 GB system; it worked fine when I bumped it up to the next available size, which was 4 GB.

I might use that, but I own some Amazon stock, so if there’s a way to make this work I’ll stick with Amazon.

I used your code (and successfully made a container, yay!) but I’m a bit stuck on the deploy-to-Heroku part. First though, why is snap or snapd needed?

Does anyone know how to get auto-reloading to work with Starlette?

When running my application locally it doesn’t pick up changes unless I restart the server.
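A minimal sketch of what I’m hoping for, assuming the app is defined as app in app.py and served with Uvicorn (whose reload option is supposed to restart the server when files change):

# run.py -- sketch; assumes the Starlette app is served with Uvicorn
import uvicorn

if __name__ == "__main__":
    # reload=True watches the source files and restarts on changes;
    # the app has to be given as an import string for this to work.
    uvicorn.run("app:app", host="0.0.0.0", port=8000, reload=True)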

@arunoda Hi, I’ve tried using Now. Then I tried running the example, and I’m getting this output:

now scale webearbears-cool.now.sh sfo 1

Fetched deployment “zeit-pu60kymsy.now.sh” [164ms]
Error! Cannot scale a deployment containing builds

Also, when I visit the link, the classification takes a long time. In addition, the folder that should contain the model is empty, even though I’ve followed the steps.

Interesting.
Could you ask this question at https://zeit.co/chat?
That way we can get more people to help.

Looking at your URL, I can see that you’ve used our lambda support, which we released today. It’ll need some more work.

Use these fields in your now.json file:

{
  "features": {"cloud": "v1"},
  "version": 1
}