New guide for easy web app deployment

I have never seen this tool before, but it sounds interesting. Definitely worth a try. Is it something built on top of Git? It looks like a Python package specifically tailored to processing ML-related files.

Can we run OpenCV inside a web app created using Now services on Zeit?

Hi all.

@Jeremy, I use another way to deploy my guitar classifier app cheaply: Dokku on a VPC instance. This gives you one or more always-on apps for about $3 a month…

Blog post is here:

8 Likes

I followed the guide exactly as it was written, but the app isn’t working for me: the build fails. I used the Dropbox download link and changed the classes in the server.py file. Can someone help me out? The error is something to do with “invalid load key”.

Building…
Sending build context to Docker daemon 36.86kB
---> ffafb5882b66
Step 2/9 : RUN apt update
---> Using cache
---> 818132902904
Step 3/9 : RUN apt install -y python3-dev gcc
---> Using cache
---> 1f3b479f51ea
Step 4/9 : ADD requirements.txt requirements.txt
---> Using cache
---> Running in 14b0e07be2c2
Traceback (most recent call last):
  File "app/server.py", line 37, in <module>
    learn = loop.run_until_complete(asyncio.gather(*tasks))[0]
  File "/usr/local/lib/python3.6/asyncio/base_events.py", line 473, in run_until_complete
    return future.result()
  File "app/server.py", line 32, in setup_learner
    learn.load(model_file_name)
  File "/usr/local/lib/python3.6/site-packages/fastai/basic_train.py", line 204, in load
    self.model.load_state_dict(torch.load(self.path/self.model_dir/f'{name}.pth', map_location=device))
  File "/usr/local/lib/python3.6/site-packages/torch/serialization.py", line 358, in load
    return _load(f, map_location, pickle_module)
  File "/usr/local/lib/python3.6/site-packages/torch/serialization.py", line 519, in _load
    magic_number = pickle_module.load(f)
_pickle.UnpicklingError: invalid load key, '<'.
Error! Build failed

1 Like

From their site:

“DVC runs on top of any Git repository and is compatible with any standard Git server or provider (Github, Gitlab, etc). Data file contents can be shared by network-accessible storage or any supported cloud solution. DVC offers all the advantages of a distributed version control system — lock-free, local branching, and versioning.”

“Use S3, Azure, GCP, SSH, SFTP, rsync or any network-attached storage to store data. The list of supported protocols is constantly expanding.”

I will definitely check it out when I find the time…

I’m pretty sure that means your model URL is incorrect. Try using it locally with wget and then open the file that gets downloaded to check it’s correct.
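If you’d rather check from Python than with wget, here is a minimal sketch (model_file_url below is just a placeholder for your own link): a real .pth file starts with binary bytes, whereas a share/preview page starts with HTML, and that leading < is exactly what the “invalid load key, ‘<’” error is complaining about.

import urllib.request

# Placeholder URL - substitute your own model link here.
model_file_url = 'https://www.dropbox.com/s/your-id/export.pth?raw=1'

with urllib.request.urlopen(model_file_url) as response:
    head = response.read(16)

# A real .pth file begins with binary pickle/zip bytes; an HTML page begins with b'<'.
if head.lstrip().startswith(b'<'):
    print('This URL serves an HTML page, not the model file - fix the download link.')
else:
    print('Looks like a binary file, first bytes:', head)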

4 Likes

Note: the download link is the one that starts the file download directly; it is normally different from the share link, which presents you with a page to download the file (use https://rawdownload.now.sh/ if needed)

Did you do this step, @Mauro?

If you used Dropbox, the share link normally ends with ?dl=0.
You’ll have to change that to ?raw=1
(or use https://rawdownload.now.sh/).
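If you prefer to do that conversion in code rather than by hand, a tiny (hypothetical) helper is enough:

def to_direct_download(share_link):
    # Turn a Dropbox share link (ending in ?dl=0) into a direct-download link (?raw=1).
    return share_link.replace('?dl=0', '?raw=1')

# Example with a made-up link:
print(to_direct_download('https://www.dropbox.com/s/abc123/export.pth?dl=0'))
# -> https://www.dropbox.com/s/abc123/export.pth?raw=1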

1 Like

I used the download link itself, not the share link. It’s different from the link one gets from rawdownload.now.sh. But it’s working fine for me now that I’m using that site to convert the share link. Thanks.

1 Like

It was the URL. I should have used https://rawdownload.now.sh/; I was using the download link directly.

I’m currently running into this exact pickling issue with the same invalid load key (<).

The Dropbox URL for the model has been converted to the raw-download version. Any advice for debugging this?

Edit: there was already a model file in the app/models directory. I needed to delete it for things to work.
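For anyone hitting the same thing: this behaviour is consistent with a download guard like the one sketched below (an assumption about how the template works, not its exact code), where an existing destination file short-circuits the download, so a corrupted or outdated model never gets replaced until you delete it by hand.

import aiohttp
from pathlib import Path

async def download_file(url, dest: Path):
    # Sketch only: if the file already exists, the download is skipped,
    # which is why a stale models/*.pth has to be deleted before re-deploying.
    if dest.exists():
        return
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            dest.write_bytes(await response.read())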

A new issue has arisen, though. I’m trying to deploy a super-resolution model that returns an image as its output. The standard Zeit template is designed to return a JSONResponse. If my final output is either a torch tensor or a fastai Image object, what is the best way to return it?
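One possible approach, sketched below under a couple of assumptions (a Starlette app like the template’s, and a dummy PIL image standing in for the converted model output): write the image to PNG bytes and return a plain Response with an image media type instead of a JSONResponse.

from io import BytesIO
from PIL import Image
from starlette.applications import Starlette
from starlette.responses import Response

app = Starlette()

@app.route('/analyze', methods=['POST'])
async def analyze(request):
    # Placeholder: run the super-resolution model here and convert its output
    # (torch tensor / fastai Image) to a PIL image; the dummy below stands in for that.
    output_image = Image.new('RGB', (256, 256))
    buffer = BytesIO()
    output_image.save(buffer, format='PNG')
    return Response(buffer.getvalue(), media_type='image/png')

The browser (or whatever client-side call you use) can then display the returned bytes directly as an image.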

1 Like

@sgugger @lesscomfortable Help
I am trying to deploy my app on Zeit, but it keeps giving me errors. I am using the same code that is provided in the deployment guide.

If any of you are struggling with cloud providers for deployment, I’d love for you to try Render. The guide for fastai-v3 is here: https://course-v3.fast.ai/deployment_render.html

We don’t have any size restrictions on Docker images, and I’m around to answer questions and help with debugging. (I’m the founder/CEO of Render and previously built Crestle).

9 Likes

Hello @miwojc. I found your post very interesting. Indeed, I think a “Not recognized” option (or “Other” if you prefer) must be provided in order to use the “Classifier in Real Life”.

Following @martinmm’s idea, here is the code I use when I build a classifier with 2 classes:

import torch  # `learn` is assumed to be a trained fastai Learner already in scope

def predict_label(img, ths, model=learn):

    # Condition function: is the predicted class `idx`, and is its probability
    # at or above that class's threshold? (`indice` and `value` are assigned
    # below, before this closure is called.)
    def cond(idx):
        return (indice == idx) and (value >= ths[idx])

    # Get prediction values
    cat, indice, preds = model.predict(img)  # predicted category, its index, class probabilities
    value, _ = torch.max(preds, 0)           # highest class probability
    indice = indice.item()
    value = value.item()

    if cond(0) or cond(1):
        print(cat)
    else:
        print('Not recognized')

Then you call predict_label(img, ths) to get the label of class 0, the label of class 1, or “Not recognized” for an image classified by the model.

Note: the argument ths is a list [th0, th1]. th0 and th1 are the thresholds for classes 0 and 1 above which the class label is given. You need to analyze your predictions to find the best values for these thresholds.
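As a rough sketch of that search (assuming you already have validation probabilities and labels, e.g. from learn.get_preds() in fastai v1; all names below are placeholders), you can scan candidate (th0, th1) pairs and look at how many predictions get accepted and how accurate the accepted ones are:

import torch

def scan_thresholds(probs, targets, candidate_pairs):
    # probs: N x 2 tensor of class probabilities; targets: N tensor of true labels.
    values, indices = torch.max(probs, dim=1)
    for th0, th1 in candidate_pairs:
        ths = torch.tensor([th0, th1])
        accepted = values >= ths[indices]  # per-class threshold, as in predict_label
        coverage = accepted.float().mean().item()
        if accepted.any():
            accuracy = (indices[accepted] == targets[accepted]).float().mean().item()
        else:
            accuracy = float('nan')
        print(f'th0={th0:.2f} th1={th1:.2f}  accepted={coverage:.1%}  accuracy on accepted={accuracy:.1%}')

# Example with dummy data (replace with your own validation predictions):
probs = torch.softmax(torch.randn(100, 2), dim=1)
targets = torch.randint(0, 2, (100,))
scan_thresholds(probs, targets, [(a, b) for a in (0.5, 0.7, 0.9) for b in (0.5, 0.7, 0.9)])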

Thanks for the post @anurag.

Can you give us an idea of how your service compares with Zeit? I’ve been happy with the latter thus far, but have found their messaging confusing regarding Docker support and so forth. I would be looking to explore this for both personal and professional projects.

I tried to deploy my model with Zeit but I couldn’t :slight_smile: I have a Mac and I am using the Now app on my computer. Also, I couldn’t upgrade. :persevere:

The biggest difference is that we plan to offer first-class Docker support for as long as developers are using Docker, and our Docker offering is only going to get better (we’re going to introduce Docker persistence so you can run WordPress or MySQL on Render).

We also don’t have restrictions on Docker image sizes, unlike Zeit, where you have to pay $50/month to go over 100MB and $200/month to go over 500MB. With Render, you’ll just pay $5/month.

2 Likes

Yasin: feel free to sign up for Render with code fastai-v3.

@anurag Thank you, I will try. I want to build a skin cancer classifier. I trained the model but I couldn’t deploy it yet. How can I solve the problem on Zeit?

I don’t think you can since they’ve closed v1 off to new signups from what I can tell.

@anurag Render is really good and easy, but I have a problem again. Any ideas? :slightly_smiling_face: