Deploying to Heroku (is so easy)

I deployed my lesson 2 classifier to Binder the other day, but for some reason Binder has been on its knees for two days, running like a dog.

I had a look at Heroku and I'm actually much happier. Here is a quick deployment guide for anyone who's interested.

That's my demo project website. The repo needs: requirements.txt + Procfile + notebooks.

  • requirements.txt - needs editing with all the modules you need. The PyTorch versions in mine are the small CPU-only builds, which are what you want for deployment. voila is what renders your notebooks as web pages.
  • Procfile - tells Heroku what to do when your app starts up. If you want to display a particular notebook, add notebookname.ipynb to the end of the string in my Procfile, but then it will only display that notebook and no others. I want a “homepage” with links to different demos as I add more, so I have a default.ipynb notebook which links to the others, and I hand out the link as https://joedockrill.herokuapp.com/voila/render/default.ipynb instead of https://joedockrill.herokuapp.com/, which just displays a list of notebooks for you to pick from.
  • Add some notebooks to the repo.
  • Go to heroku.com, create an account, point it at your repo and press deploy.
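For anyone who wants a concrete starting point, a minimal Procfile and requirements.txt for a Voilà deployment might look like the sketch below. The exact voila flags, folder layout and package versions are my assumptions (the torch wheel URL is the CPU-only style used elsewhere in this thread), so check them against the Voilà docs and your own environment:

```text
# Procfile -- serve the notebooks directory with voila on Heroku's assigned port
web: voila --port=$PORT --no-browser notebooks/

# requirements.txt -- a CPU-only torch wheel keeps the slug small
voila
fastai
https://download.pytorch.org/whl/cpu/torch-1.7.1%2Bcpu-cp38-cp38-linux_x86_64.whl
```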

Hi joedockrill I hope you are having a beautiful weekend!

I tried Heroku about a year ago and it wasn’t clear or easy!

Thanks for posting your notebooks and tips; they will be helpful to many.

Cheers mrfabulous1 :smile: :smile:


Hi sachin93, hope all is well! Were the steps to install your app on heroku.com difficult or easy?

Could you give us some details please? There are many people who do not deploy their models, as even $7.00 a month on render.com is a lot if you don't have it.

Cheers mrfabulous1 :grinning: :grinning:


@mrfabulous1 I found them to be easy. I don't have money/a credit card, so I wasn't able to use Render. I have added the details in my GitHub README, but I'll also add them here:

  1. I used the starter code for deploying fast.ai on Render. Fork that repository and use it as a template to create your own, like I did for my Pokemon Classifier.
  2. We need to add a Procfile to the repository containing web: python app/server.py serve.
  3. We need to add a runtime.txt file to specify the Python version. You can use my repository for reference.
  4. Upload the export.pkl file you generated to Google Drive, get a shareable link (set to “Anyone can view”, not restricted), use this link generator to get a direct-download link, and set it as export_file_url. You also need to change classes = ['black', 'grizzly', 'teddys'] to your own classes; in my case, classes = ['bulbasaur', 'charmander', 'pikachu']. In the server.py file we need to add:
import os
import requests
# Heroku supplies the port to bind via the PORT environment variable
Port = int(os.environ.get('PORT', 50000))
export_file_url = 'https://drive.google.com/uc?export=download&id=1ntlwwv3Ao3kLJ_VaXgI6Gx3_01HG1Zbz'
export_file_name = 'export.pkl'
  5. Replace
uvicorn.run(app=app, host='0.0.0.0', port=5000, log_level="info")

by

uvicorn.run(app=app, host='0.0.0.0', port=Port, log_level="info")
  6. Change the index.html file, as it still has the bear app details. Update it for your app; this is what users will see when they open your application.
  7. Once your GitHub repository is ready, create an account on Heroku.
  8. Create a new app, give it a name, connect your GitHub repository, choose Automatic Deploys, and then do a Manual Deploy.
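The PORT handling in the steps above can be sketched as a small helper; the default of 5000 matches the uvicorn line above, while 50000 was used in the server.py snippet, so pick whichever default suits your local setup:

```python
import os

# Heroku assigns the port to bind at dyno startup via the PORT
# environment variable; fall back to a default for local runs.
def get_port(default=5000):
    return int(os.environ.get("PORT", default))

port = get_port()
```

You would then pass this value as the `port` argument to `uvicorn.run`, as in step 5.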

P.S. - I made my own dataset using the process described in Lesson 2. I tried this with Kaggle datasets and it doesn't work with them: according to the application logs, '/kaggle' is read-only, so there was an error even though my build had succeeded. As I'm new to all this I don't know the workaround, so I tried it with a non-Kaggle dataset (my own) and it worked.

Reach out to me in case of any queries.

Best,
Sachin Chaturvedi


Can you please show where to change WEB_CONCURRENCY in the Heroku app? I am not able to find it.

@mmd

Click on Settings, scroll down, click Reveal Config Vars, and add WEB_CONCURRENCY as the KEY with 1 as the VALUE. That should work.
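If you prefer the command line to the dashboard, the same config var can be set with the Heroku CLI; the app name below is a placeholder:

```shell
# set WEB_CONCURRENCY=1 on your app (replace your-app-name)
heroku config:set WEB_CONCURRENCY=1 --app your-app-name

# list the current config vars to verify
heroku config --app your-app-name
```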


Thanks for helping, but I am still getting the same error.

@mmd Check the application logs. At the top right, next to Open App, click More and view the application logs.



Still not working. From the logs I found it is not able to unzip export.pkl, but I don't know how to fix it.

pytorch 1.6 saves models as a zip file; sounds like you've managed to update pytorch without updating fastai.

you could either

pip install fastai --upgrade

and export your model again if you've got the .pth file saved, or pin pytorch 1.6 in your requirements.txt
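In requirements.txt terms, the second option looks something like the sketch below; the exact fastai line is a placeholder, the point is just that torch and fastai must agree on the checkpoint format:

```text
# requirements.txt -- pin torch to 1.6 to match the zip-style checkpoint
torch==1.6.0
fastai
```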


Thanks a lot for your help. It finally worked! @joedockrill


I am deploying my pneumonia detection web app using Heroku but encountered a slug size error. How do I deal with it?

-----> Python app detected
-----> Installing python-3.6.12
-----> Installing pip 20.1.1, setuptools 47.1.1 and wheel 0.34.2
-----> Installing SQLite3
-----> Installing requirements with pip
Collecting Flask~=1.1.2
Downloading Flask-1.1.2-py2.py3-none-any.whl (94 kB)
Collecting gunicorn
Downloading gunicorn-20.0.4-py2.py3-none-any.whl (77 kB)
Collecting numpy~=1.19.3
Downloading numpy-1.19.5-cp36-cp36m-manylinux2010_x86_64.whl (14.8 MB)
Collecting torch-nightly==1.0.0.dev20181129
Downloading https://download.pytorch.org/whl/nightly/cpu/torch_nightly-1.0.0.dev20181129-cp36-cp36m-linux_x86_64.whl (69.2 MB)
Collecting fastai==1.0.61
Downloading fastai-1.0.61-py3-none-any.whl (239 kB)
Collecting itsdangerous>=0.24
Downloading itsdangerous-1.1.0-py2.py3-none-any.whl (16 kB)
Collecting Werkzeug>=0.15
Downloading Werkzeug-1.0.1-py2.py3-none-any.whl (298 kB)
Collecting Jinja2>=2.10.1
Downloading Jinja2-2.11.2-py2.py3-none-any.whl (125 kB)
Collecting click>=5.1
Downloading click-7.1.2-py2.py3-none-any.whl (82 kB)
Collecting Pillow
Downloading Pillow-8.1.0-cp36-cp36m-manylinux1_x86_64.whl (2.2 MB)
Collecting dataclasses; python_version < “3.7”
Downloading dataclasses-0.8-py3-none-any.whl (19 kB)
Collecting pyyaml
Downloading PyYAML-5.3.1.tar.gz (269 kB)
Collecting spacy>=2.0.18; python_version < “3.8”
Downloading spacy-2.3.5-cp36-cp36m-manylinux2014_x86_64.whl (10.4 MB)
Collecting requests
Downloading requests-2.25.1-py2.py3-none-any.whl (61 kB)
Collecting nvidia-ml-py3
Downloading nvidia-ml-py3-7.352.0.tar.gz (19 kB)
Collecting torchvision
Downloading torchvision-0.8.2-cp36-cp36m-manylinux1_x86_64.whl (12.8 MB)
Collecting bottleneck
Downloading Bottleneck-1.3.2.tar.gz (88 kB)
Installing build dependencies: started
Installing build dependencies: finished with status ‘done’
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status ‘done’
Preparing wheel metadata: started
Preparing wheel metadata: finished with status ‘done’
Collecting pandas
Downloading pandas-1.1.5-cp36-cp36m-manylinux1_x86_64.whl (9.5 MB)
Collecting packaging
Downloading packaging-20.8-py2.py3-none-any.whl (39 kB)
Collecting numexpr
Downloading numexpr-2.7.2-cp36-cp36m-manylinux2010_x86_64.whl (469 kB)
Collecting torch>=1.0.0
Downloading torch-1.7.1-cp36-cp36m-manylinux1_x86_64.whl (776.8 MB)
Collecting scipy
Downloading scipy-1.5.4-cp36-cp36m-manylinux1_x86_64.whl (25.9 MB)
Collecting beautifulsoup4
Downloading beautifulsoup4-4.9.3-py3-none-any.whl (115 kB)
Collecting fastprogress>=0.2.1
Downloading fastprogress-1.0.0-py3-none-any.whl (12 kB)
Collecting matplotlib
Downloading matplotlib-3.3.3-cp36-cp36m-manylinux1_x86_64.whl (11.6 MB)
Collecting MarkupSafe>=0.23
Downloading MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl (27 kB)
Collecting wasabi<1.1.0,>=0.4.0
Downloading wasabi-0.8.0-py3-none-any.whl (23 kB)
Collecting plac<1.2.0,>=0.9.6
Downloading plac-1.1.3-py2.py3-none-any.whl (20 kB)
Collecting cymem<2.1.0,>=2.0.2
Downloading cymem-2.0.5-cp36-cp36m-manylinux2014_x86_64.whl (35 kB)
Collecting blis<0.8.0,>=0.4.0
Downloading blis-0.7.4-cp36-cp36m-manylinux2014_x86_64.whl (9.8 MB)
Collecting preshed<3.1.0,>=3.0.2
Downloading preshed-3.0.5-cp36-cp36m-manylinux2014_x86_64.whl (126 kB)
Collecting tqdm<5.0.0,>=4.38.0
Downloading tqdm-4.55.1-py2.py3-none-any.whl (68 kB)
Collecting srsly<1.1.0,>=1.0.2
Downloading srsly-1.0.5-cp36-cp36m-manylinux2014_x86_64.whl (184 kB)
Collecting thinc<7.5.0,>=7.4.1
Downloading thinc-7.4.5-cp36-cp36m-manylinux2014_x86_64.whl (1.1 MB)
Collecting murmurhash<1.1.0,>=0.28.0
Downloading murmurhash-1.0.5-cp36-cp36m-manylinux2014_x86_64.whl (20 kB)
Collecting catalogue<1.1.0,>=0.0.7
Downloading catalogue-1.0.0-py2.py3-none-any.whl (7.7 kB)
Collecting certifi>=2017.4.17
Downloading certifi-2020.12.5-py2.py3-none-any.whl (147 kB)
Collecting idna<3,>=2.5
Downloading idna-2.10-py2.py3-none-any.whl (58 kB)
Collecting chardet<5,>=3.0.2
Downloading chardet-4.0.0-py2.py3-none-any.whl (178 kB)
Collecting urllib3<1.27,>=1.21.1
Downloading urllib3-1.26.2-py2.py3-none-any.whl (136 kB)
Collecting python-dateutil>=2.7.3
Downloading python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
Collecting pytz>=2017.2
Downloading pytz-2020.5-py2.py3-none-any.whl (510 kB)
Collecting pyparsing>=2.0.2
Downloading pyparsing-2.4.7-py2.py3-none-any.whl (67 kB)
Collecting typing-extensions
Downloading typing_extensions-3.7.4.3-py3-none-any.whl (22 kB)
Collecting soupsieve>1.2; python_version >= “3.0”
Downloading soupsieve-2.1-py3-none-any.whl (32 kB)
Collecting kiwisolver>=1.0.1
Downloading kiwisolver-1.3.1-cp36-cp36m-manylinux1_x86_64.whl (1.1 MB)
Collecting cycler>=0.10
Downloading cycler-0.10.0-py2.py3-none-any.whl (6.5 kB)
Collecting importlib-metadata>=0.20; python_version < “3.8”
Downloading importlib_metadata-3.3.0-py3-none-any.whl (10 kB)
Collecting six>=1.5
Downloading six-1.15.0-py2.py3-none-any.whl (10 kB)
Collecting zipp>=0.5
Downloading zipp-3.4.0-py3-none-any.whl (5.2 kB)
Building wheels for collected packages: pyyaml, nvidia-ml-py3, bottleneck
Building wheel for pyyaml (setup.py): started
Building wheel for pyyaml (setup.py): finished with status ‘done’
Created wheel for pyyaml: filename=PyYAML-5.3.1-cp36-cp36m-linux_x86_64.whl size=402155 sha256=7f955822d3655a1fe5a0077f2abe640bb8b38eca9330f89ce007eaab813b5f76
Stored in directory: /tmp/pip-ephem-wheel-cache-eqh4v8ri/wheels/e5/9d/ad/2ee53cf262cba1ffd8afe1487eef788ea3f260b7e6232a80fc
Building wheel for nvidia-ml-py3 (setup.py): started
Building wheel for nvidia-ml-py3 (setup.py): finished with status ‘done’
Created wheel for nvidia-ml-py3: filename=nvidia_ml_py3-7.352.0-py3-none-any.whl size=19189 sha256=27487daab20aeb3ff5414065c2e609123ea766e5a274149c51fa7440a567bda2
Stored in directory: /tmp/pip-ephem-wheel-cache-eqh4v8ri/wheels/7f/26/a3/33f2079871e2bebb3f53a2b21c3ec64129b8efdd18a6263a52
Building wheel for bottleneck (PEP 517): started
Building wheel for bottleneck (PEP 517): finished with status ‘done’
Created wheel for bottleneck: filename=Bottleneck-1.3.2-cp36-cp36m-linux_x86_64.whl size=331061 sha256=af5eb39acf4608f6d42d6df77e0b56f564bc3a4801254e03f2ecaeb4d205505e
Stored in directory: /tmp/pip-ephem-wheel-cache-eqh4v8ri/wheels/f7/a7/14/9be836efed01ac0eb3c125ac006c143b55ebf689269877d0e8
Successfully built pyyaml nvidia-ml-py3 bottleneck
Installing collected packages: itsdangerous, Werkzeug, MarkupSafe, Jinja2, click, Flask, gunicorn, numpy, torch-nightly, Pillow, dataclasses, pyyaml, wasabi, plac, cymem, blis, murmurhash, preshed, certifi, idna, chardet, urllib3, requests, tqdm, srsly, typing-extensions, zipp, importlib-metadata, catalogue, thinc, spacy, nvidia-ml-py3, torch, torchvision, bottleneck, six, python-dateutil, pytz, pandas, pyparsing, packaging, numexpr, scipy, soupsieve, beautifulsoup4, fastprogress, kiwisolver, cycler, matplotlib, fastai
Successfully installed Flask-1.1.2 Jinja2-2.11.2 MarkupSafe-1.1.1 Pillow-8.1.0 Werkzeug-1.0.1 beautifulsoup4-4.9.3 blis-0.7.4 bottleneck-1.3.2 catalogue-1.0.0 certifi-2020.12.5 chardet-4.0.0 click-7.1.2 cycler-0.10.0 cymem-2.0.5 dataclasses-0.8 fastai-1.0.61 fastprogress-1.0.0 gunicorn-20.0.4 idna-2.10 importlib-metadata-3.3.0 itsdangerous-1.1.0 kiwisolver-1.3.1 matplotlib-3.3.3 murmurhash-1.0.5 numexpr-2.7.2 numpy-1.19.5 nvidia-ml-py3-7.352.0 packaging-20.8 pandas-1.1.5 plac-1.1.3 preshed-3.0.5 pyparsing-2.4.7 python-dateutil-2.8.1 pytz-2020.5 pyyaml-5.3.1 requests-2.25.1 scipy-1.5.4 six-1.15.0 soupsieve-2.1 spacy-2.3.5 srsly-1.0.5 thinc-7.4.5 torch-1.7.1 torch-nightly-1.0.0.dev20181129 torchvision-0.8.2 tqdm-4.55.1 typing-extensions-3.7.4.3 urllib3-1.26.2 wasabi-0.8.0 zipp-3.4.0
-----> Discovering process types
Procfile declares types -> web
-----> Compressing…
! Compiled slug size: 1.2G is too large (max is 500M).
! See: http://devcenter.heroku.com/articles/slug-size
! Push failed


@Gaurav20 Heroku has a slug size limit of 500 MB; yours is 1.2 GB.
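One likely culprit, going by my reading of the log above (not something Gaurav has confirmed): the build installs both torch-nightly (69 MB) and the full torch-1.7.1 wheel (776 MB), because fastai pulls in torch>=1.0.0 on its own. Replacing those with a single CPU-only wheel, as others in this thread do, keeps the slug well under the limit; exact versions here are assumptions:

```text
# requirements.txt -- drop torch-nightly, use one CPU-only torch wheel
https://download.pytorch.org/whl/cpu/torch-1.7.1%2Bcpu-cp36-cp36m-linux_x86_64.whl
fastai==1.0.61
```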

How do I reduce the Heroku slug size?


Hi Gaurav20 hope all is well and you are having a wonderful day!

Maybe this link will help resolve your problem.

Cheers mrfabulous1 :smiley: :smiley:


I'm having trouble with Heroku and the learner.predict() method. I'm basically running the bears classifier lesson but with my own image set. It works perfectly on Paperspace and locally, in a notebook and on Voilà, but when I deploy to Heroku the following line seems to crash the on_click function:
pred,pred_idx,probs = learn_inf.predict(img)
I narrowed it down to this line via trial and error with print statements. I've tried changing versions in the requirements.txt file and adding other packages as suggested in various forum posts, but I just can't seem to get any predictions showing on Heroku.
Any ideas would be very much appreciated. Code below:

from fastai.vision.all import *
from fastai.vision.widgets import *
path = Path()
learn_inf = load_learner(path/'facesMarch12.pkl', cpu=True)
btn_upload = widgets.FileUpload()
out_pl = widgets.Output()
lbl_pred = widgets.Label()
lbl_prob = widgets.Label()
def on_click_classify(change):
    img = PILImage.create(btn_upload.data[-1])
    out_pl.clear_output()
    with out_pl: display(img.to_thumb(128,128))
    pred,pred_idx,probs = learn_inf.predict(img)  # this line seems to crash this function
    lbl_pred.value = ('Prediction: {}'.format(pred))
    lbl_prob.value = ('Probability: {:0.4f}'.format(probs[pred_idx]))
btn_upload.observe(on_click_classify, names=['data'])
display(VBox([widgets.Label('Select your face image'), btn_upload, out_pl, lbl_pred, lbl_prob])) 

I struggled with the same thing for a while. This is what worked for me:
requirements.txt:

https://download.pytorch.org/whl/cpu/torch-1.7.1%2Bcpu-cp38-cp38-linux_x86_64.whl
https://download.pytorch.org/whl/cpu/torchvision-0.8.0-cp38-cp38-linux_x86_64.whl
fastai==2.1.8
voila
ipywidgets

runtime.txt:

python-3.8.8

If it still does not work, I'd suggest looking at the versions in your local environment and trying to recreate them as closely as possible in your requirements/runtime files.
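A quick way to capture those local versions for your files is a small helper sketch like this (assuming Python 3.8+ for importlib.metadata):

```python
import platform
from importlib.metadata import version  # available from Python 3.8

def runtime_line():
    # Heroku's runtime.txt expects the interpreter spelled like "python-3.8.8"
    return "python-" + platform.python_version()

def requirement_line(pkg):
    # pin a locally installed package for requirements.txt, e.g. "voila==0.2.7"
    return pkg + "==" + version(pkg)

print(runtime_line())           # paste into runtime.txt
print(requirement_line("pip"))  # repeat for fastai, voila, torch, etc.
```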


How to Deploy Fast.ai Models? (Voilà, Binder and Heroku)

Medium article:

code:

I hope this helps.

Hi, I'm a new fast.ai user. I tried deploying to both Binder and Heroku, but it always came up with this screen:


My GitHub repo is here. Any help would be much appreciated!

The deploy logs seem alright for both Heroku and Binder; I can't figure out why it's taking so long (it's always stuck at 16/33): https://bear-classifier-dan.herokuapp.com/