Deployment Platform: Render ✅

Colab / training platform
https://pastebin.com/MYDkmdXT

Render
https://pastebin.com/X2TiJWtw

Hi andrew77 :smiley: :smiley:

The library versions you trained with are different from the ones on your deployment platform.

I would suggest you start here: change the libraries in your Render requirements.txt to match the library versions on the platform you trained on.

If you search this forum, you'll find there was an issue with Pillow; you may need version 6.x.
NB: back up your work before making changes.
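One way to capture the exact training versions is to print them from the notebook itself and paste the result straight into requirements.txt. A minimal sketch; the package list mirrors the requirements files in this thread, so adjust it to your own:

```python
# Run this in the training notebook (e.g. on Colab) to capture the exact
# versions your model was trained with, in requirements.txt pin format.
from importlib.metadata import version, PackageNotFoundError

# Packages taken from the requirements files in this thread; edit to taste.
packages = ["fastai", "torch", "torchvision", "numpy", "pillow"]

for name in packages:
    try:
        print(f"{name}=={version(name)}")
    except PackageNotFoundError:
        print(f"# {name} not installed")
```

Copy the printed lines into your Render requirements.txt so the deployment exactly matches the training environment.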

Cheers mrfabulous1 :smiley: :smiley:


My solution, and I think it'll work for everyone, is to edit the file directly on GitHub instead of downloading it. It's easy, and there's no hassle with versions.

@mrfabulous1,

Thanks for your prompt reply.

I tried modifying it to the following but was unsuccessful. I think it's a version mismatch between fastai, torch, and torchvision. Is there a 'fail-proof' way to do this?

Thanks

requirements.txt
aiofiles==0.4.0
aiohttp==3.5.4
asyncio==3.4.3
fastai==1.0.52
torch==1.4.0
torchvision==0.5.0
numpy==1.16.3
pillow~=6.0
python-multipart==0.0.5
starlette==0.12.0
uvicorn==0.7.1

Error message
https://pastebin.com/FNBBt7Xp

Hi andrew77 :smiley:
You need to change the fastai version too. Your requirements.txt should match the library versions in your Colab environment, as those are the ones you trained your model on.

Cheers mrfabulous1 :smiley: :smiley:

Thanks, it's working now.

Sharing my requirements.txt for Colab users:

aiofiles==0.4.0
aiohttp==3.5.4
asyncio==3.4.3
fastai==1.0.60
torch==1.4.0
torchvision==0.5.0
numpy==1.16.3
pillow~=6.0
python-multipart==0.0.5
starlette==0.12.0
uvicorn==0.7.1

Hi andrew77, hooray! :trophy:

Well done! Remember, you will have to go through this process every time you make a model, as you never know whether a library has had even a minor change that may break another library.

Doing these steps makes finding faults much easier.

Cheers mrfabulous1 :smiley: :smiley:

Hey guys, Render has worked fantastically so far! I started with GCP but gave up after an hour; the Render deploy went seamlessly. I still need to work on the model, but considering this is my first deployment ever, I am loving it!
https://house-plant-classifier.onrender.com/


Hello folks, I would like to ask you a very silly question because I see everyone is doing a great job deploying apps.

Context: I tried to deploy mine; in fact, I just changed the model URL in the server.py file, but the application is the same: detect bears. I tried this because I wanted to see for myself that it really worked before coding my own application.

Problem: If you go to my site's URL, the page is all white, blank, nada. Should I wait some time for it to deploy? Did I miss something? I even added my credit card number because I thought that was the reason.
I have temporarily suspended the site to avoid charges.

More context:

  • The requirements.txt file was updated because I trained the model using Colab.
  • Render has permission to access my Github account.
  • I carefully followed the instructions in the production section, I don’t know what could be wrong.
  • I selected the starter plan. 512 MB, shared CPU.

Any ideas? Thank you for your help.

Hi kuro_inu, hope you are having a wonderful day!

If you look at the majority of posts in this thread, they all include an error message, which makes it easier to resolve the issue.
My suggestions would be:

  1. Test your repository on your local machine, as you can see errors there that don’t show up on Render (remember to run it in a virtual environment).
  2. Check the Render console to make sure there are no errors.
    The console normally tells you whether the app has been deployed successfully.
  3. I presume you mean the browser display is empty? You could check that the browser console has no errors you are not seeing.

Hopefully if you do all the above you will find your error.

Cheers mrfabulous1 :smiley: :smiley:

Instead of downloading the model every time, is there a way to store it on Render and access it directly?
If so, how do I access the model stored on Render?
Will storing it on Render increase the response time?

Hi Johnyquest, I hope you are having a wonderful day!

Instead of downloading the model every time, is there a way to store it in render and access it directly?

Create a copy of your current working repository as a backup.
Add the model to a directory in the copied repository.
Edit server.py to point to the model file in your repository.
The lines below are the ones in question.

export_file_url = 'https://www.dropbox.com/s/6bgq8t6yextloqp/export.pkl?raw=1'
export_file_name = 'export.pkl'
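As a hedged sketch of that edit (the app/models/ location is an assumption for illustration, not necessarily the template's actual layout):

```python
# Sketch only: store export.pkl in the repo and point server.py at it.
# The app/models/ directory is a hypothetical location; use whatever
# directory you commit the model file to.
from pathlib import Path

export_file_name = 'export.pkl'
model_dir = Path('app') / 'models'   # repo-relative directory holding the model

# With the file in the repo, the download step (export_file_url plus the
# download call) is no longer needed; in fastai v1 the learner can be
# loaded straight from the local directory, e.g.:
#   learn = load_learner(model_dir, export_file_name)
print(model_dir / export_file_name)
```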

If so how do I access the model stored in render?

Your model is accessed in the same way as it was when it was on, say, Google Drive; it's just now being accessed locally thanks to the server.py edit.

Will storing it in render increase the response time?

Not sure! As Jeremy would say: try it, time it, and let us know on this forum so others don't have to wonder.

My guess is it should be quicker, as disk access is normally faster than web access.
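One way to find out: time the request yourself before and after the change. A minimal sketch; the Render URL in the comment is a placeholder, not a real endpoint:

```python
# Small helper to time how long a request (or any callable) takes, so you
# can compare response times before and after moving the model into the repo.
import time
import urllib.request

def avg_seconds(fn, n=5):
    """Average wall-clock seconds per call of fn over n calls."""
    total = 0.0
    for _ in range(n):
        start = time.perf_counter()
        fn()
        total += time.perf_counter() - start
    return total / n

# Example (uncomment and substitute your own app's URL):
# url = 'https://your-app.onrender.com/'
# print(avg_seconds(lambda: urllib.request.urlopen(url).read()))
```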

One point to note: sometimes the code plus the model is larger than the disk space allowed by the price plan you have purchased.
If this is the case, the above will not work. Some of the posts in this thread describe this problem.

Hope this helps

Cheers mrfabulous1 :grinning: :grinning:


Thanks. Will try it.

Hello Anurag,

This is a great guide you have prepared. This was my first web deployment and first deep learning model; I am new to this but enjoying it a lot.
Here is a link to my web app for classifying sports images: https://sports-image-classifier.onrender.com/

So I am getting this error:
ERROR: Exception in ASGI application
and
AttributeError: ‘Conv2d’ object has no attribute ‘padding_mode’
My pip list is this:

absl-py 0.9.0
alabaster 0.7.12
anaconda-client 1.7.2
anaconda-project 0.8.3
argh 0.26.2
asn1crypto 1.3.0
astor 0.8.0
astroid 2.4.0
astropy 4.0.1.post1
atomicwrites 1.4.0
attrs 19.3.0
autopep8 1.4.4
Babel 2.8.0
backcall 0.1.0
backports.shutil-get-terminal-size 1.0.0
bcrypt 3.1.7
beautifulsoup4 4.9.0
bitarray 1.2.1
bkcharts 0.2
bleach 3.1.4
blis 0.2.4
bokeh 2.0.2
boto 2.49.0
Bottleneck 1.3.2
certifi 2020.4.5.1
cffi 1.14.0
chardet 3.0.4
click 7.1.2
cloudpickle 1.4.1
clyent 1.2.2
colorama 0.4.3
comtypes 1.1.7
contextlib2 0.6.0.post1
contextvars 2.4
cryptography 2.9.2
cycler 0.10.0
cymem 2.0.2
Cython 0.29.17
cytoolz 0.10.1
dask 2.16.0
dataclasses 0.7
decorator 4.4.2
defusedxml 0.6.0
diff-match-patch 20181111
distributed 2.16.0
docutils 0.16
entrypoints 0.3
et-xmlfile 1.0.1
fastai 1.0.59
fastcache 1.1.0
fastprogress 0.1.22
filelock 3.0.12
flake8 3.7.9
Flask 1.1.2
fsspec 0.7.1
future 0.18.2
gast 0.3.3
gevent 1.4.0
glob2 0.7
greenlet 0.4.15
grpcio 1.27.2
h5py 2.10.0
HeapDict 1.0.1
html5lib 1.0.1
hypothesis 5.11.0
idna 2.9
imageio 2.8.0
imagesize 1.2.0
immutables 0.11
importlib-metadata 1.5.0
intervaltree 3.0.2
ipykernel 5.1.4
ipython 7.13.0
ipython-genutils 0.2.0
ipywidgets 7.5.1
isort 4.3.21
itsdangerous 1.1.0
jdcal 1.4.1
jedi 0.15.2
Jinja2 2.11.2
joblib 0.14.1
json5 0.9.4
jsonschema 3.2.0
jupyter 1.0.0
jupyter-client 6.1.3
jupyter-console 6.1.0
jupyter-core 4.6.3
jupyterlab 1.2.6
jupyterlab-server 1.1.1
Keras-Applications 1.0.8
Keras-Preprocessing 1.1.0
keyring 21.1.1
kiwisolver 1.2.0
lazy-object-proxy 1.4.3
libarchive-c 2.9
llvmlite 0.32.1
locket 0.2.0
lxml 4.5.0
Markdown 3.1.1
MarkupSafe 1.1.1
matplotlib 3.1.3
mccabe 0.6.1
menuinst 1.4.16
mistune 0.8.4
mkl-fft 1.0.15
mkl-random 1.1.0
mkl-service 2.3.0
mock 4.0.2
more-itertools 8.2.0
mpmath 1.1.0
msgpack 1.0.0
multipledispatch 0.6.0
murmurhash 1.0.2
nbconvert 5.6.1
nbformat 5.0.6
networkx 2.4
nltk 3.4.5
nose 1.3.7
notebook 6.0.3
numba 0.49.1
numexpr 2.7.1
numpy 1.18.1
numpydoc 0.9.2
olefile 0.46
opencv-contrib-python 4.2.0.34
opencv-python 4.2.0.34
openpyxl 3.0.3
packaging 20.3
pandas 1.0.3
pandocfilters 1.4.2
paramiko 2.7.1
parso 0.5.2
partd 1.1.0
path 13.1.0
pathlib2 2.3.5
pathtools 0.1.2
patsy 0.5.1
pep8 1.7.1
pexpect 4.8.0
pickleshare 0.7.5
Pillow 7.1.2
pip 20.0.2
pkginfo 1.5.0.1
plac 0.9.6
pluggy 0.13.1
ply 3.11
preshed 2.0.1
prometheus-client 0.7.1
prompt-toolkit 3.0.4
protobuf 3.11.4
psutil 5.7.0
py 1.8.1
pycodestyle 2.5.0
pycosat 0.6.3
pycparser 2.20
pycrypto 2.6.1
pycurl 7.43.0.5
pydocstyle 4.0.1
pyflakes 2.1.1
Pygments 2.6.1
pylint 2.5.0
PyNaCl 1.3.0
pyodbc 4.0.0-unsupported
pyOpenSSL 19.1.0
pyparsing 2.4.7
pyreadline 2.1
pyrsistent 0.16.0
PySocks 1.7.1
pytest 5.4.2
pytest-arraydiff 0.3
pytest-astropy 0.8.0
pytest-astropy-header 0.1.2
pytest-doctestplus 0.5.0
pytest-openfiles 0.5.0
pytest-remotedata 0.3.2
python-dateutil 2.8.1
python-jsonrpc-server 0.3.4
python-language-server 0.31.10
pytz 2020.1
PyWavelets 1.1.1
pywin32 227
pywin32-ctypes 0.2.0
pywinpty 0.5.7
PyYAML 5.3.1
pyzmq 18.1.1
QDarkStyle 2.8.1
QtAwesome 0.7.0
qtconsole 4.7.4
QtPy 1.9.0
requests 2.23.0
rope 0.17.0
Rtree 0.9.4
ruamel-yaml 0.15.87
scikit-image 0.16.2
scikit-learn 0.22.1
scipy 1.4.1
seaborn 0.10.1
Send2Trash 1.5.0
setuptools 46.4.0.post20200518
simplegeneric 0.8.1
singledispatch 3.4.0.3
six 1.14.0
snowballstemmer 2.0.0
sortedcollections 1.1.2
sortedcontainers 2.1.0
soupsieve 2.0
spacy 2.1.8
Sphinx 3.0.3
sphinxcontrib-applehelp 1.0.2
sphinxcontrib-devhelp 1.0.2
sphinxcontrib-htmlhelp 1.0.3
sphinxcontrib-jsmath 1.0.1
sphinxcontrib-qthelp 1.0.3
sphinxcontrib-serializinghtml 1.1.4
sphinxcontrib-websupport 1.2.1
spyder 4.1.3
spyder-kernels 1.9.1
SQLAlchemy 1.3.16
srsly 0.1.0
statsmodels 0.11.0
sympy 1.5.1
tables 3.6.1
tblib 1.6.0
tensorboard 1.14.0
tensorflow 1.14.0
tensorflow-estimator 1.14.0
termcolor 1.1.0
terminado 0.8.3
testpath 0.4.4
thinc 7.0.8
toml 0.10.0
toolz 0.10.0
torch 1.0.0
torchvision 0.2.1
tornado 6.0.4
tqdm 4.46.0
traitlets 4.3.3
typed-ast 1.4.1
typing-extensions 3.7.4.1
ujson 1.35
unicodecsv 0.14.1
urllib3 1.25.8
wasabi 0.2.2
watchdog 0.10.2
wcwidth 0.1.9
webencodings 0.5.1
Werkzeug 1.0.1
wheel 0.34.2
widgetsnbextension 3.5.1
win-inet-pton 1.1.0
win-unicode-console 0.5
wincertstore 0.2
wrapt 1.11.2
xlrd 1.2.0
XlsxWriter 1.2.8
xlwings 0.19.0
xlwt 1.3.0
yapf 0.28.0
zict 2.0.0
zipp 3.1.0

and the requirements file looks like this:

aiofiles==0.4.0
aiohttp==3.5.4
asyncio==3.4.3
fastai==1.0.52
https://download.pytorch.org/whl/cpu/torch-1.1.0-cp37-cp37m-linux_x86_64.whl
https://download.pytorch.org/whl/cpu/torchvision-0.3.0-cp37-cp37m-linux_x86_64.whl
numpy==1.16.3
pillow~=6.0
python-multipart==0.0.5
starlette==0.12.0
uvicorn==0.7.1

How do I fix this?

Hi obiwan, I hope you are well and having a beautiful day!

The basic requirement for getting a model working on render.com is to match the library versions in requirements.txt to the libraries on the platform you trained your model on.

A quick look at your requirements.txt shows that your numpy version should be 1.18.1, not 1.16.3.

The error you have is normally associated with a mismatch with torch and torchvision.

The versions in your pip list are:
torch==1.0.0
torchvision==0.2.1

Replace the https lines in your requirements.txt with the pins above.

Check the other libraries too; if they are used on Colab, make sure they match.

There are many posts in this thread showing people doing just that.

Also make sure that you run your notebook first, get the latest pip list that your model works with, and do the above steps immediately. Library creators sometimes make minor changes that break another library without bumping the version number; this happens quite often and can take a long time to resolve.
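The matching step above can be sketched as a small script that diffs the requirements.txt pins against the training environment's pip list output (inline strings here for illustration):

```python
# Hedged sketch: compare requirements.txt pins against the training
# environment's `pip list` output to spot mismatches before deploying.

def parse_requirements(text):
    """Map package name -> pinned version from 'name==version' lines,
    skipping comments and direct wheel URLs."""
    pins = {}
    for line in text.splitlines():
        line = line.strip()
        if '==' in line and not line.startswith(('#', 'http')):
            name, version = line.split('==', 1)
            pins[name.strip().lower()] = version.strip()
    return pins

def parse_pip_list(text):
    """Map package name -> version from 'name version' pip list lines."""
    versions = {}
    for line in text.splitlines():
        parts = line.split()
        if len(parts) == 2:
            versions[parts[0].lower()] = parts[1]
    return versions

def mismatches(requirements, pip_list):
    """Return {name: (pinned, trained_with)} where the versions differ."""
    pins = parse_requirements(requirements)
    trained = parse_pip_list(pip_list)
    return {n: (v, trained[n]) for n, v in pins.items()
            if n in trained and trained[n] != v}

# Example with versions taken from this thread:
reqs = "fastai==1.0.52\nnumpy==1.16.3\n"
pip_list = "fastai 1.0.59\nnumpy 1.18.1\n"
print(mismatches(reqs, pip_list))
# {'fastai': ('1.0.52', '1.0.59'), 'numpy': ('1.16.3', '1.18.1')}
```

Every name the script reports needs its pin updated to the trained-with version before redeploying.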

Hope this helps.

Cheers mrfabulous1 :smiley: :smiley:

This works absolutely fine. Thanks a lot!
Now, can I change the text "Use images of teddy bears, black bears, grizzly bears, or all three!"?

And how do I do that?

Hi Anurag,

I am currently following the link below to create and deploy a web app on Render.
https://course.fast.ai/deployment_render.html#customize-the-app-for-your-model

I have trained my image classification model on Google Colab. The link mentions that I need to update something in server.py. Where can I find that file?

I am trying to deploy an image regression model on Render, but I'm getting this error:


Can anyone please help me out with this?

Hi venus, hope all is well!

This error often means that there is a problem with your model or accessing your model.

As you are using a large model, check that your application and model work locally first, then redeploy.

The errors reported are often clearer on your local machine than on Render.

Cheers mrfabulous1 :smiley: :smiley: