Deployment Platform: Render ✅

Hi goldenflatulence
Well done!

mrfabulous1 :smiley::smiley:

1 Like

Hi, guys!
I have the following problem: when I deploy the teddy bears app, everything is fine. But as soon as I change the export.pkl file and the labels, I get an error in the log console.

Any ideas how to deal with that problem?

Hi volcanoflash hope you are well!

  1. I have the following problem: when I deploy the teddy bears app, everything is fine.

Do you mean you have deployed the teddy repository and it works fine on Render?

  2. As soon as I change the export.pkl file and the labels

Do you mean that you have trained a new model and created an export.pkl file on a different platform, such as your desktop or a service provider like Google Colab, and that you have put your export.pkl on a Google shared drive?

  3. I get an error in the log console

You get the error when you have done 1 and 2 and then try to deploy the app.

If you have done exactly what I describe in steps 1, 2 and 3, a good start would be to look at the posts in this thread which contain 'pip list' and do as they say.
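The 'pip list' matching can be sketched as a quick check. Below is a minimal stdlib-only helper; the function names and sample version numbers are hypothetical, not from any post in this thread:

```python
def parse_pip_list(text):
    """Parse `pip list` output ("name version" per line) into a dict."""
    pkgs = {}
    for line in text.splitlines():
        parts = line.split()
        if len(parts) == 2:
            pkgs[parts[0].lower()] = parts[1]
    return pkgs

def find_mismatches(pip_list_text, requirements_text):
    """Return (name, pinned, installed) for every pin that disagrees
    with the training environment's pip list."""
    installed = parse_pip_list(pip_list_text)
    mismatches = []
    for line in requirements_text.splitlines():
        if "==" not in line:
            continue
        name, pinned = line.strip().split("==")
        have = installed.get(name.lower())
        if have is not None and have != pinned:
            mismatches.append((name, pinned, have))
    return mismatches

pip_list = "fastai 1.0.57\ntorch 1.2.0"
reqs = "fastai==1.0.52\ntorch==1.2.0"
print(find_mismatches(pip_list, reqs))  # [('fastai', '1.0.52', '1.0.57')]
```

Any tuple printed is a library whose pin in requirements.txt should be changed to the installed version.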

Cheers mrfabulous1 :smiley::smiley:


1 Like

Does anyone know what the URI is for torchvision 0.2.1? It’s the version I used to export my .pkl on Paperspace, and I can’t seem to find it on the PyTorch website. I’m having the same issues with the teddy bear classifier repo and version numbers not matching up.

Hi korlandril hope all is well!

I haven’t found a specific URL for torchvision 0.2.1.

However, here are some things you can try:

  1. pip install torchvision==0.2.1

  2. Or you can add the following line:

     RUN pip install torchvision==0.2.1

     after the requirements line in the file called “Dockerfile” located in the repository.

  3. As there are lots of dependencies, in some cases where my model was trained months earlier, I have just had to retrain it on the current versions of fastai and update all the associated libraries in my requirements.txt. In that case this is usually the version installed on Google Colab.

One app uses the following requirements.txt; however, sometimes minor changes are made in one library which affect another library, and it’s not always documented, so I have quite a few versions of requirements.txt.
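For illustration, the Dockerfile option above could look roughly like this; everything except the added RUN line is an assumption based on a typical template, not the actual repository file:

```dockerfile
FROM python:3.7-slim

COPY requirements.txt .
RUN pip install -r requirements.txt

# Added after the requirements install so this exact version wins
# (hypothetical placement - adjust to your repository's Dockerfile):
RUN pip install torchvision==0.2.1

COPY app app/
EXPOSE 5042
CMD ["python", "app/server.py", "serve"]
```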


Hope this helps mrfabulous1 :smiley::smiley:


Hi, I’m getting the same error message as several people above:

Sep 12 09:04:31 AM  ERROR: Exception in ASGI application
Sep 12 09:04:31 AM  Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/uvicorn/protocols/http/", line 378, in run_asgi
    asgi = app(self.scope)
TypeError: __call__() missing 2 required positional arguments: 'receive' and 'send'
Sep 12 09:04:31 AM  INFO: ('', 58920) - "GET /favicon.ico HTTP/1.1" 500

I’ve edited my requirements.txt, but when I run !pip list, the list does not contain entries for starlette, aiofiles, or aiohttp. I’ve tried leaving those blank, and as a shot in the dark also tried the values in the most recent post above.

Any insight as to whether their absence from my pip list is causing this error, or whether it might be something else, would be much appreciated!

Hi go_go_gadget hope all is well!

I’ve edited my requirements.txt, but when I run !pip list, the list does not contain entries for starlette, aiofiles, or aiohttp. I’ve tried leaving those blank, and as a shot in the dark also tried the values in the most recent post above.

What platform did you train your model on?

Any insight as to whether their absence from my pip list is causing this error, or whether it might be something else, would be much appreciated!

Can you show the !pip list of the platform you trained the model on?

Also, the client part of the app is written using the starlette library; without it, it is impossible for your app to work.

I believe aiofiles is required if you want to use FileResponse or StaticFiles, and aiohttp is an async HTTP client/server framework built on asyncio.

We will need to resolve these issues first; then either the app will work, or we will tackle whatever errors remain.
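Incidentally, the “missing 2 required positional arguments: ‘receive’ and ‘send’” error earlier in the thread usually points at an ASGI interface mismatch: an older server speaks the two-step ASGI2 protocol (call the app with scope only), while newer starlette apps expect a single ASGI3 call with all three arguments. A stdlib-only sketch that reproduces the same TypeError:

```python
# ASGI3-style app: the server is expected to call it with
# (scope, receive, send) in one call.
async def asgi3_app(scope, receive, send):
    pass

# An ASGI2-style server instead does `asgi = app(scope)` - one argument.
# Against an ASGI3 app, that argument binding fails immediately:
err = None
try:
    asgi3_app({"type": "http"})
except TypeError as e:
    err = str(e)

print(err)  # ... missing 2 required positional arguments: 'receive' and 'send'
```

Pinning uvicorn and starlette in requirements.txt to versions that were released together avoids this mismatch.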

Cheers mrfabulous :smiley::smiley:

Thanks so much for the response, @mrfabulous1! I’m using GCP, and here’s my !pip list:

Package Version

alabaster 0.7.12
anaconda-client 1.7.2
anaconda-project 0.8.3
ansiwrap 0.8.4
arrow 0.14.5
asn1crypto 0.24.0
astroid 2.2.5
astropy 3.2.1
atomicwrites 1.3.0
attrs 19.1.0
Babel 2.7.0
backcall 0.1.0
backports.os 0.1.1
backports.shutil-get-terminal-size 1.0.0
bcolz 1.2.1
beautifulsoup4 4.8.0
binaryornot 0.4.4
bitarray 1.0.1
bkcharts 0.2
bleach 3.1.0
bokeh 1.3.4
boto 2.49.0
Bottleneck 1.2.1
cachetools 3.1.1
certifi 2019.6.16
cffi 1.12.3
chardet 3.0.4
Click 7.0
cloudpickle 1.2.1
clyent 1.2.2
colorama 0.4.1
conda 4.7.11
conda-package-handling 1.3.11
configparser 3.8.1
confuse 1.0.0
contextlib2 0.5.5
cookiecutter 1.6.0
cryptography 2.7
cycler 0.10.0
cymem 2.0.2
Cython 0.29.13
dask 2.3.0
dataclasses 0.6
datalab 1.1.5
ddt 1.2.1
decorator 4.4.0
defusedxml 0.6.0
dill 0.2.9
distributed 2.3.0
docker 4.0.2
docutils 0.15.2
entrypoints 0.3
enum34 1.1.6
et-xmlfile 1.0.1
fairing 0.5.3
fastai 1.0.57
fastcache 1.1.0
fastprogress 0.1.21
filelock 3.0.12
Flask 1.1.1
fsspec 0.4.0
future 0.17.1
gcsfs 0.3.0
gevent 1.4.0
gitdb2 2.0.5
GitPython 3.0.1
glob2 0.7
gmpy2 2.0.8
google-api-core 1.14.2
google-api-python-client 1.7.11
google-auth 1.6.3
google-auth-httplib2 0.0.3
google-auth-oauthlib 0.4.0
google-cloud-bigquery 1.18.0
google-cloud-core 1.0.3
google-cloud-dataproc 0.5.0
google-cloud-datastore 1.9.0
google-cloud-language 1.3.0
google-cloud-logging 1.12.1
google-cloud-monitoring 0.31.1
google-cloud-spanner 1.10.0
google-cloud-storage 1.18.0
google-cloud-translate 1.6.0
google-resumable-media 0.3.2
googleapis-common-protos 1.6.0
greenlet 0.4.15
grpc-google-iam-v1 0.12.3
grpcio 1.23.0
h5py 2.9.0
heapdict 1.0.0
html5lib 1.0.1
htmlmin 0.1.12
httplib2 0.13.1
idna 2.8
imageio 2.5.0
imagesize 1.1.0
importlib-metadata 0.19
ipykernel 5.1.2
ipython 7.7.0
ipython-genutils 0.2.0
ipython-sql 0.3.9
ipywidgets 7.5.1
isort 4.3.21
itsdangerous 1.1.0
jdcal 1.4.1
jedi 0.15.1
jeepney 0.4
Jinja2 2.10.1
jinja2-time 0.2.0
joblib 0.13.2
json5 0.8.5
jsonschema 3.0.2
jupyter 1.0.0
jupyter-aihub-deploy-extension 0.1
jupyter-client 5.3.1
jupyter-console 6.0.0
jupyter-contrib-core 0.3.3
jupyter-contrib-nbextensions 0.5.1
jupyter-core 4.5.0
jupyter-highlight-selected-word 0.2.0
jupyter-http-over-ws 0.0.6
jupyter-latex-envs 1.4.6
jupyter-nbextensions-configurator 0.4.1
jupyterlab 1.0.2
jupyterlab-git 0.8.1
jupyterlab-server 1.0.0
keyring 18.0.0
kiwisolver 1.1.0
kubernetes 10.0.1
lazy-object-proxy 1.4.1
libarchive-c 2.8
lief 0.9.0
llvmlite 0.29.0
locket 0.2.0
lxml 4.4.1
Markdown 3.1.1
MarkupSafe 1.1.1
matplotlib 3.1.0
mccabe 0.6.1
missingno 0.4.2
mistune 0.8.4
mkl-fft 1.0.14
mkl-random 1.0.2
mkl-service 2.0.2
mock 3.0.5
more-itertools 7.2.0
mpmath 1.1.0
msgpack 0.6.1
multipledispatch 0.6.0
murmurhash 1.0.2
nb-conda 2.2.1
nb-conda-kernels 2.2.2
nbconvert 5.6.0
nbdime 1.1.0
nbformat 4.4.0
nbpresent 3.0.2
networkx 2.3
nltk 3.4.4
nose 1.3.7
notebook 6.0.0
notebook-executor 0.1
numba 0.45.1
numexpr 2.7.0
numpy 1.16.4
numpydoc 0.9.1
oauth2client 4.1.3
oauthlib 3.1.0
olefile 0.46
openpyxl 2.6.2
packaging 19.1
pandas 0.25.0
pandas-profiling 2.3.0
pandocfilters 1.4.2
papermill 1.1.0
parso 0.5.1
partd 1.0.0
pathlib2 2.3.4
patsy 0.5.1
pep8 1.7.1
pexpect 4.7.0
phik 0.9.8
pickleshare 0.7.5
Pillow 6.1.0
Pillow-SIMD 6.0.0.post0
pip 19.1.1
plac 0.9.6
plotly 4.1.0
pluggy 0.12.0
ply 3.11
poyo 0.5.0
preshed 2.0.1
prettytable 0.7.2
prometheus-client 0.7.1
prompt-toolkit 2.0.9
protobuf 3.9.1
psutil 5.6.3
ptyprocess 0.6.0
py 1.8.0
pyarrow 0.14.1
pyasn1 0.4.6
pyasn1-modules 0.2.6
pycodestyle 2.5.0
pycosat 0.6.3
pycparser 2.19
pycrypto 2.6.1
pydot 1.4.1
pyflakes 2.1.1
Pygments 2.4.2
pylint 2.3.1
pyodbc 4.0.27
pyOpenSSL 19.0.0
pyparsing 2.4.2
pyrsistent 0.14.11
PySocks 1.7.0
pytest 5.0.1
pytest-arraydiff 0.3
pytest-astropy 0.5.0
pytest-doctestplus 0.3.0
pytest-openfiles 0.3.2
pytest-pylint 0.14.1
pytest-remotedata 0.3.2
python-dateutil 2.8.0
pytz 2019.2
PyWavelets 1.0.3
PyYAML 5.1.2
pyzmq 18.1.0
QtAwesome 0.5.7
qtconsole 4.5.3
QtPy 1.9.0
regex 2018.1.10
requests 2.22.0
requests-oauthlib 1.2.0
retrying 1.3.3
rope 0.14.0
rsa 4.0
ruamel-yaml 0.15.46
scikit-image 0.15.0
scikit-learn 0.21.2
scipy 1.3.1
seaborn 0.9.0
SecretStorage 3.1.1
Send2Trash 1.5.0
setuptools 41.0.1
simplegeneric 0.8.1
six 1.12.0
smmap2 2.0.5
snowballstemmer 1.9.0
sortedcollections 1.1.2
sortedcontainers 2.1.0
soupsieve 1.9.2
spacy 2.0.18
Sphinx 2.1.2
sphinxcontrib-applehelp 1.0.1
sphinxcontrib-devhelp 1.0.1
sphinxcontrib-htmlhelp 1.0.2
sphinxcontrib-jsmath 1.0.1
sphinxcontrib-qthelp 1.0.2
sphinxcontrib-serializinghtml 1.1.3
sphinxcontrib-websupport 1.1.2
spyder 3.3.6
spyder-kernels 0.5.1
SQLAlchemy 1.3.7
sqlparse 0.3.0
statsmodels 0.10.1
sympy 1.4
tables 3.5.2
tblib 1.4.0
tenacity 5.1.1
terminado 0.8.2
testpath 0.4.2
textwrap3 0.9.2
thinc 6.12.1
toolz 0.10.0
torch 1.2.0
torchvision 0.4.0a0+6b959ee
tornado 5.1.1
tqdm 4.34.0
traitlets 4.3.2
typed-ast 1.4.0
typing 3.6.4
ujson 1.35
unicodecsv 0.14.1
uritemplate 3.0.0
urllib3 1.24.2
virtualenv 16.7.3
wcwidth 0.1.7
webencodings 0.5.1
websocket-client 0.56.0
Werkzeug 0.15.5
wheel 0.33.4
whichcraft 0.6.0
widgetsnbextension 3.5.1
wrapt 1.10.11
wurlitzer 1.0.3
xlrd 1.2.0
XlsxWriter 1.1.8
xlwt 1.3.0
zict 1.0.0
zipp 0.5.2

Thanks again!

Hi go_go_gadget hope you are having a jolly day.

In my experience, the first thing you must do is make sure that all the libraries listed in the requirements.txt of your repository are set to the same versions as in your GCP !pip list.

So your requirements.txt should look similar to the one in post 273, but with the version numbers from your GCP environment.

I use Google Colab so I have similar issues.

Also, if you trained your model months ago, it may be worth training it again, as the libraries sometimes change but the version number doesn’t.

Let me know the outcome.
If you have done this and it’s not working, let me know the error.

Cheers mrfabulous1 :smiley::smiley:

1 Like

Hi @anurag, will you consider adding other payment options in the future?

Certainly, but we don’t have an estimate for when. Which option works best for you? PayPal?

Paypal would be great.:slightly_smiling_face:

1 Like

Yes, please add paypal support asap! thanks!

Hi JamesT, I have successfully deployed fastai code by following the instructions from “Deploying on Google App Engine”, but at the last step I get the error “502 Bad Gateway” from nginx. Do you know why? Thanks in advance!

1 Like

Hi, @mrfabulous1! Thanks again for your help.

Here’s my requirements.txt:


I’ve matched the values to those in my !pip list as above, but my list doesn’t contain entries for starlette, aiofiles, or aiohttp. Perhaps I need to install these?

I only trained this model on Thursday, so the libraries are likely the same.

Hi go_go_gadget hope you had a good weekend.

If you started with the current Teddy Bear repository on Github the latest requirements.txt is as follows.


I have amended it for your GCP versions of fastai and numpy.

I suggest you use the requirements.txt above, as yours doesn’t have values for starlette, python-multipart, aiofiles and aiohttp, and the asyncio entry is missing completely. You should not have to install any libraries yourself; that is done by Docker. Check against your GCP settings: if any libraries have higher versions, you only need to change those numbers.
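As a reference point, a requirements.txt for this kind of app typically pins something like the following; the version numbers here are illustrative assumptions, not the actual file, and should be replaced with the ones from your own pip list:

```text
fastai==1.0.57
numpy==1.16.4
starlette==0.12.0
uvicorn==0.7.1
python-multipart==0.0.5
aiofiles==0.4.0
aiohttp==3.5.4
asyncio==3.4.3
pillow==6.1.0
```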

Once again, please send a copy of any error you get, along with the requirements.txt you are using, when you reply. With so many inconsistencies in your requirements.txt at the moment, it’s very difficult to resolve the issue, so we must get this right.

Hope this helps.

mrfabulous1 :smiley::smiley:


Thank you, @mrfabulous1! I sincerely appreciate all of your help.

The app is rendering now! It’s still displaying the text for the teddy bear model, however, even though it’s correctly running my classifier (Picasso vs. Monet).

Here’s a screenshot:

I think I need to edit the code that displays the text, but I can’t tell from the file which part to edit (I’m sorry, I’m very inexperienced!).

Here’s the file:

from starlette.applications import Starlette
from starlette.responses import HTMLResponse, JSONResponse
from starlette.staticfiles import StaticFiles
from starlette.middleware.cors import CORSMiddleware
import uvicorn, aiohttp, asyncio
import sys
from io import BytesIO

from fastai import *
from import *

export_file_url = '...'  # URL of your export.pkl file
export_file_name = 'export.pkl'

classes = ['picasso', 'monet']
path = Path(__file__).parent

app = Starlette()
app.add_middleware(CORSMiddleware, allow_origins=['*'], allow_headers=['X-Requested-With', 'Content-Type'])
app.mount('/static', StaticFiles(directory='app/static'))

async def download_file(url, dest):
    if dest.exists(): return
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            data = await
            with open(dest, 'wb') as f: f.write(data)

async def setup_learner():
    await download_file(export_file_url, path / export_file_name)
        learn = load_learner(path, export_file_name)
        return learn
    except RuntimeError as e:
        if len(e.args) > 0 and 'CPU-only machine' in e.args[0]:
            message = "\n\nThis model was trained with an old version of fastai and will not work in a CPU environment.\n\nPlease update the fastai library in your training environment and export your model again.\n\nSee instructions for 'Returning to work' at"
            raise RuntimeError(message)

loop = asyncio.get_event_loop()
tasks = [asyncio.ensure_future(setup_learner())]
learn = loop.run_until_complete(asyncio.gather(*tasks))[0]

def index(request):
    html = path / 'view' / 'index.html'
    return HTMLResponse(html.open().read())

@app.route('/analyze', methods=['POST'])
async def analyze(request):
    data = await request.form()
    img_bytes = await (data['file'].read())
    img = open_image(BytesIO(img_bytes))
    prediction = learn.predict(img)[0]
    return JSONResponse({'result': str(prediction)})

if __name__ == '__main__':
    if 'serve' in sys.argv:, host='', port=5042)

Link to web app

Thanks again for your time!

1 Like

Hi go_go_gadget hope you are having a jolly day!

I am glad to hear that your model is now working!

To change the text on the HTML page, edit the index.html page in the view directory.
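For example, in the template the visible strings live in app/view/index.html; the elements below are a sketch from memory, so match them against your actual file:

```html
<!-- app/view/index.html: change these visible strings to describe your model -->
<title>Picasso vs Monet</title>
<h3>Which painter made this? Upload an image to find out.</h3>
```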


Have a wonderful evening.

mrfabulous1 :smiley::smiley:


Yay! It’s working! Thank you again, so very much!


1 Like

Hi go_go_gadget
You’re Welcome!
mrfabulous1 :smiley::smiley::smiley:

1 Like