Deployment Platform: Render ✅

Hi volcanoflash hope you are well!

  1. I have the following problem: when I deploy the teddy bears example, everything is fine.

Do you mean you have deployed the teddy repository and it works fine on Render?

  2. As soon as I change the export.pkl file and labels

Do you mean that you have trained a new model and created an export.pkl file on a different platform, like your desktop or a service provider such as Google Colab, and that you have put your export.pkl on a shared Google Drive?

  3. I get a net error in the log console

You get the error when you have done 1 and 2 and try to deploy the app.

If you have done exactly what I describe in steps 1, 2 and 3, a good start would be to look at the posts in this thread which contain ‘pip list’ and do as they say.

Cheers mrfabulous1 :smiley::smiley:

(That is, assuming your answer to both questions above is yes.)


Does anyone know what the URI is for torchvision 0.2.1? It’s the version I used to export my pkl on Paperspace, and I can’t seem to find it on the PyTorch website. I’m having the same issues with the teddy bear classifier repo and version numbers not matching up.

Hi korlandril hope all is well!

I haven’t found a specific URL for torchvision 0.2.1.

However, here are some things you can try:

  1. pip install torchvision==0.2.1

Or you can add the following line

  2. RUN pip install torchvision==0.2.1

after the requirements line in the file called “Dockerfile” located in the repository.

  3. As there are lots of dependencies, in cases where my model was trained months previously I have just had to retrain it on the current version of fastai and update all the associated libraries in my requirements.txt. In my case this is usually the version installed on Google Colab.

One of my apps uses the following requirements.txt. However, sometimes minor changes are made in one library which affect another library, and this is not always documented, so I have quite a few versions of requirements.txt.

numpy==1.16.1
torchvision==0.2.1
https://download.pytorch.org/whl/cpu/torch-1.0.1.post2-cp37-cp37m-linux_x86_64.whl
fastai==1.0.54
starlette==0.12.0
uvicorn==0.4.6
python-multipart==0.0.5
aiofiles==0.4.0
aiohttp==3.5.4
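
A quick sanity check I’d suggest before deploying (just a minimal sketch of my own, assuming fastai v1 and that your export.pkl sits in the current directory) is to install the pinned versions locally or in a notebook and confirm they can actually open your export.pkl, since most of the errors in this thread come down to a version mismatch between training and serving:

from pathlib import Path
from fastai.vision import load_learner  # fastai v1 API, the same call server.py uses

# Assumes export.pkl is in the current working directory; adjust the path as needed.
learn = load_learner(Path('.'), 'export.pkl')
print('export.pkl loaded OK with this fastai/torch combination')

If this fails with the versions you have pinned in requirements.txt, it will fail on Render too.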

Hope this helps, mrfabulous1 :smiley::smiley:


Hi, I’m getting the same error message as several people above:

Sep 12 09:04:31 AM  ERROR: Exception in ASGI application
Sep 12 09:04:31 AM  Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/uvicorn/protocols/http/httptools_impl.py", line 378, in run_asgi
    asgi = app(self.scope)
TypeError: __call__() missing 2 required positional arguments: 'receive' and 'send'
Sep 12 09:04:31 AM  INFO: ('10.104.25.59', 58920) - "GET /favicon.ico HTTP/1.1" 500

I’ve edited my requirements.txt, but when I run !pip list, the list does not contain entries for starlette, aiofiles, or aiohttp. I’ve tried leaving those blank, and as a shot in the dark also tried the values in the most recent post above.

Any insight as to whether their absence from my pip list is causing this error, or whether it might be something else, would be much appreciated!

Hi go_go_gadget hope all is well!

I’ve edited my requirements.txt, but when I run !pip list, the list does not contain entries for starlette, aiofiles, or aiohttp. I’ve tried leaving those blank, and as a shot in the dark also tried the values in the most recent post above.

What platform did you train your model on?

Any insight as to whether their absence from my pip list is causing this error, or whether it might be something else, would be much appreciated!

Can you show the !pip list of the platform you trained the model on?

Also, the server part of the app is written using the starlette library; without it, it is impossible for your app to work.

I believe aiofiles is required if you want to use FileResponse or StaticFiles, and aiohttp is an async HTTP client/server framework (built on asyncio).
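
If you want to quickly check whether those libraries are present in a given environment (just a minimal sketch of my own, not something from the repository), you can try importing them:

import importlib

# Check the serving dependencies that requirements.txt pins for the Render app.
for pkg in ('starlette', 'aiofiles', 'aiohttp', 'uvicorn'):
    try:
        mod = importlib.import_module(pkg)
        print(pkg, getattr(mod, '__version__', '(no __version__ attribute)'))
    except ImportError:
        print(pkg, 'is not installed in this environment')

They may well be absent from your training platform’s pip list simply because the training notebook never needed them; the Docker build on Render installs them from requirements.txt in any case.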

We will need to resolve these issues first; then it will either work, or we will resolve any further issues if there are any.

Cheers mrfabulous :smiley::smiley:

Thanks so much for the response, @mrfabulous1! I’m using GCP, and here’s my !pip list:

Package Version


alabaster 0.7.12
anaconda-client 1.7.2
anaconda-project 0.8.3
ansiwrap 0.8.4
arrow 0.14.5
asn1crypto 0.24.0
astroid 2.2.5
astropy 3.2.1
atomicwrites 1.3.0
attrs 19.1.0
Babel 2.7.0
backcall 0.1.0
backports.os 0.1.1
backports.shutil-get-terminal-size 1.0.0
bcolz 1.2.1
beautifulsoup4 4.8.0
binaryornot 0.4.4
bitarray 1.0.1
bkcharts 0.2
bleach 3.1.0
bokeh 1.3.4
boto 2.49.0
Bottleneck 1.2.1
cachetools 3.1.1
certifi 2019.6.16
cffi 1.12.3
chardet 3.0.4
Click 7.0
cloudpickle 1.2.1
clyent 1.2.2
colorama 0.4.1
conda 4.7.11
conda-package-handling 1.3.11
configparser 3.8.1
confuse 1.0.0
contextlib2 0.5.5
cookiecutter 1.6.0
cryptography 2.7
cycler 0.10.0
cymem 2.0.2
Cython 0.29.13
cytoolz 0.9.0.1
dask 2.3.0
dataclasses 0.6
datalab 1.1.5
ddt 1.2.1
decorator 4.4.0
defusedxml 0.6.0
dill 0.2.9
distributed 2.3.0
docker 4.0.2
docutils 0.15.2
entrypoints 0.3
enum34 1.1.6
et-xmlfile 1.0.1
fairing 0.5.3
fastai 1.0.57
fastcache 1.1.0
fastprogress 0.1.21
filelock 3.0.12
Flask 1.1.1
fsspec 0.4.0
future 0.17.1
gcsfs 0.3.0
gevent 1.4.0
gitdb2 2.0.5
GitPython 3.0.1
glob2 0.7
gmpy2 2.0.8
google-api-core 1.14.2
google-api-python-client 1.7.11
google-auth 1.6.3
google-auth-httplib2 0.0.3
google-auth-oauthlib 0.4.0
google-cloud-bigquery 1.18.0
google-cloud-core 1.0.3
google-cloud-dataproc 0.5.0
google-cloud-datastore 1.9.0
google-cloud-language 1.3.0
google-cloud-logging 1.12.1
google-cloud-monitoring 0.31.1
google-cloud-spanner 1.10.0
google-cloud-storage 1.18.0
google-cloud-translate 1.6.0
google-resumable-media 0.3.2
googleapis-common-protos 1.6.0
greenlet 0.4.15
grpc-google-iam-v1 0.12.3
grpcio 1.23.0
h5py 2.9.0
heapdict 1.0.0
html5lib 1.0.1
htmlmin 0.1.12
httplib2 0.13.1
idna 2.8
imageio 2.5.0
imagesize 1.1.0
importlib-metadata 0.19
ipykernel 5.1.2
ipython 7.7.0
ipython-genutils 0.2.0
ipython-sql 0.3.9
ipywidgets 7.5.1
isort 4.3.21
itsdangerous 1.1.0
jdcal 1.4.1
jedi 0.15.1
jeepney 0.4
Jinja2 2.10.1
jinja2-time 0.2.0
joblib 0.13.2
json5 0.8.5
jsonschema 3.0.2
jupyter 1.0.0
jupyter-aihub-deploy-extension 0.1
jupyter-client 5.3.1
jupyter-console 6.0.0
jupyter-contrib-core 0.3.3
jupyter-contrib-nbextensions 0.5.1
jupyter-core 4.5.0
jupyter-highlight-selected-word 0.2.0
jupyter-http-over-ws 0.0.6
jupyter-latex-envs 1.4.6
jupyter-nbextensions-configurator 0.4.1
jupyterlab 1.0.2
jupyterlab-git 0.8.1
jupyterlab-server 1.0.0
keyring 18.0.0
kiwisolver 1.1.0
kubernetes 10.0.1
lazy-object-proxy 1.4.1
libarchive-c 2.8
lief 0.9.0
llvmlite 0.29.0
locket 0.2.0
lxml 4.4.1
Markdown 3.1.1
MarkupSafe 1.1.1
matplotlib 3.1.0
mccabe 0.6.1
missingno 0.4.2
mistune 0.8.4
mkl-fft 1.0.14
mkl-random 1.0.2
mkl-service 2.0.2
mock 3.0.5
more-itertools 7.2.0
mpmath 1.1.0
msgpack 0.6.1
msgpack-numpy 0.4.3.2
multipledispatch 0.6.0
murmurhash 1.0.2
nb-conda 2.2.1
nb-conda-kernels 2.2.2
nbconvert 5.6.0
nbdime 1.1.0
nbformat 4.4.0
nbpresent 3.0.2
networkx 2.3
nltk 3.4.4
nose 1.3.7
notebook 6.0.0
notebook-executor 0.1
numba 0.45.1
numexpr 2.7.0
numpy 1.16.4
numpydoc 0.9.1
oauth2client 4.1.3
oauthlib 3.1.0
olefile 0.46
opencv-python 4.1.0.25
openpyxl 2.6.2
packaging 19.1
pandas 0.25.0
pandas-profiling 2.3.0
pandocfilters 1.4.2
papermill 1.1.0
parso 0.5.1
partd 1.0.0
path.py 12.0.1
pathlib2 2.3.4
patsy 0.5.1
pep8 1.7.1
pexpect 4.7.0
phik 0.9.8
pickleshare 0.7.5
Pillow 6.1.0
Pillow-SIMD 6.0.0.post0
pip 19.1.1
pkginfo 1.5.0.1
plac 0.9.6
plotly 4.1.0
pluggy 0.12.0
ply 3.11
poyo 0.5.0
preshed 2.0.1
prettytable 0.7.2
prometheus-client 0.7.1
prompt-toolkit 2.0.9
protobuf 3.9.1
psutil 5.6.3
ptyprocess 0.6.0
py 1.8.0
pyarrow 0.14.1
pyasn1 0.4.6
pyasn1-modules 0.2.6
pycodestyle 2.5.0
pycosat 0.6.3
pycparser 2.19
pycrypto 2.6.1
pycurl 7.43.0.3
pydot 1.4.1
pyflakes 2.1.1
Pygments 2.4.2
pylint 2.3.1
pyodbc 4.0.27
pyOpenSSL 19.0.0
pyparsing 2.4.2
pyrsistent 0.14.11
PySocks 1.7.0
pytest 5.0.1
pytest-arraydiff 0.3
pytest-astropy 0.5.0
pytest-doctestplus 0.3.0
pytest-openfiles 0.3.2
pytest-pylint 0.14.1
pytest-remotedata 0.3.2
python-dateutil 2.8.0
pytz 2019.2
PyWavelets 1.0.3
PyYAML 5.1.2
pyzmq 18.1.0
QtAwesome 0.5.7
qtconsole 4.5.3
QtPy 1.9.0
regex 2018.1.10
requests 2.22.0
requests-oauthlib 1.2.0
retrying 1.3.3
rope 0.14.0
rsa 4.0
ruamel-yaml 0.15.46
scikit-image 0.15.0
scikit-learn 0.21.2
scipy 1.3.1
seaborn 0.9.0
SecretStorage 3.1.1
Send2Trash 1.5.0
setuptools 41.0.1
simplegeneric 0.8.1
singledispatch 3.4.0.3
six 1.12.0
smmap2 2.0.5
snowballstemmer 1.9.0
sortedcollections 1.1.2
sortedcontainers 2.1.0
soupsieve 1.9.2
spacy 2.0.18
Sphinx 2.1.2
sphinxcontrib-applehelp 1.0.1
sphinxcontrib-devhelp 1.0.1
sphinxcontrib-htmlhelp 1.0.2
sphinxcontrib-jsmath 1.0.1
sphinxcontrib-qthelp 1.0.2
sphinxcontrib-serializinghtml 1.1.3
sphinxcontrib-websupport 1.1.2
spyder 3.3.6
spyder-kernels 0.5.1
SQLAlchemy 1.3.7
sqlparse 0.3.0
statsmodels 0.10.1
sympy 1.4
tables 3.5.2
tblib 1.4.0
tenacity 5.1.1
terminado 0.8.2
testpath 0.4.2
textwrap3 0.9.2
thinc 6.12.1
toolz 0.10.0
torch 1.2.0
torchvision 0.4.0a0+6b959ee
tornado 5.1.1
tqdm 4.34.0
traitlets 4.3.2
typed-ast 1.4.0
typing 3.6.4
ujson 1.35
unicodecsv 0.14.1
uritemplate 3.0.0
urllib3 1.24.2
virtualenv 16.7.3
wcwidth 0.1.7
webencodings 0.5.1
websocket-client 0.56.0
Werkzeug 0.15.5
wheel 0.33.4
whichcraft 0.6.0
widgetsnbextension 3.5.1
wrapt 1.10.11
wurlitzer 1.0.3
xlrd 1.2.0
XlsxWriter 1.1.8
xlwt 1.3.0
zict 1.0.0
zipp 0.5.2

Thanks again!

Hi go_go_gadget hope you are having a jolly day.

In my experience, the first thing you must do is make sure that all the libraries listed in the requirements.txt of your repository are set to the same versions as in your GCP !pip list.

So your requirements.txt should look similar to the one in post 273, but with the version numbers from your GCP environment.
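
If it helps, here is a minimal sketch of my own (assuming a fastai v1 notebook) that prints just the version pins requirements.txt cares about, so you can copy them across instead of reading the whole !pip list output:

import fastai, numpy, torch, torchvision

# Print pin-ready lines for the packages that matter most to the deployed app.
for name, mod in [('fastai', fastai), ('numpy', numpy),
                  ('torch', torch), ('torchvision', torchvision)]:
    print(f'{name}=={mod.__version__}')

(Note the torch and torchvision entries in the Render requirements.txt point at CPU wheel URLs rather than plain pins, so use the printed numbers to pick the matching wheels.)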

I use Google Colab so I have similar issues.

Also, if you trained your model months ago, it may be worth training it again, as the libraries sometimes change but the version number doesn’t.

Let me know the outcome.
If you have done this and it’s not working, let me know the error.

Cheers mrfabulous1 :smiley::smiley:


Hi @anurag, will you consider adding other payment options in the future?

Certainly, but we don’t have an estimate for when. Which option works best for you? PayPal?

Paypal would be great.:slightly_smiling_face:


Yes, please add paypal support asap! thanks!

Hi JamesT, I have successfully deployed fastai code by following the instructions from “Deploying on Google App Engine”, but at the last step I get the error “502 Bad Gateway” from “nginx”. Do you know why? Thanks in advance!


Hi, @mrfabulous1! Thanks again for your help.

Here’s my requirements.txt:

numpy==1.16.4
torchvision==0.4.0a0+6b959ee
https://download.pytorch.org/whl/cpu/torch-1.0.1.post2-cp37-cp37m-linux_x86_64.whl
fastai==1.0.57
starlette
uvicorn==0.3.32
python-multipart
aiofiles
aiohttp

I’ve matched the values to those in my !pip list as above, but my list doesn’t contain entries for starlette, aiofiles, or aiohttp. Perhaps I need to install these?

I only trained this model on Thursday, so the libraries are likely the same.

Hi go_go_gadget hope you had a good weekend.

If you started with the current Teddy Bear repository on GitHub (https://github.com/render-examples/fastai-v3/blob/master/requirements.txt), the latest requirements.txt is as follows:

aiofiles==0.4.0
aiohttp==3.5.4
asyncio==3.4.3
fastai==1.0.57
https://download.pytorch.org/whl/cpu/torch-1.1.0-cp37-cp37m-linux_x86_64.whl
https://download.pytorch.org/whl/cpu/torchvision-0.3.0-cp37-cp37m-linux_x86_64.whl
numpy==1.16.4
starlette==0.12.0
uvicorn==0.7.1
python-multipart==0.0.5

I have amended it for your GCP versions of fastai and numpy.

I suggest you use the requirements.txt above, as yours doesn’t have version numbers for starlette, python-multipart, aiofiles or aiohttp, and the asyncio entry is missing completely. You should not have to install any libraries yourself; this is done by Docker. Check against your GCP settings: if any libraries have higher versions there, you only need to change those numbers.

Once again, when you reply please send a copy of any error you get and the requirements.txt you are using on render.com. With so many inconsistencies in your requirements.txt at the moment it’s very difficult to resolve the issue, so we must get this right.

Hope this helps.

mrfabulous1 :smiley::smiley:


Thank you, @mrfabulous1! I sincerely appreciate all of your help.

The app is rendering now! It’s still displaying the text for the teddy bear model, though, even while it’s correctly running my classifier (Picasso vs. Monet).

Here’s a screenshot:

I think I need to edit the code that displays the text, but I can’t tell from the server.py file which part of the code to edit (I’m sorry, I’m very inexperienced!).

Here’s the server.py file:

from starlette.applications import Starlette
from starlette.responses import HTMLResponse, JSONResponse
from starlette.staticfiles import StaticFiles
from starlette.middleware.cors import CORSMiddleware
import uvicorn, aiohttp, asyncio
from io import BytesIO

from fastai import *
from fastai.vision import *

export_file_url = 'https://www.googleapis.com/drive/v3/files/1dDW2hBlmM7rqEjovUepNOHg0Z23s6WIg?alt=media&key=AIzaSyCreuiBOuN4ae5cvzlh8cIB9iY8tUeSMik'
export_file_name = 'export.pkl'

classes = ['picasso', 'monet']
path = Path(__file__).parent

app = Starlette()
app.add_middleware(CORSMiddleware, allow_origins=['*'], allow_headers=['X-Requested-With', 'Content-Type'])
app.mount('/static', StaticFiles(directory='app/static'))

async def download_file(url, dest):
    if dest.exists(): return
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            data = await response.read()
            with open(dest, 'wb') as f: f.write(data)

async def setup_learner():
    await download_file(export_file_url, path/export_file_name)
    try:
        learn = load_learner(path, export_file_name)
        return learn
    except RuntimeError as e:
        if len(e.args) > 0 and 'CPU-only machine' in e.args[0]:
            print(e)
            message = "\n\nThis model was trained with an old version of fastai and will not work in a CPU environment.\n\nPlease update the fastai library in your training environment and export your model again.\n\nSee instructions for 'Returning to work' at https://course.fast.ai."
            raise RuntimeError(message)
        else:
            raise

loop = asyncio.get_event_loop()
tasks = [asyncio.ensure_future(setup_learner())]
learn = loop.run_until_complete(asyncio.gather(*tasks))[0]
loop.close()

@app.route('/')
def index(request):
    html = path/'view'/'index.html'
    return HTMLResponse(html.open().read())

@app.route('/analyze', methods=['POST'])
async def analyze(request):
    data = await request.form()
    img_bytes = await (data['file'].read())
    img = open_image(BytesIO(img_bytes))
    prediction = learn.predict(img)[0]
    return JSONResponse({'result': str(prediction)})

if __name__ == '__main__':
    if 'serve' in sys.argv: uvicorn.run(app=app, host='0.0.0.0', port=5042)

Link to web app

Thanks again for your time!
g0g0gadget


Hi go_go_gadget hope you are having a jolly day!

I am glad to hear that your model is now working!

To change the text in the html page, edit the index.html page in the view directory.


Have a wonderful evening.

mrfabulous1 :smiley::smiley:


Yay! It’s working! Thank you again, so very much!

Sincerely,
g0g0gadget


Hi go_go_gadget
You’re Welcome!
mrfabulous1 :smiley::smiley::smiley:


Hi, I copied the requirements from my Google Colab environment, but I got the following trace:

  File "/usr/local/lib/python3.7/asyncio/base_events.py", line 579, in run_until_complete
    return future.result()
  File "app/server.py", line 35, in setup_learner
    learn = load_learner(path, export_file_name)
  File "/usr/local/lib/python3.7/site-packages/fastai/basic_train.py", line 628, in load_learner
    res.callbacks = [load_callback(c,s, res) for c,s in cb_state.items()]
  File "/usr/local/lib/python3.7/site-packages/fastai/basic_train.py", line 628, in <listcomp>
    res.callbacks = [load_callback(c,s, res) for c,s in cb_state.items()]
  File "/usr/local/lib/python3.7/site-packages/fastai/basic_train.py", line 612, in load_callback
    res = class_func(learn, **init_kwargs) if issubclass(class_func, LearnerCallback) else class_func(**init_kwargs)
  File "/usr/local/lib/python3.7/site-packages/fastai/basic_train.py", line 461, in __init__
    self.opt = self.learn.opt
AttributeError: 'Learner' object has no attribute 'opt'

Any clue? It seems to be related to the fastai version.

Hi!

I’m having the same problem as many other people here, where the classifier gets stuck in the “analysing” phase.

I have at least managed to change the original bear text to my own, so at least something is right :wink:

I have updated the requirements.txt in my forked repository with the versions from running “!pip list” in my Jupyter notebook on Paperspace. There I get the following:

Package Version


asn1crypto 0.24.0
attrs 18.2.0
backcall 0.1.0
beautifulsoup4 4.7.1
bleach 3.1.0
Bottleneck 1.2.1
certifi 2018.11.29
cffi 1.11.5
chardet 3.0.4
cryptography 2.3.1
cycler 0.10.0
cymem 2.0.2
cytoolz 0.9.0.1
dataclasses 0.6
decorator 4.3.0
dill 0.2.8.2
entrypoints 0.3
fastai 1.0.55
fastprogress 0.1.21
idna 2.8
ipykernel 5.1.0
ipython 7.2.0
ipython-genutils 0.2.0
ipywidgets 7.4.2
jedi 0.13.2
Jinja2 2.10
jsonschema 3.0.0a3
jupyter 1.0.0
jupyter-client 5.2.4
jupyter-console 6.0.0
jupyter-core 4.4.0
kiwisolver 1.0.1
MarkupSafe 1.1.0
matplotlib 3.0.2
mistune 0.8.4
mkl-fft 1.0.10
mkl-random 1.0.2
msgpack 0.5.6
msgpack-numpy 0.4.3.2
murmurhash 1.0.0
nb-conda 2.2.1
nb-conda-kernels 2.2.0
nbconvert 5.3.1
nbformat 4.4.0
notebook 5.7.4
numexpr 2.6.9
numpy 1.15.4
nvidia-ml-py3 7.352.0
olefile 0.46
packaging 19.0
pandas 0.23.4
pandocfilters 1.4.2
parso 0.3.1
pexpect 4.6.0
pickleshare 0.7.5
Pillow 5.4.1
pip 18.1
plac 0.9.6
preshed 2.0.1
prometheus-client 0.5.0
prompt-toolkit 2.0.7
ptyprocess 0.6.0
pycparser 2.19
Pygments 2.3.1
pyOpenSSL 18.0.0
pyparsing 2.3.1
pyrsistent 0.14.9
PySocks 1.6.8
python-dateutil 2.7.5
pytz 2018.9
PyYAML 3.13
pyzmq 17.1.2
qtconsole 4.4.3
regex 2018.1.10
requests 2.21.0
scipy 1.2.0
Send2Trash 1.5.0
setuptools 40.6.3
six 1.12.0
soupsieve 1.7.1
spacy 2.0.18
terminado 0.8.1
testpath 0.4.2
thinc 6.12.1
toolz 0.9.0
torch 1.0.0
torchvision 0.2.1
tornado 5.1.1
tqdm 4.29.1
traitlets 4.3.2
typing 3.6.4
ujson 1.35
urllib3 1.24.1
wcwidth 0.1.7
webencodings 0.5.1
wheel 0.32.3
widgetsnbextension 3.4.2
wrapt 1.10.11

My requirements.txt:
aiofiles==0.4.0
aiohttp==3.5.4
asyncio==3.4.3
fastai==1.0.55
https://download.pytorch.org/whl/cpu/torch-1.1.0-cp37-cp37m-linux_x86_64.whl
https://download.pytorch.org/whl/cpu/torchvision-0.3.0-cp37-cp37m-linux_x86_64.whl
numpy==1.15.4
starlette==0.12.0
uvicorn==0.7.1
python-multipart==0.0.5

When I upload an image in the classifier online, I get the following text in the “log” tab in Render:

Sep 22 04:38:26 PM INFO: ('10.104.55.126', 38382) - "POST /analyze HTTP/1.1" 500
Sep 22 04:38:26 PM ERROR: Exception in ASGI application
Sep 22 04:38:26 PM Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/uvicorn/protocols/http/httptools_impl.py", line 368, in run_asgi
    result = await app(self.scope, self.receive, self.send)
  File "/usr/local/lib/python3.7/site-packages/starlette/applications.py", line 133, in __call__
    await self.error_middleware(scope, receive, send)
  File "/usr/local/lib/python3.7/site-packages/starlette/middleware/errors.py", line 122, in __call__
    raise exc from None
  File "/usr/local/lib/python3.7/site-packages/starlette/middleware/errors.py", line 100, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.7/site-packages/starlette/middleware/cors.py", line 84, in __call__
    await self.simple_response(scope, receive, send, request_headers=headers)
  File "/usr/local/lib/python3.7/site-packages/starlette/middleware/cors.py", line 140, in simple_response
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.7/site-packages/starlette/exceptions.py", line 73, in __call__
    raise exc from None
  File "/usr/local/lib/python3.7/site-packages/starlette/exceptions.py", line 62, in __call__
    await self.app(scope, receive, sender)
  File "/usr/local/lib/python3.7/site-packages/starlette/routing.py", line 585, in __call__
    await route(scope, receive, send)
  File "/usr/local/lib/python3.7/site-packages/starlette/routing.py", line 207, in __call__
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.7/site-packages/starlette/routing.py", line 40, in app
    response = await func(request)
  File "app/server.py", line 63, in analyze
    prediction = learn.predict(img)[0]
  File "/usr/local/lib/python3.7/site-packages/fastai/basic_train.py", line 366, in predict
    res = self.pred_batch(batch=batch, with_dropout=with_dropout)
  File "/usr/local/lib/python3.7/site-packages/fastai/basic_train.py", line 345, in pred_batch
    if not with_dropout: preds = loss_batch(self.model.eval(), xb, yb, cb_handler=cb_handler)
  File "/usr/local/lib/python3.7/site-packages/fastai/basic_train.py", line 26, in loss_batch
    out = model(*xb)
  File "/usr/local/lib/python3.7/site-packages/torch/nn/modules/module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/torch/nn/modules/container.py", line 92, in forward
    input = module(input)
  File "/usr/local/lib/python3.7/site-packages/torch/nn/modules/module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/torch/nn/modules/container.py", line 92, in forward
    input = module(input)
  File "/usr/local/lib/python3.7/site-packages/torch/nn/modules/module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/torch/nn/modules/conv.py", line 331, in forward
    if self.padding_mode == 'circular':
  File "/usr/local/lib/python3.7/site-packages/torch/nn/modules/module.py", line 539, in __getattr__
    type(self).__name__, name))
AttributeError: 'Conv2d' object has no attribute 'padding_mode'

Would very much appreciate it if someone could help me out. Thanks in advance :slight_smile: