Is anyone using Heroku to deploy a fastai2 model?

You still have the .pkl in your repo, so it's still being copied into your slug. You need to remove it from there. Even after that, 924 MB sounds like a lot.
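If you need the model at runtime but want to keep it out of the slug, one option (a minimal sketch; the URL below is a placeholder for wherever you host your export.pkl) is to download it when the app starts:

# Sketch: fetch the exported learner at startup instead of shipping it in the repo/slug.
from pathlib import Path
from urllib.request import urlretrieve
from fastai.vision.all import load_learner  # with the older fastai2 package: from fastai2.vision.all import load_learner

MODEL_URL = "https://example.com/your/export.pkl"  # placeholder: a GitHub release, Dropbox link, etc.
MODEL_PATH = Path("export.pkl")

if not MODEL_PATH.exists():
    urlretrieve(MODEL_URL, MODEL_PATH)  # runs once per dyno start; nothing is stored in the repo

learn_inf = load_learner(MODEL_PATH, cpu=True)

The download adds a few seconds to boot time but keeps the model out of the slug entirely.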

Why is packaging in your requirements.txt? You don't seem to be using it.

I am now using this repo without the pkl file:

This is the content of requirements.txt:

voila
fastai2>=0.0.16
pillow>=7.1.0
packaging
ipywidgets==7.5.1

It looks like the slug still grows to 924 MB when it deploys.

Like I said, I don't think you need packaging in your requirements.txt. Have you tried removing it?

Are you specifying the CPU version of torch in your requirements.txt? You don't need the full torch package, just the CPU build.

Doh! That’ll be it.

So what exactly do you suggest putting in the requirements.txt?

I tested it with:
voila
fastai2>=0.0.16
pillow>=7.1.0
ipywidgets==7.5.1

and the result is again:
" Compiled slug size: 924.1M is too large (max is 500M)."

I also tested the following.

I changed this in requirements:
voila
fastai2>=0.0.16
pillow>=7.1.0
ipywidgets==7.5.1

into this:
https://download.pytorch.org/whl/cpu/torch-1.6.0%2Bcpu-cp38-cp38-linux_x86_64.whl
https://download.pytorch.org/whl/cpu/torchvision-0.7.0%2Bcpu-cp38-cp38-linux_x86_64.whl
fastai==2.0.11
voila
ipywidgets

and the error is the same as above: "Compiled slug size: 924.1M is too large (max is 500M)."

I don't want to start a new topic about this, so I'm hoping someone notices this. I have gone through all the troubleshooting I could find on this forum; my slug size is fine and my app is deployed, but it only gets as far as uploading the 128×128 image and never returns the prediction/probability value (even though it works in the Jupyter notebook on Paperspace).

I had a look at the build log and there's an error about the PyTorch wheels being incompatible, so I suspect that's the culprit (if not, then the noose beckons at this point :ghost:).

Would anyone be kind enough to tell me which are the latest wheels compatible with fastai 2.2.3? I found this list somewhere, but I couldn't make heads or tails of it.


So I solved the problem. To anyone who may be reading this in the future: have a look at the repos in this thread to see what you need, but remember that your requirements.txt needs PyTorch wheels compatible with:

  1. the version of fastai that you’re using.
  2. the version of python that you’re using.

The list can be found here. If everything works in your Jupyter notebook but not when you deploy your model, you can check the version of fastai you're using by running the following in your notebook:

import fastai
print(fastai.__version__)
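It can also help to print the torch, torchvision and Python versions in the same notebook (a quick sketch, assuming torch and torchvision are installed in that environment), since all three affect which wheel filenames you need:

import sys
import fastai, torch, torchvision
print(fastai.__version__)
print(torch.__version__)
print(torchvision.__version__)
print(sys.version)  # the cpXX tag in the wheel name must match this Python version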

So you want to include two wheels in your requirements.txt, both starting with:
https://download.pytorch.org/whl

Then from the list you want to choose the ones starting with cpu. First you need a wheel for torch. Choose a Linux one; there will be multiple Linux builds, and to pick a working one it should match the version of Python you're using. This is reflected in the 'cp' part of the filename: for example, cp38 refers to Python 3.8 (take this into account if you're including a runtime.txt in your repo).
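If you're unsure which cpXX tag matches the Python you're running, a one-liner (just a sketch) prints it:

import sys
print(f"cp{sys.version_info.major}{sys.version_info.minor}")  # e.g. cp38 for Python 3.8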

Once you make your choice, append it to the base URL above. For example, referring to one of the posts above mine, @enr would have chosen "cpu/torch-1.6.0%2Bcpu-cp38-cp38-linux_x86_64.whl".

You do the same thing for the second hyperlink, this time for torchvision rather than torch. I don't know which version of PyTorch is compatible with which version of fastai, but if your app fails because of this issue, the build logs should say that the wheel is 'not supported on this platform'. Errors in the application logs should also point you in the right direction (click on 'More' in the top right-hand corner and choose 'Logs' from the dropdown).
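Once the app boots, you can also sanity-check from a notebook cell that the CPU build was actually installed (a small sketch; CPU wheels normally report a version ending in +cpu):

import torch
print(torch.__version__)          # e.g. '1.6.0+cpu' for a CPU wheel
print(torch.cuda.is_available())  # should be False on the CPU build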

I don't know if I've got everything right; I've been able to piece it together thanks to everyone on this forum, particularly @joedockrill's and @ringoo's efforts.


This worked for me, thanks! I used Python 3.8.5 in runtime.txt and the following in requirements.txt:

https://download.pytorch.org/whl/cpu/torch-1.7.1%2Bcpu-cp38-cp38-linux_x86_64.whl
https://download.pytorch.org/whl/cpu/torchvision-0.8.2%2Bcpu-cp38-cp38-linux_x86_64.whl
fastai==2.2.5
voila
ipywidgets


Okay, the problem was with the torch versions I was using, as mlitchti wrote above. Thanks!!

How to Deploy Fast.ai Models? (Voilà, Binder and Heroku)

Medium article:
https://medium.com/unpackai/how-to-deploy-fast-ai-models-8704ea711ad2
code:

I hope this may help.

You're a lifesaver, thanks!! I was going crazy trying to figure out why it ran locally but not on Heroku :sweat_smile:

I'm getting errors for every version from the list of torch stable versions, like:

{specific_version} is not a supported wheel on this platform

Try putting this into your requirements.txt:

-f https://download.pytorch.org/whl/torch_stable.html
torch==1.8.1+cpu 
torchvision==0.9.1+cpu
fastai>=2.3.1
ipywidgets
voila

Hi everybody,
I've been following your advice to deploy my model on Heroku, especially yours @AGSL, and it seemed to work well, as the build output was:

Warning: Your slug size (389 MB) exceeds our soft limit (300 MB) which may affect boot time.
Released v5
https://bear-app69.herokuapp.com/ deployed to Heroku

Even with this warning message, it deployed fine.
But when I try to open my app, I get this message every time:

“There was an error when executing cell [2]. Please run Voilà with --show_tracebacks=True or --debug to see the error message, or configure VoilaConfiguration.show_tracebacks.”

and the log message is:

2021-06-08T12:47:12.796398+00:00 app[web.1]: [Voila] Writing notebook-signing key to /app/.local/share/jupyter/notebook_secret
2021-06-08T12:47:12.797089+00:00 app[web.1]: [Voila] WARNING | Notebook bear_classifier.ipynb is not trusted
2021-06-08T12:47:12.805861+00:00 app[web.1]: [Voila] WARNING | Could not find a kernel named ‘fastai’, will use ‘python3’
2021-06-08T12:47:13.065290+00:00 app[web.1]: [Voila] Kernel started: a9b94e88-cf4c-4b63-97bf-566db2ff199e
2021-06-08T12:47:18.551524+00:00 app[web.1]: [Voila] ERROR | Error at server while executing cell 2 (UnpicklingError: invalid load key, 'v'.)
2021-06-08T12:47:18.551536+00:00 app[web.1]: Traceback (most recent call last):
2021-06-08T12:47:18.551536+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.9/site-packages/voila/handler.py", line 209, in _jinja_cell_generator
2021-06-08T12:47:18.551537+00:00 app[web.1]: output_cell = await task
2021-06-08T12:47:18.551537+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.9/site-packages/voila/execute.py", line 69, in execute_cell
2021-06-08T12:47:18.551538+00:00 app[web.1]: result = await self.async_execute_cell(cell, cell_index, store_history)
2021-06-08T12:47:18.551538+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.9/site-packages/nbclient/client.py", line 857, in async_execute_cell
2021-06-08T12:47:18.551539+00:00 app[web.1]: self._check_raise_for_error(cell, exec_reply)
2021-06-08T12:47:18.551539+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.9/site-packages/nbclient/client.py", line 760, in _check_raise_for_error
2021-06-08T12:47:18.551539+00:00 app[web.1]: raise CellExecutionError.from_cell_and_msg(cell, exec_reply_content)
2021-06-08T12:47:18.551540+00:00 app[web.1]: nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
2021-06-08T12:47:18.551540+00:00 app[web.1]: ------------------
2021-06-08T12:47:18.551540+00:00 app[web.1]: path = Path()
2021-06-08T12:47:18.551541+00:00 app[web.1]: learn_inf = load_learner(path/'export.pkl', cpu=True)
2021-06-08T12:47:18.551541+00:00 app[web.1]: btn_upload = widgets.FileUpload()
2021-06-08T12:47:18.551541+00:00 app[web.1]: out_pl = widgets.Output()
2021-06-08T12:47:18.551541+00:00 app[web.1]: lbl_pred = widgets.Label()
2021-06-08T12:47:18.551542+00:00 app[web.1]: ------------------
2021-06-08T12:47:18.551543+00:00 app[web.1]: UnpicklingError                          Traceback (most recent call last)
2021-06-08T12:47:18.551544+00:00 app[web.1]: ----> 2 learn_inf = load_learner(path/'export.pkl', cpu=True)
2021-06-08T12:47:18.551546+00:00 app[web.1]: ~/.heroku/python/lib/python3.9/site-packages/fastai/learner.py in load_learner(fname, cpu, pickle_module)
2021-06-08T12:47:18.551553+00:00 app[web.1]: --> 381     res = torch.load(fname, map_location='cpu' if cpu else None, pickle_module=pickle_module)
2021-06-08T12:47:18.551562+00:00 app[web.1]: ~/.heroku/python/lib/python3.9/site-packages/torch/serialization.py in load(f, map_location, pickle_module, **pickle_load_args)
2021-06-08T12:47:18.551562+00:00 app[web.1]: --> 593     return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
2021-06-08T12:47:18.551563+00:00 app[web.1]: ~/.heroku/python/lib/python3.9/site-packages/torch/serialization.py in _legacy_load(f, map_location, pickle_module, **pickle_load_args)
2021-06-08T12:47:18.551564+00:00 app[web.1]: --> 762     magic_number = pickle_module.load(f, **pickle_load_args)
2021-06-08T12:47:18.551565+00:00 app[web.1]: UnpicklingError: invalid load key, 'v'.

I don't really get what is wrong, as the deployment went well. Could someone help me out? :slight_smile:

I finally managed to solve my problem. I created a repo from scratch on GitHub (my repo is now at: GitHub - badgiojuni/Bear-finder: I successfully deployed a web application from Lesson 2/3 of the fastai book on mybinder & heroku. It allows you to differentiate a bear from a teddy bear; you can find the app at: https://bear-app69.herokuapp.com/), used the requirements.txt from How to Deploy Fast.ai Models? (Voilà, Binder and Heroku) | by Aravinda Gayan | unpackAI | Medium, and followed the steps from Deploying your notebook as an app under 10 minutes to deploy it on Binder. It worked well. I think I did something wrong when I followed the git-related steps (the git track "*.pkl" command wasn't of any use to me; instead I just added the export.pkl file directly to the cloned repo and that worked).
It worked on Heroku as well after adding a Procfile, as explained in the Medium article above. That was the only thing I needed to add to make it work on Heroku once it was working on Binder. Thanks for the helpful indications on the forum.
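In case it helps anyone hitting the same UnpicklingError: one likely explanation for "invalid load key, 'v'" is that the repo ended up containing a Git LFS pointer file instead of the real export.pkl. A quick local check (just a sketch, run from the cloned repo) is to look at the first bytes of the committed file:

from pathlib import Path
# A real fastai export is binary; a Git LFS pointer is a tiny text file
# starting with "version https://git-lfs.github.com/spec/v1".
head = Path("export.pkl").read_bytes()[:60]
print(head)
if head.startswith(b"version https://git-lfs"):
    print("This is an LFS pointer, not the model - commit the real file instead.")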

There is a new problem with deploying the app: the new version 8.3.0 of Pillow breaks the predictions, so Heroku apps won't display any predictions. See RuntimeError: Could not infer dtype of PILImage - #2 by crissman.

I added the old version of Pillow to the requirements and the app now works on Heroku and Binder. This is what my requirements.txt looks like now:

https://download.pytorch.org/whl/cpu/torch-1.9.0%2Bcpu-cp38-cp38-linux_x86_64.whl
https://download.pytorch.org/whl/cpu/torchvision-0.10.0%2Bcpu-cp38-cp38-linux_x86_64.whl
fastai==2.4
voila==0.2.10
ipywidgets==7.5.1
pillow==8.2
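To confirm the pin actually took effect on the dyno, a one-line check in a notebook cell (just a sketch) prints the installed Pillow version:

import PIL
print(PIL.__version__)  # should report 8.2.x after pinning pillow==8.2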


Here is what worked for me deploying the bear classifier model on Heroku in 2022.

# requirements.txt
-f https://download.pytorch.org/whl/torch_stable.html
torch==1.11.0+cpu
torchvision==0.12.0+cpu
fastai==2.6.3
voila
ipywidgets

Why this works and not the above suggestions:

  • Mainly because fastai has been updated, and each fastai version specifies which torch and torchvision versions it is compatible with.

You can see this by deploying to Heroku with the following requirements.txt and watching which fastai and torch versions it downloads/requires:

# requirements.txt without specifying wheels for torch
fastai
ipywidgets
voila

For example, my fastai version is 2.6.3 and it requires:

Collecting torch<1.12,>=1.7.0
         Downloading torch-1.11.0-cp310-cp310-manylinux1_x86_64.whl (750.6 MB)


Collecting torchvision>=0.8.2
         Downloading torchvision-0.12.0-cp310-cp310-manylinux1_x86_64.whl (21.0 MB)

From this log, we can modify the requirements.txt accordingly (following the pattern above) so that the right versions are used:

-f https://download.pytorch.org/whl/torch_stable.html
torch==1.11.0+cpu
torchvision==0.12.0+cpu
fastai==2.6.3
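If you'd rather not wait for a Heroku build to see those pins, you can also query the installed fastai's own dependency metadata locally (a sketch using only the standard library, assuming fastai is installed in your local environment):

from importlib.metadata import requires  # Python 3.8+
# Print the torch/torchvision constraints declared by the installed fastai release
deps = requires("fastai") or []
print([d for d in deps if d.startswith("torch")])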

Hope this helps someone trying this deployment in the future

For simple deployments, Gradio + Hugging Face Spaces seems like a good combo. Jeremy has covered these recently in some of his 2022 Kaggle notebooks.
