Free Web Deployment?

How are you presenting to the judges? I know I’ve deployed locally before (but that was an in-person, science-fair-like scenario). Otherwise, I believe Render has a free trial-based tier you can use :slight_smile:

You can find the megathread here:

With directions here:
https://course.fast.ai/deployment_render.html (a few other options are also included there)


Google Cloud Platform gives $300 credit when you sign up. You’d need to have a credit card handy and cancel before the credit is used up.

A different idea is to put it on Kaggle as a public notebook. You’d get plenty of fast.ai’ers upvoting the solution :slight_smile: The sharing side of what you’ve done would be a credit in itself. Maybe do both.

@muellerzr unfortunately Render needs $7/month for an instance of web deployment, which isn’t something that I like… Where is the free trial for Render?

James

You can host a single web app on pythonanywhere.com for free.
Good luck!


Hi jfang, the free trial on render.com has ended now.

I know some Brazilian users use heroku.com

Below are two links: one contains a few posts about heroku.com, and the other is a currently working heroku.com deployment using fastai.


https://capydetector.herokuapp.com/

I believe you have to do some tweaking, as you are only allowed a small amount of disk space (this sometimes means making sure your model is small enough that the rest of your app fits in the disk space provided).

Good luck mrfabulous1 :smiley::smiley:


Well, thanks for telling me about Heroku! How would deploying on Heroku work? How should I get started with Heroku, given that this is my first time deploying a web application?

EDIT: There are a few features that I want: the user should be able to upload one row of Pandas DataFrame data (which will be converted to an image for a CNN to classify), then possibly apply test-time augmentation (optional, since I may not want it to take too long to return results), and finally plot the Grad-CAM of the predictions on Heroku. If it’s too big, I could still cut it down further, or maybe try porting it to Kaggle/Google Colab for the task.
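As a rough illustration of the first step only (one DataFrame row in, a prediction out), here is a minimal sketch using fastai v1. The reshape dimensions, the export.pkl filename, and the row_to_image helper are assumptions for the example, not part of any existing app:

import numpy as np
import torch
from fastai.vision import load_learner, Image

learn = load_learner('.', 'export.pkl')  # exported fastai v1 learner (filename assumed)

def row_to_image(row, shape=(28, 28)):
    # Reshape one row of numeric values into a 2D array, scale to [0, 1],
    # and repeat it across 3 channels so a standard CNN can consume it.
    arr = np.asarray(row, dtype=np.float32).reshape(shape)
    arr = (arr - arr.min()) / (arr.max() - arr.min() + 1e-8)
    px = torch.from_numpy(arr).unsqueeze(0).repeat(3, 1, 1)
    return Image(px)

# pred_class, pred_idx, probs = learn.predict(row_to_image(df.iloc[0]))  # df is a hypothetical DataFrame

Test-time augmentation and Grad-CAM plotting would sit on top of this and add to the response time, so keeping them optional seems sensible.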

If you can’t get it running on Heroku, I can host it for you, given that not a lot of people will use it (to keep CPU usage low).

Hi jfang

In the links in my previous post, you will find some people have kindly posted their repositories of their solutions on github.com.
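Those repositories typically boil down to a small Flask app file, a requirements.txt, and a Procfile that tells Heroku how to start the web process. Judging from the gunicorn/app.py setup that shows up in the logs later in this thread, the Procfile is usually just a one-liner along these lines (assuming the Flask object inside app.py is named app):

web: gunicorn app:app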

With regards to it being your first time, I believe everybody on this site has had a first time deploying an app online.

Understanding CNNs and plotting Grad-CAM sounds way trickier than deploying on heroku.com.

I am sure you will get it working easily.

Cheers mrfabulous1 :smiley::smiley:

Hi James, there are a couple of free options for Azure if you have issues with Heroku. The first requires a credit card, but does not charge it; the card is just for ID (or for if you progress to a paid account): https://azure.microsoft.com/en-us/free. This comes with $200 of credit for a month, which should be fine for your needs, and then continues with some free services for a year. The no-credit-card option is Visual Studio Dev Essentials, https://visualstudio.microsoft.com/dev-essentials/, which gives you a free WebApp on Azure, although it has a 1-hour lifetime. If you can re-deploy your solution from git then this might work. You can also add the $200 credit option to Dev Essentials. I did try Heroku, but I think I had issues with the model size, and as I work for Microsoft the Azure route was easy for me: https://medium.com/@lunchwithalens/deploying-my-fastai-predictor-to-microsoft-azure-c7e635d464a1 Reach out if you need anything.
Best regards,
Brian


I just wanted to mention that AWS has a free tier that you can use pretty much indefinitely as long as traffic is low. One of the simplest options is launching a small EC2 instance to serve up your model.


Hey! I am trying to start web deployment on Heroku, and I have been encountering a lot of different problems, specifically with installing libraries on the remote server. I’ve been trying to get the water-classifier to work in my implementation on GitHub, and the GPU libraries are too big while the CPU libraries keep throwing errors. Here are some of the errors:

-----> Python app detected
-----> Installing python-3.6.10
-----> Installing pip
-----> Installing SQLite3
Sqlite3 successfully installed.
-----> Installing requirements with pip
Collecting Flask (from -r /tmp/build_f091f8565909c9a9f38076d1351650af/requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/9b/93/628509b8d5dc749656a9641f4caf13540e2cdec85276964ff8f43bbb1d3b/Flask-1.1.1-py2.py3-none-any.whl (94kB)
Collecting gunicorn (from -r /tmp/build_f091f8565909c9a9f38076d1351650af/requirements.txt (line 2))
Downloading https://files.pythonhosted.org/packages/69/ca/926f7cd3a2014b16870086b2d0fdc84a9e49473c68a8dff8b57f7c156f43/gunicorn-20.0.4-py2.py3-none-any.whl (77kB)
Collecting numpy (from -r /tmp/build_f091f8565909c9a9f38076d1351650af/requirements.txt (line 3))
Downloading https://files.pythonhosted.org/packages/62/20/4d43e141b5bc426ba38274933ef8e76e85c7adea2c321ecf9ebf7421cedf/numpy-1.18.1-cp36-cp36m-manylinux1_x86_64.whl (20.1MB)
Collecting torch==1.2.0+cpu (from -r /tmp/build_f091f8565909c9a9f38076d1351650af/requirements.txt (line 4))
Could not find a version that satisfies the requirement torch==1.2.0+cpu (from -r /tmp/build_f091f8565909c9a9f38076d1351650af/requirements.txt (line 4)) (from versions: 0.1.2, 0.1.2.post1, 0.1.2.post2, 0.3.1, 0.4.0, 0.4.1, 1.0.0, 1.0.1, 1.0.1.post2, 1.1.0, 1.2.0, 1.3.0, 1.3.1, 1.4.0)
No matching distribution found for torch==1.2.0+cpu (from -r /tmp/build_f091f8565909c9a9f38076d1351650af/requirements.txt (line 4))
! Push rejected, failed to compile Python app.
! Push failed

Here is my list of libraries in requirements.txt:

Flask
gunicorn
numpy
torch==1.2.0+cpu -f https://download.pytorch.org/whl/torch_stable.html
torchvision==0.4.0+cpu -f https://download.pytorch.org/whl/torch_stable.html
fastai

Can anyone help me out here? Thank you!

Hi jfang, hope all is well!

Could not find a version that satisfies the requirement torch==1.2.0+cpu

Have you tried without the -f option?
Also, many people test their app with Docker Desktop before deploying online, as it can be easier to fault-find locally than remotely.

Cheers mrfabulous1 :smiley: :smiley:

Unfortunately, leaving out -f also threw an error. The -f flag is part of the install instructions, as it points pip at a link on the PyTorch website :sob: It seems that nothing is working right now with the PyTorch installation: take the GPU version, it’s too big; take the CPU version, it throws errors.

Hi jfang, hope you’re having a jolly day (even if your app’s not working!!)

Could not find a version that satisfies the requirement torch==1.2.0+cpu (from -r /tmp/build_f091f8565909c9a9f38076d1351650af/requirements.txt

  1. What is the version of Python on your development platform? Is it the same as the version on Heroku.com?
    If not, this could be an issue, as different versions of libraries require different versions of Python.

  2. What specific platform did you train your model on? Is it a local PC or a cloud provider?
    This matters because the platform you are deploying to may have compatibility issues with the platform you trained on.

  3. Your model works but fails when you try to deploy it to heroku.com, so you may get better answers from heroku.com support.

  4. The problems you are experiencing have happened to virtually every person on this platform who has tried to deploy their first model. My first model took me four weeks to deploy and I had roughly 40 different problems; that’s why I initially chose render.com, because I couldn’t get my model working on other platforms. Docker Desktop was invented to avoid these issues, but Docker can be difficult to configure, as you have to solve all the issues you are experiencing now.

  5. I also suggest you run pip list or pip freeze on your development platform, record all the versions of the libraries you are using, and put those versions in your requirements.txt (see the sketch below); if not, when the current versions are changed/updated it could break your app once you finally have it deployed and working.
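For illustration only, a fully pinned requirements.txt might look something like this; the version numbers are simply the ones that appear in the build log later in this thread, so treat them as placeholders and use whatever pip freeze reports on your own machine:

Flask==1.1.1
gunicorn==20.0.4
numpy==1.18.1
fastai==1.0.60
# torch / torchvision pinned the same way, to versions that are known to work together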

Cheers mrfabulous1 :smiley: :smiley:

Okay, I’ve worked out the installation now (or I think so, because the packages downloaded without errors).

Here is my requirements.txt file:

Flask
gunicorn
numpy
https://download.pytorch.org/whl/cpu/torch-1.0.0-cp36-cp36m-linux_x86_64.whl
fastai

But now I run into new problems. When I tried to open the app, it gave me an error response. I am still trying to get the water classifier app working.

Here is my command line history. If anyone can help me again, that will be greatly appreciated.

(base) jfang@jfang-workstation:~$ heroku logs --app water-classification-exp
2020-02-14T17:50:15.560112+00:00 app[web.1]: File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
2020-02-14T17:50:15.560113+00:00 app[web.1]: File "<frozen importlib._bootstrap_external>", line 678, in exec_module
2020-02-14T17:50:15.560113+00:00 app[web.1]: File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
2020-02-14T17:50:15.560113+00:00 app[web.1]: File "/app/app.py", line 13, in <module>
2020-02-14T17:50:15.560114+00:00 app[web.1]: from fastai.vision import *
2020-02-14T17:50:15.560114+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/fastai/vision/__init__.py", line 3, in <module>
2020-02-14T17:50:15.560114+00:00 app[web.1]: from .learner import *
2020-02-14T17:50:15.560114+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/fastai/vision/learner.py", line 6, in <module>
2020-02-14T17:50:15.560115+00:00 app[web.1]: from . import models
2020-02-14T17:50:15.560115+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/fastai/vision/models/__init__.py", line 2, in <module>
2020-02-14T17:50:15.560115+00:00 app[web.1]: from torchvision.models import ResNet,resnet18,resnet34,resnet50,resnet101,resnet152
2020-02-14T17:50:15.560115+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/torchvision/__init__.py", line 3, in <module>
2020-02-14T17:50:15.560116+00:00 app[web.1]: from torchvision import models
2020-02-14T17:50:15.560116+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/torchvision/models/__init__.py", line 5, in <module>
2020-02-14T17:50:15.560116+00:00 app[web.1]: from .inception import *
2020-02-14T17:50:15.560116+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/torchvision/models/inception.py", line 8, in <module>
2020-02-14T17:50:15.560117+00:00 app[web.1]: from torch.jit.annotations import Optional
2020-02-14T17:50:15.560117+00:00 app[web.1]: ImportError: cannot import name 'Optional'
2020-02-14T17:50:15.560118+00:00 app[web.1]: [2020-02-14 17:50:15 +0000] [10] [INFO] Worker exiting (pid: 10)
2020-02-14T17:50:16.018758+00:00 heroku[router]: at=error code=H13 desc="Connection closed without response" method=GET path="/" host=water-classification-exp.herokuapp.com request_id=743451b5-517a-439e-8e36-c8bc01fbebc3 fwd="73.154.95.209" dyno=web.1 connect=1ms service=1609ms status=503 bytes=0 protocol=https
2020-02-14T17:50:16.018866+00:00 heroku[router]: at=error code=H13 desc="Connection closed without response" method=GET path="/" host=water-classification-exp.herokuapp.com request_id=0b4dbac4-4735-448d-81ae-5114c0bb832e fwd="73.154.95.209" dyno=web.1 connect=1ms service=2480ms status=503 bytes=0 protocol=https
2020-02-14T17:50:15.924321+00:00 app[web.1]: [2020-02-14 17:50:15 +0000] [11] [ERROR] Exception in worker process
2020-02-14T17:50:15.924329+00:00 app[web.1]: Traceback (most recent call last):
2020-02-14T17:50:15.924330+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/arbiter.py", line 583, in spawn_worker
2020-02-14T17:50:15.924330+00:00 app[web.1]: worker.init_process()
2020-02-14T17:50:15.924331+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/workers/base.py", line 119, in init_process
2020-02-14T17:50:15.924331+00:00 app[web.1]: self.load_wsgi()
2020-02-14T17:50:15.924334+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/workers/base.py", line 144, in load_wsgi
2020-02-14T17:50:15.924334+00:00 app[web.1]: self.wsgi = self.app.wsgi()
2020-02-14T17:50:15.924335+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/app/base.py", line 67, in wsgi
2020-02-14T17:50:15.924335+00:00 app[web.1]: self.callable = self.load()
2020-02-14T17:50:15.924336+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/app/wsgiapp.py", line 49, in load
2020-02-14T17:50:15.924336+00:00 app[web.1]: return self.load_wsgiapp()
2020-02-14T17:50:15.924336+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/app/wsgiapp.py", line 39, in load_wsgiapp
2020-02-14T17:50:15.924337+00:00 app[web.1]: return util.import_app(self.app_uri)
2020-02-14T17:50:15.924337+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/util.py", line 358, in import_app
2020-02-14T17:50:15.924338+00:00 app[web.1]: mod = importlib.import_module(module)
2020-02-14T17:50:15.924338+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/importlib/__init__.py", line 126, in import_module
2020-02-14T17:50:15.924338+00:00 app[web.1]: return _bootstrap._gcd_import(name[level:], package, level)
2020-02-14T17:50:15.924339+00:00 app[web.1]: File "<frozen importlib._bootstrap>", line 994, in _gcd_import
2020-02-14T17:50:15.924341+00:00 app[web.1]: File "<frozen importlib._bootstrap>", line 971, in _find_and_load
2020-02-14T17:50:15.924341+00:00 app[web.1]: File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
2020-02-14T17:50:15.924342+00:00 app[web.1]: File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
2020-02-14T17:50:15.924342+00:00 app[web.1]: File "<frozen importlib._bootstrap_external>", line 678, in exec_module
2020-02-14T17:50:15.924342+00:00 app[web.1]: File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
2020-02-14T17:50:15.924343+00:00 app[web.1]: File "/app/app.py", line 13, in <module>
2020-02-14T17:50:15.924343+00:00 app[web.1]: from fastai.vision import *
2020-02-14T17:50:15.924344+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/fastai/vision/__init__.py", line 3, in <module>
2020-02-14T17:50:15.924344+00:00 app[web.1]: from .learner import *
2020-02-14T17:50:15.924344+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/fastai/vision/learner.py", line 6, in <module>
2020-02-14T17:50:15.924345+00:00 app[web.1]: from . import models
2020-02-14T17:50:15.924345+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/fastai/vision/models/__init__.py", line 2, in <module>
2020-02-14T17:50:15.924345+00:00 app[web.1]: from torchvision.models import ResNet,resnet18,resnet34,resnet50,resnet101,resnet152
2020-02-14T17:50:15.924346+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/torchvision/__init__.py", line 3, in <module>
2020-02-14T17:50:15.924346+00:00 app[web.1]: from torchvision import models
2020-02-14T17:50:15.924347+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/torchvision/models/__init__.py", line 5, in <module>
2020-02-14T17:50:15.924347+00:00 app[web.1]: from .inception import *
2020-02-14T17:50:15.924348+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/torchvision/models/inception.py", line 8, in <module>
2020-02-14T17:50:15.924348+00:00 app[web.1]: from torch.jit.annotations import Optional
2020-02-14T17:50:15.924348+00:00 app[web.1]: ImportError: cannot import name 'Optional'
2020-02-14T17:50:15.924863+00:00 app[web.1]: [2020-02-14 17:50:15 +0000] [11] [INFO] Worker exiting (pid: 11)
2020-02-14T17:50:16.120527+00:00 app[web.1]: Traceback (most recent call last):
2020-02-14T17:50:16.120536+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/arbiter.py", line 209, in run
2020-02-14T17:50:16.121021+00:00 app[web.1]: self.sleep()
2020-02-14T17:50:16.121026+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/arbiter.py", line 357, in sleep
2020-02-14T17:50:16.121409+00:00 app[web.1]: ready = select.select([self.PIPE[0]], [], [], 1.0)
2020-02-14T17:50:16.121412+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/arbiter.py", line 242, in handle_chld
2020-02-14T17:50:16.121741+00:00 app[web.1]: self.reap_workers()
2020-02-14T17:50:16.121746+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/arbiter.py", line 525, in reap_workers
2020-02-14T17:50:16.122229+00:00 app[web.1]: raise HaltServer(reason, self.WORKER_BOOT_ERROR)
2020-02-14T17:50:16.122278+00:00 app[web.1]: gunicorn.errors.HaltServer: <HaltServer 'Worker failed to boot.' 3>
2020-02-14T17:50:16.122279+00:00 app[web.1]: 
2020-02-14T17:50:16.122279+00:00 app[web.1]: During handling of the above exception, another exception occurred:
2020-02-14T17:50:16.122279+00:00 app[web.1]: 
2020-02-14T17:50:16.122286+00:00 app[web.1]: Traceback (most recent call last):
2020-02-14T17:50:16.122306+00:00 app[web.1]: File "/app/.heroku/python/bin/gunicorn", line 11, in <module>
2020-02-14T17:50:16.122485+00:00 app[web.1]: sys.exit(run())
2020-02-14T17:50:16.122486+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/app/wsgiapp.py", line 58, in run
2020-02-14T17:50:16.122690+00:00 app[web.1]: WSGIApplication("%(prog)s [OPTIONS] [APP_MODULE]").run()
2020-02-14T17:50:16.122690+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/app/base.py", line 228, in run
2020-02-14T17:50:16.122999+00:00 app[web.1]: super().run()
2020-02-14T17:50:16.123000+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/app/base.py", line 72, in run
2020-02-14T17:50:16.123199+00:00 app[web.1]: Arbiter(self).run()
2020-02-14T17:50:16.123219+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/arbiter.py", line 229, in run
2020-02-14T17:50:16.123487+00:00 app[web.1]: self.halt(reason=inst.reason, exit_status=inst.exit_status)
2020-02-14T17:50:16.123488+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/arbiter.py", line 342, in halt
2020-02-14T17:50:16.123868+00:00 app[web.1]: self.stop()
2020-02-14T17:50:16.123869+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/arbiter.py", line 393, in stop
2020-02-14T17:50:16.124294+00:00 app[web.1]: time.sleep(0.1)
2020-02-14T17:50:16.124299+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/arbiter.py", line 242, in handle_chld
2020-02-14T17:50:16.124597+00:00 app[web.1]: self.reap_workers()
2020-02-14T17:50:16.124602+00:00 app[web.1]: File "/app/.heroku/python/lib/python3.6/site-packages/gunicorn/arbiter.py", line 525, in reap_workers
2020-02-14T17:50:16.125062+00:00 app[web.1]: raise HaltServer(reason, self.WORKER_BOOT_ERROR)
2020-02-14T17:50:16.125095+00:00 app[web.1]: gunicorn.errors.HaltServer: <HaltServer 'Worker failed to boot.' 3>
2020-02-14T17:50:16.225835+00:00 heroku[web.1]: State changed from up to crashed
2020-02-14T17:50:16.203119+00:00 heroku[web.1]: Process exited with status 1
2020-02-14T17:50:21.316644+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path="/favicon.ico" host=water-classification-exp.herokuapp.com request_id=d58bbadd-5790-4ca2-b7e9-299cfa3b0092 fwd="73.154.95.209" dyno=web.1 connect=5001ms service= status=503 bytes= protocol=https
2020-02-14T17:50:32.000000+00:00 app[api]: Build succeeded
2020-02-14T17:53:00.265375+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path="/" host=water-classification-exp.herokuapp.com request_id=e407c83d-5161-447b-b000-3ff22bcf81b5 fwd="73.154.95.209" dyno= connect= service= status=503 bytes= protocol=https
2020-02-14T17:53:00.575875+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path="/favicon.ico" host=water-classification-exp.herokuapp.com request_id=9ee7f427-17fb-4caf-99db-fbe272eff3df fwd="73.154.95.209" dyno= connect= service= status=503 bytes= protocol=https
(base) jfang@jfang-workstation:~$

Here was my Heroku Activity Feed (from installation of packages to deployment):

-----> Python app detected
-----> Need to update SQLite3, clearing cache
-----> Installing python-3.6.10
-----> Installing pip
-----> Installing SQLite3
Sqlite3 successfully installed.
-----> Installing requirements with pip
       Collecting Flask (from -r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 1))
         Downloading https://files.pythonhosted.org/packages/9b/93/628509b8d5dc749656a9641f4caf13540e2cdec85276964ff8f43bbb1d3b/Flask-1.1.1-py2.py3-none-any.whl (94kB)
       Collecting gunicorn (from -r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 2))
         Downloading https://files.pythonhosted.org/packages/69/ca/926f7cd3a2014b16870086b2d0fdc84a9e49473c68a8dff8b57f7c156f43/gunicorn-20.0.4-py2.py3-none-any.whl (77kB)
       Collecting numpy (from -r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 3))
         Downloading https://files.pythonhosted.org/packages/62/20/4d43e141b5bc426ba38274933ef8e76e85c7adea2c321ecf9ebf7421cedf/numpy-1.18.1-cp36-cp36m-manylinux1_x86_64.whl (20.1MB)
       Collecting torch==1.0.0 from https://download.pytorch.org/whl/cpu/torch-1.0.0-cp36-cp36m-linux_x86_64.whl (from -r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 4))
         Downloading https://download.pytorch.org/whl/cpu/torch-1.0.0-cp36-cp36m-linux_x86_64.whl (69.4MB)
       Collecting fastai (from -r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/f5/e4/a7025bf28f303dbda0f862c09a7f957476fa92c9271643b4061a81bb595f/fastai-1.0.60-py3-none-any.whl (237kB)
       Collecting Jinja2>=2.10.1 (from Flask->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 1))
         Downloading https://files.pythonhosted.org/packages/27/24/4f35961e5c669e96f6559760042a55b9bcfcdb82b9bdb3c8753dbe042e35/Jinja2-2.11.1-py2.py3-none-any.whl (126kB)
       Collecting Werkzeug>=0.15 (from Flask->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 1))
         Downloading https://files.pythonhosted.org/packages/ba/a5/d6f8a6e71f15364d35678a4ec8a0186f980b3bd2545f40ad51dd26a87fb1/Werkzeug-1.0.0-py2.py3-none-any.whl (298kB)
       Collecting click>=5.1 (from Flask->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 1))
         Downloading https://files.pythonhosted.org/packages/fa/37/45185cb5abbc30d7257104c434fe0b07e5a195a6847506c074527aa599ec/Click-7.0-py2.py3-none-any.whl (81kB)
       Collecting itsdangerous>=0.24 (from Flask->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 1))
         Downloading https://files.pythonhosted.org/packages/76/ae/44b03b253d6fade317f32c24d100b3b35c2239807046a4c953c7b89fa49e/itsdangerous-1.1.0-py2.py3-none-any.whl
       Collecting dataclasses; python_version < "3.7" (from fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/e1/d2/6f02df2616fd4016075f60157c7a0452b38d8f7938ae94343911e0fb0b09/dataclasses-0.7-py3-none-any.whl
       Collecting requests (from fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl (57kB)
       Collecting scipy (from fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/dc/29/162476fd44203116e7980cfbd9352eef9db37c49445d1fec35509022f6aa/scipy-1.4.1-cp36-cp36m-manylinux1_x86_64.whl (26.1MB)
       Collecting beautifulsoup4 (from fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/cb/a1/c698cf319e9cfed6b17376281bd0efc6bfc8465698f54170ef60a485ab5d/beautifulsoup4-4.8.2-py3-none-any.whl (106kB)
       Collecting matplotlib (from fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/7e/07/4b361d6d0f4e08942575f83a11d33f36897e1aae4279046606dd1808778a/matplotlib-3.1.3-cp36-cp36m-manylinux1_x86_64.whl (13.1MB)
       Collecting fastprogress>=0.2.1 (from fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/41/67/347d73405b8612e436a4278f577186a8b783fe757df549ba1a82a2986727/fastprogress-0.2.2-py3-none-any.whl
       Collecting Pillow (from fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/19/5e/23dcc0ce3cc2abe92efd3cd61d764bee6ccdf1b667a1fb566f45dc249953/Pillow-7.0.0-cp36-cp36m-manylinux1_x86_64.whl (2.1MB)
       Collecting bottleneck (from fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/62/d0/55bbb49f4fade3497de2399af70ec0a06e432c786b8623c878b11e90d456/Bottleneck-1.3.1.tar.gz (88kB)
       Collecting pyyaml (from fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/3d/d9/ea9816aea31beeadccd03f1f8b625ecf8f645bd66744484d162d84803ce5/PyYAML-5.3.tar.gz (268kB)
       Collecting torchvision (from fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/7e/90/6141bf41f5655c78e24f40f710fdd4f8a8aff6c8b7c6f0328240f649bdbe/torchvision-0.5.0-cp36-cp36m-manylinux1_x86_64.whl (4.0MB)
       Collecting nvidia-ml-py3 (from fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/6d/64/cce82bddb80c0b0f5c703bbdafa94bfb69a1c5ad7a79cff00b482468f0d3/nvidia-ml-py3-7.352.0.tar.gz
       Collecting numexpr (from fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/36/ed/eac5f6123f54a61cd13b7e89826b97edea54adf76d9f8e9fa2ce70e2fdf8/numexpr-2.7.1-cp36-cp36m-manylinux1_x86_64.whl (162kB)
       Collecting spacy>=2.0.18 (from fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/47/13/80ad28ef7a16e2a86d16d73e28588be5f1085afd3e85e4b9b912bd700e8a/spacy-2.2.3-cp36-cp36m-manylinux1_x86_64.whl (10.4MB)
       Collecting packaging (from fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/98/42/87c585dd3b113c775e65fd6b8d9d0a43abe1819c471d7af702d4e01e9b20/packaging-20.1-py2.py3-none-any.whl
       Collecting pandas (from fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/08/ec/b5dd8cfb078380fb5ae9325771146bccd4e8cad2d3e4c72c7433010684eb/pandas-1.0.1-cp36-cp36m-manylinux1_x86_64.whl (10.1MB)
       Collecting MarkupSafe>=0.23 (from Jinja2>=2.10.1->Flask->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 1))
         Downloading https://files.pythonhosted.org/packages/b2/5f/23e0023be6bb885d00ffbefad2942bc51a620328ee910f64abe5a8d18dd1/MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl
       Collecting idna<2.9,>=2.5 (from requests->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl (58kB)
       Collecting certifi>=2017.4.17 (from requests->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/b9/63/df50cac98ea0d5b006c55a399c3bf1db9da7b5a24de7890bc9cfd5dd9e99/certifi-2019.11.28-py2.py3-none-any.whl (156kB)
       Collecting chardet<3.1.0,>=3.0.2 (from requests->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/e8/74/6e4f91745020f967d09332bb2b8b9b10090957334692eb88ea4afe91b77f/urllib3-1.25.8-py2.py3-none-any.whl (125kB)
       Collecting soupsieve>=1.2 (from beautifulsoup4->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/81/94/03c0f04471fc245d08d0a99f7946ac228ca98da4fa75796c507f61e688c2/soupsieve-1.9.5-py2.py3-none-any.whl
       Collecting pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 (from matplotlib->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/5d/bc/1e58593167fade7b544bfe9502a26dc860940a79ab306e651e7f13be68c2/pyparsing-2.4.6-py2.py3-none-any.whl (67kB)
       Collecting kiwisolver>=1.0.1 (from matplotlib->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/f8/a1/5742b56282449b1c0968197f63eae486eca2c35dcd334bab75ad524e0de1/kiwisolver-1.1.0-cp36-cp36m-manylinux1_x86_64.whl (90kB)
       Collecting python-dateutil>=2.1 (from matplotlib->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/d4/70/d60450c3dd48ef87586924207ae8907090de0b306af2bce5d134d78615cb/python_dateutil-2.8.1-py2.py3-none-any.whl (227kB)
       Collecting cycler>=0.10 (from matplotlib->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/f7/d2/e07d3ebb2bd7af696440ce7e754c59dd546ffe1bbe732c8ab68b9c834e61/cycler-0.10.0-py2.py3-none-any.whl
       Collecting six (from torchvision->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/65/eb/1f97cb97bfc2390a276969c6fae16075da282f5058082d4cb10c6c5c1dba/six-1.14.0-py2.py3-none-any.whl
       Collecting thinc<7.4.0,>=7.3.0 (from spacy>=2.0.18->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/07/59/6bb553bc9a5f072d3cd479fc939fea0f6f682892f1f5cff98de5c9b615bb/thinc-7.3.1-cp36-cp36m-manylinux1_x86_64.whl (2.2MB)
       Collecting preshed<3.1.0,>=3.0.2 (from spacy>=2.0.18->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/db/6b/e07fad36913879757c90ba03d6fb7f406f7279e11dcefc105ee562de63ea/preshed-3.0.2-cp36-cp36m-manylinux1_x86_64.whl (119kB)
       Collecting blis<0.5.0,>=0.4.0 (from spacy>=2.0.18->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/41/19/f95c75562d18eb27219df3a3590b911e78d131b68466ad79fdf5847eaac4/blis-0.4.1-cp36-cp36m-manylinux1_x86_64.whl (3.7MB)
       Collecting murmurhash<1.1.0,>=0.28.0 (from spacy>=2.0.18->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/a6/e6/63f160a4fdf0e875d16b28f972083606d8d54f56cd30cb8929f9a1ee700e/murmurhash-1.0.2-cp36-cp36m-manylinux1_x86_64.whl
       Collecting catalogue<1.1.0,>=0.0.7 (from spacy>=2.0.18->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/6c/f9/9a5658e2f56932e41eb264941f9a2cb7f3ce41a80cb36b2af6ab78e2f8af/catalogue-1.0.0-py2.py3-none-any.whl
       Collecting cymem<2.1.0,>=2.0.2 (from spacy>=2.0.18->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/e7/b5/3e1714ebda8fd7c5859f9b216e381adc0a38b962f071568fd00d67e1b1ca/cymem-2.0.3-cp36-cp36m-manylinux1_x86_64.whl
       Collecting plac<1.2.0,>=0.9.6 (from spacy>=2.0.18->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/86/85/40b8f66c2dd8f4fd9f09d59b22720cffecf1331e788b8a0cab5bafb353d1/plac-1.1.3-py2.py3-none-any.whl
       Collecting srsly<1.1.0,>=0.1.0 (from spacy>=2.0.18->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/4f/96/3350d3fa0cfa2b2ff341113d60b5bfe0ab8dd0e6b6b2c8b12157b4eb3000/srsly-1.0.1-cp36-cp36m-manylinux1_x86_64.whl (185kB)
       Collecting wasabi<1.1.0,>=0.4.0 (from spacy>=2.0.18->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/21/e1/e4e7b754e6be3a79c400eb766fb34924a6d278c43bb828f94233e0124a21/wasabi-0.6.0-py3-none-any.whl
       Collecting pytz>=2017.2 (from pandas->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/e7/f9/f0b53f88060247251bf481fa6ea62cd0d25bf1b11a87888e53ce5b7c8ad2/pytz-2019.3-py2.py3-none-any.whl (509kB)
       Collecting tqdm<5.0.0,>=4.10.0 (from thinc<7.4.0,>=7.3.0->spacy>=2.0.18->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/cd/80/5bb262050dd2f30f8819626b7c92339708fe2ed7bd5554c8193b4487b367/tqdm-4.42.1-py2.py3-none-any.whl (59kB)
       Collecting importlib-metadata>=0.20; python_version < "3.8" (from catalogue<1.1.0,>=0.0.7->spacy>=2.0.18->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/8b/03/a00d504808808912751e64ccf414be53c29cad620e3de2421135fcae3025/importlib_metadata-1.5.0-py2.py3-none-any.whl
       Collecting zipp>=0.5 (from importlib-metadata>=0.20; python_version < "3.8"->catalogue<1.1.0,>=0.0.7->spacy>=2.0.18->fastai->-r /tmp/build_e40f577a44c4d3bdc774221cc050554c/requirements.txt (line 5))
         Downloading https://files.pythonhosted.org/packages/46/42/f2dd964b2a6b1921b08d661138148c1bcd3f038462a44019416f2342b618/zipp-2.2.0-py36-none-any.whl
       Installing collected packages: MarkupSafe, Jinja2, Werkzeug, click, itsdangerous, Flask, gunicorn, numpy, torch, dataclasses, idna, certifi, chardet, urllib3, requests, scipy, soupsieve, beautifulsoup4, pyparsing, kiwisolver, six, python-dateutil, cycler, matplotlib, fastprogress, Pillow, bottleneck, pyyaml, torchvision, nvidia-ml-py3, numexpr, wasabi, blis, murmurhash, plac, cymem, tqdm, srsly, preshed, thinc, zipp, importlib-metadata, catalogue, spacy, packaging, pytz, pandas, fastai
         Running setup.py install for bottleneck: started
           Running setup.py install for bottleneck: finished with status 'done'
         Running setup.py install for pyyaml: started
           Running setup.py install for pyyaml: finished with status 'done'
         Running setup.py install for nvidia-ml-py3: started
           Running setup.py install for nvidia-ml-py3: finished with status 'done'
       Successfully installed Flask-1.1.1 Jinja2-2.11.1 MarkupSafe-1.1.1 Pillow-7.0.0 Werkzeug-1.0.0 beautifulsoup4-4.8.2 blis-0.4.1 bottleneck-1.3.1 catalogue-1.0.0 certifi-2019.11.28 chardet-3.0.4 click-7.0 cycler-0.10.0 cymem-2.0.3 dataclasses-0.7 fastai-1.0.60 fastprogress-0.2.2 gunicorn-20.0.4 idna-2.8 importlib-metadata-1.5.0 itsdangerous-1.1.0 kiwisolver-1.1.0 matplotlib-3.1.3 murmurhash-1.0.2 numexpr-2.7.1 numpy-1.18.1 nvidia-ml-py3-7.352.0 packaging-20.1 pandas-1.0.1 plac-1.1.3 preshed-3.0.2 pyparsing-2.4.6 python-dateutil-2.8.1 pytz-2019.3 pyyaml-5.3 requests-2.22.0 scipy-1.4.1 six-1.14.0 soupsieve-1.9.5 spacy-2.2.3 srsly-1.0.1 thinc-7.3.1 torch-1.0.0 torchvision-0.5.0 tqdm-4.42.1 urllib3-1.25.8 wasabi-0.6.0 zipp-2.2.0
-----> Discovering process types
       Procfile declares types -> web
-----> Compressing...
       Done: 215.4M
-----> Launching...
       Released v4
       https://water-classification-exp.herokuapp.com/ deployed to Heroku

Okay, it looks as if I have resolved this last problem; now I face a new one. How can I load the model into fastai? It seems that the model is too big to be uploaded to GitHub. Is there a way for me to load a model remotely onto Heroku?

Thanks
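One common workaround, offered here only as a minimal sketch rather than a definitive answer: host the exported .pkl somewhere with a direct download link (Google Drive, Dropbox, etc.) and fetch it when the app starts, then load it with fastai v1’s load_learner. The MODEL_URL value below is a placeholder you would replace with your own link:

import urllib.request
from pathlib import Path
from fastai.vision import load_learner

MODEL_URL = 'https://example.com/export.pkl'  # placeholder: direct link to your exported model
MODEL_PATH = Path('models')
MODEL_FILE = 'export.pkl'

def get_learner():
    # Download the exported learner once, on first startup, then load it.
    MODEL_PATH.mkdir(parents=True, exist_ok=True)
    dest = MODEL_PATH / MODEL_FILE
    if not dest.exists():
        urllib.request.urlretrieve(MODEL_URL, str(dest))
    return load_learner(MODEL_PATH, MODEL_FILE)

learn = get_learner()

This keeps the large model file out of the git repository; only the small download-and-load code gets pushed to Heroku.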

But I have heard Heroku has sleeping issues. I don’t really know what that is. Do you have any issues with Heroku?

I am still trying to figure things out with Heroku; now I have a memory error, and I am considering porting my model to Google Colab to do the processing…

Hi jfang, glad to see that you are persevering, and I hope you are having a jolly day.

Sometimes, when I am having severe difficulties deploying an app and need to show it for a day or two to a few people I trust, I use ngrok.

https://www.softwaretestinghelp.com/ngrok-alternatives/

Ngrok allows me to deploy it temporarily while I solve the deployment issues.
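For reference, the typical usage (an assumption about the setup, not a prescription) is to run the app locally and then open a tunnel to it, e.g. ngrok http 5000 if the app is listening on port 5000; ngrok then prints a temporary public URL you can share.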

If you do get your app working, it would definitely be worth writing a blog post; I am sure many people would find it useful. Many people are funding their own AI learning without making a penny/cent, and even $10 a month is expensive. I believe many of the models in this forum fall into this case, as they disappear after a month or so.

Cheers mrfabulous1 :smiley: :smiley: