Lesson 3 - Official Topic

Actually I think if you use the experimental instructions here you can get it to work:

(see here)

So Voilà runs locally, it seems?

It's in the fastai2 repo.

Can you host Voilà on GitHub?

Streamlit is also really cool for getting things done fast.

How can we make Voilà work on Paperspace?
As in, it won't really work just by changing the URL of the Paperspace-hosted notebook.

We will see the deployment in a minute.

Anything about security issues when using ipywidgets / Voilà in production?

I have, and can testify that it's easy and quick. This article offers a walkthrough for a simple dashboard hosted on Heroku: https://towardsdatascience.com/quickly-build-and-deploy-an-application-with-streamlit-988ca08c7e83
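For reference, the Heroku recipe in walkthroughs like that one boils down to two small files next to your app. This is a sketch, not verified against current Streamlit: `app.py` is a hypothetical name for your dashboard script, and the `headless`/`port`/`enableCORS` keys come from Streamlit's server config of that era.

```
# Procfile
web: sh setup.sh && streamlit run app.py

# setup.sh -- generate a Streamlit config that binds to the port Heroku assigns
mkdir -p ~/.streamlit
cat > ~/.streamlit/config.toml <<EOF
[server]
headless = true
port = $PORT
enableCORS = false
EOF
```

Heroku sets `$PORT` when the dyno starts, which is why the config has to be generated at boot rather than committed to the repo.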

Sure you could, but how many images do you think it would take to capture “not-bearness”? Interesting experiment to try. My bet would be that it would be a deal-breakingly large number…

You will have to set up Colab differently.

This is awesome. I also didn't know about the connection between vue.js and Jupyter.

Definitely. Streamlit, so far, is pretty solid.

If I am having difficulty running one of the notebooks, should I start a new post or post to an existing thread? I could run fastai on this machine last year, but now it says the GPU doesn't have enough memory for 01_intro. Basically just Firefox and the container are running; nvidia-smi shows essentially nothing else using the GPU.

Running in an LXD container with GPU passthrough to a GTX 1080, on an Ubuntu 18.04 LTS host.

Thanks

Yep, I am not sure either…

Please use a new thread. If you run into GPU memory issues, reduce your batch size as necessary.
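In fastai you can pass a smaller `bs` when building your DataLoaders; more generally, the pattern is to halve the batch size until a training step fits in memory. A minimal pure-Python sketch of that pattern, where `fake_step` is a hypothetical stand-in for a real training step (PyTorch raises a RuntimeError mentioning "out of memory" on CUDA OOM):

```python
def find_max_batch_size(train_step, bs=64):
    """Halve bs until train_step(bs) stops raising an
    out-of-memory RuntimeError, then return that bs."""
    while bs >= 1:
        try:
            train_step(bs)
            return bs
        except RuntimeError as e:
            # Re-raise anything that is not an OOM error.
            if "out of memory" not in str(e):
                raise
            bs //= 2
    raise RuntimeError("could not fit even batch size 1")

# Hypothetical training step that only fits in memory for bs <= 16:
def fake_step(bs):
    if bs > 16:
        raise RuntimeError("CUDA out of memory")

print(find_max_batch_size(fake_step))  # → 16
```

With a real model you would swap `fake_step` for one forward/backward pass, and (in PyTorch) empty the CUDA cache between attempts.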

Additional note on CPU vs. GPU for deployment:

You might want to build and deploy a desktop application in an enterprise environment. Believe it or not, that's a big use case, especially for my customers. 🙂

Dilbert and friends don't have GPUs on their PCs.

If you do need to deploy directly on a phone, you can use PyTorch Mobile as well.

Yes, this is not easy, and it's a drawback of classification in DL today. Hence my question.
