Which should I use? We recommend Paperspace, since they've got everything customized and set up for this course. They have a good free option, and for $8/month you get plenty of extra storage; it's a totally standard Jupyter setup, so everything should "just work". Colab is also a good free option, especially if Paperspace free instances aren't available, but it's a bit more fiddly.
What about the Local options? For users with their own equipment, the Local* options can be used in place of the cloud-based options. These setups are meant for non-beginners who can quickly troubleshoot installation problems themselves. Remember, this course is about deep learning, not system configuration and compatibility issues.
@vijaysai I’d recommend avoiding it. Nothing has been tested on that. But if you’re an expert, then you might be able to get things working with plenty of debugging and fixing.
@jeremy re AWS/SageMaker – I saw there were CloudFormation scripts from last year to start a notebook instance in SageMaker. Will there be separate CF stacks to launch this year's Jupyter notebooks?
Thanks @jeremy, I can try last year's CF stack script. A couple of questions:
They used ml.p2.xlarge as the default instance type, which has an NVIDIA Tesla K80 GPU with 12GB of memory. Will that be good enough for this year's course?
They used conda instead of pip in the CF script to install the fastai library, like so: conda install -y fastai -c fastai. For this year's course, can we use conda install -y fastai2 -c fastai2?
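In case it helps, here is roughly what I was planning to drop into the CF script (just a sketch mirroring the command above; the fastai2 channel and package names are an assumption, so check the current course repo README first):

# Sketch only — channel/package names assumed from the question above.
conda install -y fastai2 -c fastai2
# pip alternative, assuming the package is also published on PyPI under the same name:
pip install fastai2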
If you are interested, I have set up CUDA/cuDNN-based builds of many of the JupyterHub images, which are very convenient setups, and I have added a PyTorch one as well (docker pull quay.io/utilitywarehouse/jupyter-pytorch-notebook if you want to use it). They can be used as easy bases for fastai Docker images.
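For example, a derived Dockerfile might look something like this (just a sketch, not part of the published image; the fastai2 package name is an assumption, so check the course repo for the current install line):

# Sketch: extend the jupyter-pytorch base above into a fastai image.
FROM quay.io/utilitywarehouse/jupyter-pytorch-notebook
# Install the course library on top of the PyTorch base.
RUN pip install fastai2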
Looking at the Docker site for your container, you recommend installing CUDA 10.2 first. Is this necessary, given that PyTorch bundles the CUDA runtime it needs? I am looking to run this container within UNRAID and would rather not install CUDA on the host if I don't have to.
My hope is that your container will recognize both of my GPUs. As of now, I can run fastai v2 within an Ubuntu VM with GPU passthrough for one card only. I would like to have both available on the off chance that some of the libraries can parallelize across them.
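In case it helps anyone trying the same thing, the rough plan I have in mind is the following (a sketch; it assumes Docker 19.03+ and the NVIDIA Container Toolkit are installed on the UNRAID host):

# Expose all GPUs on the host to the container.
docker run --gpus all -p 8888:8888 quay.io/utilitywarehouse/jupyter-pytorch-notebook
# Then, inside the container, check that PyTorch sees both cards:
python -c "import torch; print(torch.cuda.device_count())"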
Hello, I've had some issues while doing the setup: there seems to be something going on with cloning the repository in Paperspace Gradient. The fix was to clone with HTTPS instead of the SSH command.
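For reference, the change was just swapping the clone URL, roughly like this (using the fastbook repo as an example; substitute whichever course repo you need):

# SSH clone failed on my Gradient instance, so I switched to HTTPS:
# git clone git@github.com:fastai/fastbook.git   <- did not work for me
git clone https://github.com/fastai/fastbook.git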
To complete the first two notebooks, you’ll also have to install these:
RUN pip install nbdev
RUN pip install graphviz
RUN pip install azure
RUN pip install azure-cognitiveservices-vision-computervision
RUN pip install azure-cognitiveservices-search-websearch
RUN pip install azure-cognitiveservices-search-imagesearch
RUN pip install "ipywidgets>=7.5.1"
RUN pip install sentencepiece
RUN pip install scikit-learn
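If you prefer fewer image layers, the same installs can be combined into a single RUN (a sketch with the identical package list as above):

RUN pip install nbdev graphviz azure \
    azure-cognitiveservices-vision-computervision \
    azure-cognitiveservices-search-websearch \
    azure-cognitiveservices-search-imagesearch \
    "ipywidgets>=7.5.1" sentencepiece scikit-learn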