First of all, if you’re starting your journey in Deep Learning, make sure you use a platform that lets you spend your time experimenting and learning, not configuring machines, running install scripts, or assembling hardware.
That being said, if you want to use Docker as a platform, I’m sharing the container I am using and intend to keep up to date throughout the course. You can find it and the instructions here:
Could you create a quick tutorial and explanation? I’m a huge fan of Docker, but I’m still not confident setting it up. What kind of resources will it use? If we use a local VM on our PC, I guess it will still use our GPU, correct?
As I mentioned, if you are not a regular Docker user, you might be better off with Paperspace or Google Colab, which are probably easier to set up. The prime goal should be to focus on getting as much out of the course as possible. (There’s always time to learn Docker.)
Awesome,
So it seems that you need a good computer. As soon as I have a good laptop I will do that for sure. Thank you so much @zerotosingularity
Best Regards
In general I would not suggest a laptop for deep learning. Desktops generally have better thermal properties, which allow the GPU to run at full tilt for hours on end.
Better to set up your Jupyter notebooks to be accessed over SSH.
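If you go that route, a common pattern is to forward Jupyter’s port over an SSH tunnel. A minimal sketch, assuming Jupyter listens on port 8888 on the desktop and `deepbox` is a placeholder for your machine’s hostname or IP:

```shell
# Forward local port 8888 to the remote machine's Jupyter port.
# "you" and "deepbox" are placeholders for your own user and host.
ssh -N -L 8888:localhost:8888 you@deepbox
```

Then open `http://localhost:8888` in your local browser and paste the token that `jupyter notebook` printed on the remote machine.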
(The final comment on the last link seems to suggest otherwise, but the TensorFlow Docker documentation does not provide more details on using Windows.)
I have updated the container documentation accordingly.
I get an `nbconvert failed: xelatex not found on PATH, if you have not installed xelatex you may need to do so. Find further instructions at https://nbconvert.readthedocs.io/en/latest/install.html#installing-tex.` error.
Missing Graphviz
Graphviz also crashes, complaining that it is not installed. Running

`conda install python-graphviz`

manually fixes it, so this could be added as well. It is needed for running the `gv` code in the first notebook of the course.
I try not to put data in the container, but load it as a volume instead… The container “forgets” all changes once it restarts, so data is preferably kept out. My 2 cents.
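As a sketch of that pattern (the host path and image are placeholders; adjust for your own setup), mounting a host directory as a volume keeps the data outside the container:

```shell
# Mount ~/data from the host at /workspace/data inside the container.
# Anything written under /workspace/data survives container restarts;
# anything written elsewhere in the container's filesystem does not.
docker run -it -v "$HOME/data:/workspace/data" ubuntu:20.04 bash
```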
Seems to solve the xelatex error, but be warned: this oversolves the problem and adds around 4 GB. Something more targeted, such as a texlive-xelatex package, would probably also work, but for anyone following this thread, this will add PDF and TeX output support.
Awesome! Thank you for looking into that. If it’s adding 4 GB to the image, I’m hesitant to add it by default. For those who really need to print, the install is pretty easy…
Thanks to zerotosingularity for creating this thread and uploading fastai2 docker containers. I’ve tried his docker container https://hub.docker.com/repository/docker/zerotosingularity/fastai2 and can confirm that it’s working for course-v4. His docker container starts up jupyter notebook whose root directory is some volume external to the docker container.
I’ve also been able to create a docker container which runs course-v4. I’m new to docker so approached the problem by writing a bunch of shell scripts. The scripts are a work-in-progress.
Goals:
- The scripts will allow easy installation of fastai2 onto an Ubuntu system.
- The scripts will enable quick and easy changes to configs (Docker, conda, pip, fastai, my code) without messing up the base system.
- The Docker container will have GPU access and performance similar to native.
- The Docker container will have rwx access to my development volume, allowing easy development from within and between containers.
- Any project work done within a container will be easy to back up.
- The Docker container uses my user ID and group for seamless non-root permissions.
- The scripts will be serially re-runnable and make minimal assumptions about state.
- The Docker-related scripts will eventually be turned into a Dockerfile.
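As a minimal sketch of what that Dockerfile might eventually look like (the base image tag, build args, and user names below are my assumptions, not the actual scripts):

```dockerfile
# Sketch only: base image and paths are placeholders.
FROM nvidia/cuda:11.8.0-cudnn8-runtime-ubuntu22.04

# Match the host user so files created in mounted volumes
# are owned by you, not by root.
ARG UID=1000
ARG GID=1000
RUN groupadd -g $GID dev && useradd -m -u $UID -g $GID dev
USER dev
WORKDIR /home/dev
```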
Run the container, changing the amount of shared memory (in GB) you want to give it, and replacing YOUR_USERNAME_HERE with your Linux username (type `pwd` if you are not sure):

```
docker run --gpus all --shm-size 24g --ulimit memlock=-1 -p 8888:8888 -v /home/YOUR_USERNAME_HERE/fastainotebooks:/workspace/coursenbs fastdotai/fastai ./run_jupyter.sh
```
You should now be able to access the course notebooks in your container, and any edits you make to the notebooks will be saved back to your volume. Note that you have to clone the course notebooks from git, because the official containers ship with the previous courses’ notebooks. It’s good to have the notebooks on a local volume anyway, so that any edits you make are saved.
You can also save the run command in a shell script: put `#!/bin/bash` on the first line, the docker command on the second, and save the file with a `.sh` extension. Then you can just run the script instead of typing out the entire docker command.
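For example, a sketch of such a wrapper (using the same placeholder volume path; the script name is arbitrary):

```shell
# Write the wrapper script; the docker command is the one from above.
cat > run-fastai.sh <<'EOF'
#!/bin/bash
docker run --gpus all --shm-size 24g --ulimit memlock=-1 \
  -p 8888:8888 \
  -v /home/YOUR_USERNAME_HERE/fastainotebooks:/workspace/coursenbs \
  fastdotai/fastai ./run_jupyter.sh
EOF

# Make it executable; from now on, just run ./run-fastai.sh
chmod +x run-fastai.sh
```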
Hi @zerotosingularity, if I use Ubuntu on my laptop (I have an integrated Intel graphics card), does Docker give access to the GPU? Or do we need to have a dedicated GPU on the machine?