Platform: Docker (Free; non-beginner)

First of all, if you’re starting your journey in Deep Learning, make sure you use a platform that allows you to spend time experimenting and learning, not configuring machines, running install scripts, or assembling hardware.

That being said, if you want to use Docker as a platform, I’m sharing the container I am using and intend to keep up to date throughout the course. You can find it and the instructions here:

Fast.ai v2 Docker

Then all you need to do is clone the course repo and you are ready to go:

git clone https://github.com/fastai/course-v4

Could you create a quick tutorial and explanation? I am a huge fan of Docker, but I am still not confident setting it up. What kind of resources will it use? If we use a local VM on our PC, I guess it will still use our GPU, correct?

Hi Albertotono,

as I mentioned, if you are not a regular Docker user, you might be better off with Paperspace or Google Colab, which are probably easier to set up. The primary goal should be to focus on getting as much out of the course as possible. (There’s always time to learn Docker :slight_smile:)

That being said, the instructions that should get you up and running are in the description on the container page: https://hub.docker.com/u/zerotosingularity/fastai2

In brief (a rough sketch of the commands follows below):

  1. Install Docker
  2. Install nvidia-docker
  3. Start the container
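
As a very rough sketch of those three steps on Ubuntu (treat it as an outline rather than copy-paste; the exact repository setup for nvidia-docker is in its install docs, and the image tag is on the Docker Hub page):

# 1. Install Docker via the official convenience script
curl -fsSL https://get.docker.com | sh

# 2. Install the NVIDIA container runtime (after adding NVIDIA's apt repository per the nvidia-docker docs)
sudo apt-get install -y nvidia-container-toolkit
sudo systemctl restart docker

# 3. Start the container, exposing Jupyter on port 8888
docker run --gpus all -p 8888:8888 zerotosingularity/fastai2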

I haven’t created a special tutorial, as the instructions vary per platform, and you don’t need to do anything other than follow those.

Hope that helps, if not, I’m around :smiley:

Awesome,
So it seems that you need a good computer. As soon as I have a good laptop I will do that for sure. Thank you so much @zerotosingularity
Best Regards

Hi @zerotosingularity If Docker is used with my Windows 10 64-bit machine, will the GPU be accessible?

In general I would not suggest a laptop for deep learning. Desktops generally have better thermal properties, which allow the GPU to run at full tilt for hours on end.

Better to set up your Jupyter notebooks on the desktop and access them over SSH.
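
For example, with Jupyter running on the desktop, an SSH tunnel like this (hostname and port are placeholders) lets you open the notebooks in your laptop’s browser at http://localhost:8888:

ssh -N -L 8888:localhost:8888 you@your-desktop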

Hi Joseph,

I have looked at the FAQ from nvidia-docker, and Windows is not supported.

Searching the Docker forums, it seems that in general there is no GPU access from the container (not tested).

(The final comment on the last link seems to suggest otherwise, but the TensorFlow Docker documentation does not provide more details on using Windows.)

I have updated the container documentation accordingly.

@jcatanza I don’t believe so, at least not with a Linux container: https://docs.microsoft.com/en-us/virtualization/windowscontainers/deploy-containers/gpu-acceleration#run-a-container-with-gpu-acceleration

I spent some time looking at something similar, WSL, before realizing it also does not support the GPU with Linux.

Fastai 0.0.15 is now available:
https://hub.docker.com/repository/docker/zerotosingularity/fastai2
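
If you already have the image locally, a pull should pick up the new version (check the Docker Hub page for the exact tag):

docker pull zerotosingularity/fastai2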

@zerotosingularity - Awesome! Thank you for putting this together!

I think this could be put into the Docker image as a command on first boot, or it could pull the update on each boot via .bashrc or similar.

Also, is this image built from GitHub too? What is the link to the script code?

Missing xelatex

I can’t print to PDF. I get this error:

nbconvert failed: xelatex not found on PATH, if you have not installed xelatex you may need to do so. Find further instructions at https://nbconvert.readthedocs.io/en/latest/install.html#installing-tex

Missing Graphviz

Graphviz also crashes, asking whether it is installed. Running

`conda install python-graphviz`

manually fixes it, so this could be added as well. It is needed for running the gv code in the first notebook of the course.

Very welcome.

I try not to put data in the container, but mount it as a volume instead… The container “forgets” all changes once it is removed and recreated, so data is preferably kept out of it. My 2 cents :slight_smile:
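
For example, a data directory on the host can be mounted into the container like this (the host path and the mount point inside the image are placeholders):

docker run --gpus all -p 8888:8888 -v /path/to/data:/workspace/data zerotosingularity/fastai2

Anything written under the mounted path survives the container being removed; everything else is reset.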

The script is not public yet…

I’ll try to find some time to look at it and keep you posted.

sudo apt update
sudo apt install texlive-full

Seems to solve the xelatex error, but be warned that this oversolves the problem and weighs in at around 4 GB. A smaller package such as texlive-xetex would probably be enough, but for anyone following this thread, this will add PDF and TeX output support.
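
Untested on this image, but a lighter set of packages that should cover nbconvert’s xelatex requirement is roughly:

sudo apt update
sudo apt install texlive-xetex texlive-fonts-recommended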

Awesome! Thank you for looking into that. If it adds 4 GB to the image, I’m hesitant to add it by default. For those who really need to print, the install is pretty easy…

Thanks!

Thanks to zerotosingularity for creating this thread and uploading fastai2 Docker containers. I’ve tried his Docker container https://hub.docker.com/repository/docker/zerotosingularity/fastai2 and can confirm that it works for course-v4. His container starts up a Jupyter notebook whose root directory is a volume external to the container.

I’ve also been able to create a Docker container which runs course-v4. I’m new to Docker, so I approached the problem by writing a bunch of shell scripts. The scripts are a work in progress.

Goals:

  1. Scripts will allow easy installation of fastai2 onto an Ubuntu system.
  2. Scripts will enable quick and easy changes to configs (docker, conda, pip, fastai, my code) without messing up the base system.
  3. The docker container will have GPU access and similar performance to native.
  4. The docker container will have rwx access to my development volume allowing easy development from within and between docker containers.
  5. Any project work done within a container will be easy to backup.
  6. The docker container uses my userid and group for seamless non-root permissions access (see the sketch below).
  7. The scripts will be serially re-runnable and make minimal assumptions about state.
  8. Docker-related scripts will eventually be made into a Dockerfile.

The work-in-progress scripts are at https://github.com/BSalita/docker_ubuntu_fastai2
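
On goal 6, one common approach (a sketch only, not necessarily how the scripts above handle it) is to pass the host UID/GID to docker run so that files created in the mounted development volume keep your ownership:

docker run --gpus all --user "$(id -u):$(id -g)" -v "$HOME/dev:/workspace/dev" -p 8888:8888 zerotosingularity/fastai2

Depending on how the image is built, the non-root user may also need a matching passwd entry or a writable HOME inside the container.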

I just published my script to deploy fastai2 image classification models with Docker. Please let me know if there are things that can be improved :slight_smile:

I can’t seem to edit the original post, but there are official Docker images available now. You can find them here:

https://hub.docker.com/u/fastdotai

We’ll keep our images available so feel free to use them if you like:

https://hub.docker.com/u/seemeai

This is how I set up the course documents with Docker on Ubuntu using the official Docker images, with volumes so that my work is saved:

Note: you should already have NVIDIA-compatible Docker and the NVIDIA drivers installed, and git should be available.

  1. Create a directory in my home folder called fastainotebooks: mkdir fastainotebooks
  2. Change directory into the notebooks folder: cd fastainotebooks
  3. Use git clone to get the basic notebooks: git clone https://github.com/fastai/course-v4.git
  4. Use git clone to get the complete notebooks: git clone https://github.com/fastai/fastbook.git
  5. Run the container, changing the amount of shared memory (--shm-size) to suit your machine and replacing YOUR_USERNAME_HERE with your Linux username (type pwd if you are not sure of the path): docker run --gpus all --shm-size 24g --ulimit memlock=-1 -p 8888:8888 -v /home/YOUR_USERNAME_HERE/fastainotebooks:/workspace/coursenbs fastdotai/fastai ./run_jupyter.sh

You should now be able to access the course notebooks in your container, and any edits you make to them will be saved back to your volume. Note that you have to clone the notebooks with git, because the official containers contain the previous course’s notebooks; in any case, it’s good to have the notebooks on a local volume, so that any edits you make are saved.

You can also save the run command in a shell script: put #!/bin/bash on the first line, the docker run command on the second line, and save the file with an .sh extension. Then you can just run the script instead of typing out the entire docker command.
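
For example, something like this (the file name is arbitrary; the command is just step 5 from above):

#!/bin/bash
# run-course.sh -- start the fastai course container with the notebooks volume mounted
docker run --gpus all --shm-size 24g --ulimit memlock=-1 -p 8888:8888 \
  -v /home/YOUR_USERNAME_HERE/fastainotebooks:/workspace/coursenbs \
  fastdotai/fastai ./run_jupyter.sh

Make it executable with chmod +x run-course.sh and then start everything with ./run-course.sh.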

Hi @zerotosingularity, if I use Ubuntu on my laptop (I have an integrated Intel graphics card), does Docker give access to the GPU? Or do we need a dedicated GPU in the machine?