Paperspace has a Dockerfile for fastai at https://github.com/Paperspace/fastai-docker. If you have nvidia-docker, you can get started immediately with:

```shell
sudo docker run --runtime=nvidia -d -p 8888:8888 paperspace/fastai:cuda9_pytorch0.3.0
```
Any code you edit in the container, however, lives in the container’s file system and will disappear if you delete the container without copying it out first. If you’d like to keep your code and data on your host file system, you can follow my setup. You can also edit Paperspace’s Dockerfile accordingly to allow for this.
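The usual way to keep code and data on the host is a bind mount. A minimal sketch (the host paths `~/fastai` and `~/data` and the container-side paths are placeholders, not the image’s actual layout):

```shell
# Bind-mount host directories into the container so notebooks and data
# survive deleting the container. Paths here are examples; adjust to
# wherever the image expects fastai's code and data to live.
sudo docker run --runtime=nvidia -d -p 8888:8888 \
    -v ~/fastai:/fastai \
    -v ~/data:/fastai/data \
    paperspace/fastai:cuda9_pytorch0.3.0
```

With this, deleting the container loses nothing: the mounted directories stay on the host.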
I haven’t experienced any perceivable issues, but I haven’t investigated Docker’s interaction with CUDA in any detail. I’m using NVIDIA’s CUDA 9.0 image as a base image, which might help. @jeremy Could you elaborate on these issues? Maybe I’m suffering performance losses from Docker without realizing it.
The beginning of the README:
2018-01-12: The Docker image works with the lesson1 notebook. It’s untested for other notebooks.
To not let dependencies slow you or anyone else down.
- If you make a mistake while sorting out a mess of dependencies, you can just delete the Docker container and start from a fresh one, instead of trying to undo the mistake on your operating system.
- Once you sort out a mess of dependencies, you’ll never have to do it again. Even if you install a new operating system or move to a new computer, you can quickly recover your environment by downloading the corresponding Docker image or Dockerfile.
- Also, no one else will have to go through that mess, because you can send them the Docker image or Dockerfile.
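For instance, a reproducible environment can be captured in a short Dockerfile. A sketch only (the package list is an assumption, not Paperspace’s actual file):

```dockerfile
# Illustrative only -- NOT Paperspace's actual Dockerfile.
# Base image with CUDA 9.0 and cuDNN already installed.
FROM nvidia/cuda:9.0-cudnn7-runtime-ubuntu16.04

# System packages (illustrative; the real list may differ).
RUN apt-get update && apt-get install -y git wget bzip2

# Install your environment here -- e.g. conda plus fastai's environment
# file, pinned to the versions you sorted out once.

EXPOSE 8888
```

Anyone with this file can rebuild the same environment with `docker build`, no dependency archaeology required.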
For more information, check out this Docker tutorial for data science.
- You have a machine with an NVIDIA GPU in it.
- Your machine is running Ubuntu.
  - nvidia-docker only works on Linux, unfortunately.
- You’ve installed an NVIDIA driver.
- You’ve installed docker.
- You’ve installed nvidia-docker.
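You can sanity-check each prerequisite from a terminal. These commands need a GPU machine to succeed, and the CUDA 9.0 base image used in the last one is just a convenient smoke test:

```shell
# Driver installed and GPU visible?
nvidia-smi

# Docker installed?
docker --version

# nvidia-docker wired up? This should print the same nvidia-smi table
# from inside a CUDA container.
sudo docker run --runtime=nvidia --rm nvidia/cuda:9.0-base nvidia-smi
```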
How to use the Docker image
If you’ve followed the setup:
1. Type `fastai` into a terminal.
2. Type `j8` into the container’s terminal that popped up as a result.
A Jupyter server will now be running in a fastai environment with all of fastai’s dependencies.
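For reference, `fastai` and `j8` are just shortcuts from my setup. A hedged sketch of what such shell functions might look like (the names are from the steps above, but the paths, image tag, and flags here are assumptions, not the exact setup):

```shell
# Hypothetical shortcuts -- adjust paths and image tag to your setup.
# `fastai` starts the container interactively, with code mounted and
# port 8888 published to the host.
fastai() {
    sudo docker run --runtime=nvidia -it -p 8888:8888 \
        -v ~/fastai:/fastai \
        paperspace/fastai:cuda9_pytorch0.3.0 /bin/bash
}

# `j8` is typed *inside* the container to start Jupyter, listening on
# all interfaces so the host browser can reach it at localhost:8888.
j8() {
    jupyter notebook --ip=0.0.0.0 --port=8888 --no-browser --allow-root
}
```

Putting the first function in your `~/.bashrc` makes the one-word launch work from any terminal.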