Local fastai docker containers - paperspace - seeme.ai etc

Let’s use this thread to discuss running fastai on your local machine using docker containers.

I just noticed that Paperspace has released the latest build of their fastai Docker container. I used their container for part 1 of the course and it worked quite well. I’d been trying to use the SeeMe.ai container for part 2 as it was “newer”, and had some initial success but then ran into problems (CUDA errors). So I’m going to try the Paperspace container on my DL box at home and see if I have better luck. The idea for me is to avoid dealing with a local install on my own (i.e., “let someone else do the hard work” :joy: )

Here is their latest tag (October 13th, 2022):


I will report back to this thread after I’ve pulled the container and run the diffusion notebooks with it.


I’ve been using a simple Dockerfile on an AWS EC2 instance which has one of the
deep learning base AMIs, so all the CUDA stuff is already installed.

GPU type and CUDA version:

CUDA Version: 11.6

Dockerfile

FROM pytorch/pytorch
WORKDIR /workspace

RUN apt-get update && apt-get install -y \
        git \
        wget \
        htop \
        screen \
        vim

COPY . /workspace

RUN pip install -r requirements.txt
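The requirements.txt itself isn’t shown in the thread; for the lesson 9 diffusion notebooks it would presumably list something like the following (these package names are my assumption, not taken from the thread):

```text
# Hypothetical requirements.txt for the lesson 9 diffusion notebooks --
# the actual file isn't shown in the thread.
fastai
diffusers
transformers
jupyter
matplotlib
```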



Building Docker Image

docker build . -t fast-ai

Run Docker

docker run -it -p 8888:8888 --gpus all -v $PWD:/workspace fast-ai /bin/bash

In the container shell

jupyter notebook --port=8888 --no-browser --ip=0.0.0.0 --allow-root

This worked well for running some of the lesson 9 notebooks and doing inference.
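Before launching Jupyter, it can be worth confirming that the container actually sees the GPU. A quick sanity check (assuming the image was built and tagged fast-ai as above):

```shell
# Should print the same GPU table you'd see on the host;
# if this fails, the --gpus flag or the NVIDIA container toolkit
# is the likely culprit.
docker run --rm --gpus all fast-ai nvidia-smi
```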


Thanks for sharing, Chris! What is the above command in the Dockerfile doing?
Is it the cloned diffusion_nbs repo that is being copied to the image’s /workspace directory?

Sorry I don’t understand these in detail.

I’m going to try your approach as I’m not having much luck with the Paperspace/SeeMe.ai images atm :sweat_smile:

It just copies everything in the directory alongside the Dockerfile into that location in the container. You probably already know that hehe. I was actually running git clone inside the container after I got it running, but the git clone could just be executed in the Dockerfile too. I don’t think this line is 100% necessary TBH, I just like including it.


Ah ok, thanks! Yeah, I just ran it in the cloned diffusion_nbs directory and it copied everything over :slight_smile: . I asked because I usually just bind-mount the cloned directory into the container and use it from inside. The problem with that is Docker keeps changing the owner/group to root/root on every file it touches, so I need extra lines in my launcher script to reset the permissions before I fire up the container. Your use case is slightly different, though, so it makes sense!
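One common workaround for the root-owned-files problem, rather than resetting permissions afterwards, is to run the container as your own user. This is a sketch, not something from the thread, and assumes the image doesn’t require root:

```shell
# Run as the host user's UID/GID so files created in the bind mount
# stay owned by you instead of root.
docker run -it --gpus all \
    -u "$(id -u):$(id -g)" \
    -p 8888:8888 \
    -v "$PWD":/workspace \
    fast-ai /bin/bash
```

With this, --allow-root can be dropped from the jupyter command; the trade-off is that apt or pip installs inside the running container will then need root.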

I spent the whole day trying to get different containers working, and it turns out Docker was changing owner/group permissions on my filesystem, so nothing was working and I was getting CUDA errors. Time for me to spend those generous credits Lambda Labs gave out the other day :wink: … the amount of fiddling I have to do trying to run it locally on a janky 1070ti is just too much for the returns I get.

Yeah, I also just fired up Lambda Labs for the first time to try it out.
The Dockerfile and everything above worked there too, but I had to prefix the docker commands with sudo to get it working.
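If you’d rather not type sudo every time, the usual fix on Ubuntu-style hosts is to add your user to the docker group (an assumption about your setup, and note that docker group membership is effectively root-equivalent access):

```shell
# Requires logging out and back in (or running `newgrp docker`)
# before it takes effect.
sudo usermod -aG docker "$USER"
```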


Hi Mike,

I’ve just found this thread - I’m the founder of SeeMe.ai - and happy to help or answer questions…

We publish Docker containers for every fast.ai version, and I can quickly spin up a custom version if you need more libraries installed…

Happy to help!


This may be WAY more than anyone would want, but I built a Docker image based on the jupyter studio image that provides a really quick way to get up and running with the course material.

So far this is working like a champ on a fresh Ubuntu 22.04 install of Linux. I was able to validate that it was correctly accessing my GPU.
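For anyone wanting to run the same check, a minimal way to validate GPU access from inside a running container (assuming PyTorch is installed in the image):

```shell
# True plus a device count >= 1 means CUDA is reachable
# from inside the container.
python -c "import torch; print(torch.cuda.is_available(), torch.cuda.device_count())"
```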

Note that the image is beefy at about 10GB.