Let’s use this thread to discuss running fastai on your local machine using docker containers.
I just noticed that Paperspace has released the latest build of their fastai Docker container. I used their container for part 1 of the course and it worked quite well. For part 2 I've been trying the seeme.ai container since it was "newer"; I had some initial success but then ran into problems (CUDA errors). So I'm going to try the Paperspace container on my DL box at home and see if I have better luck. The idea for me is to avoid dealing with a local install on my own (i.e., "let someone else do the hard work").
It just copies everything in the folder next to the Dockerfile into that location in the Docker container. You probably already know that hehe. I was actually running git clone inside the container after I got it running, but I think the git clone could just be executed in the Dockerfile too. I don't think this line is 100% necessary TBH, but I just like including it.
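For anyone following along, the two approaches might look roughly like this in a Dockerfile — a sketch only; the base image tag and the repo URL are examples, not necessarily what either of us is actually using:

```dockerfile
# Example base image; substitute whichever fastai image you're running
FROM paperspace/fastai:latest

WORKDIR /workspace

# Option 1: copy everything sitting next to the Dockerfile into the image
COPY . /workspace

# Option 2: clone the notebooks at build time instead of after startup
# (assumes the course notebooks live in the fastai/diffusion-nbs repo)
# RUN git clone https://github.com/fastai/diffusion-nbs.git
```

With option 2 baked into the image you don't need to re-clone every time the container restarts, at the cost of the clone being frozen at build time.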
Ah ok, thanks! Yeah, I just ran it in the cloned diffusion_nbs directory and it copied everything over. I asked because I usually just map the cloned directory onto the container and use it from inside. The thing with that is I'm having a heck of a time with Docker changing owner/group to root/root for every file it touches, so I have to add extra lines to the launcher script to reset the permissions before I fire up the container. But your use case is slightly different, so it makes sense!
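FWIW, one way to sidestep the root-owned-files problem is to run the container as your own UID/GID. A sketch, assuming the paths and image name from this thread (adjust to your setup, and note some images expect to run as root and may break under a non-root user):

```shell
# Run as your own user so files created in the bind mount stay owned
# by you instead of root (image tag and paths are just examples)
docker run --rm -it --gpus all \
  --user "$(id -u):$(id -g)" \
  -v "$PWD/diffusion_nbs:/workspace/diffusion_nbs" \
  paperspace/fastai:latest

# Or, if the container has to run as root, reset ownership afterwards:
# sudo chown -R "$(id -u):$(id -g)" diffusion_nbs
```

The `--user` route avoids the post-hoc chown entirely, but it only works cleanly if the image doesn't need root for its own startup scripts.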
I spent the whole day trying to get different containers working, and it turns out Docker was changing owner/group perms on my filesystem, so nothing was working and I was getting CUDA errors. Time for me to spend those generous credits Lambda Labs gave out the other day… the amount of fiddling I have to do to run this locally on a janky 1070 Ti is just too much for the returns I get.