Easy local conda setup

I am a bit late to the party experimenting with the SD NBs from lesson 9, and I have not come across a local setup “guide” on the forums yet except for links to Docker. So I have set up a repo where I host my env.yaml file, which can be used to create a new working conda env from scratch (it installs PyTorch, CUDA, fastai, diffusers, the Hugging Face Hub, ipywidgets, Jupyter, etc.).

Description and env file is here: GitHub - MJPansa/fastai_2022_v2: A collection of NBs and experiments from the 2022 fast.ai course V2
(if you want to change the name of the conda env you can do so on line 1 of the yaml)
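For anyone new to conda's yaml workflow, here is a sketch of the rename-then-create steps. The yaml contents below are a minimal stand-in (the real file in the repo lists many more dependencies), and `my_sd_env` is just an example name:

```shell
# Illustrative stand-in for the repo's env.yaml -- the real file has
# far more dependencies; only the first line matters for renaming.
cat > env.yaml <<'EOF'
name: fastai_2022_v2
channels:
  - pytorch
dependencies:
  - python=3.9
EOF

# Change the env name on line 1 (or just edit it by hand):
sed -i '1s/.*/name: my_sd_env/' env.yaml
head -1 env.yaml

# Then create and activate the env from the file:
# conda env create -f env.yaml
# conda activate my_sd_env
```

The `conda env create` lines are commented out here since they take a while to run; uncomment them once the name looks right.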

Requirements:

  • Nvidia graphics card with >=10 GB VRAM; maybe 8 GB works as well. NOTE: You can probably reduce the image sizes to get away with less. I can only say I tested this with the default notebook settings of the lesson 9 SD deep dive, and it used 9.8 GB at one point, so no guarantees. Happy to hear feedback, though. I am personally running RTX 3090s; image generation time was around 2-3 seconds per image in the first part
  • A Linux environment with conda installed (if you don't have it, just follow the link in the repo and install it). Either vanilla Linux, or, when using Windows, I highly suggest using Linux through WSL
  • REMINDER: When running the NBs make sure you have created an account at Huggingface (https://huggingface.co/) and that you have accepted the terms and conditions for using the SD model (CompVis/stable-diffusion-v1-4 · Hugging Face)
  • If CUDA 11.6 is somehow not supported for your card, try going back to 11.3 or even 10.2. To do that, you only have to change the version number inside the yaml file
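Downgrading CUDA really is a one-line edit, which you could even script. The dependency line below is a guess at how the repo's file spells it (the actual spec may be named differently, e.g. `pytorch-cuda`), so check your own yaml first:

```shell
# Stand-in yaml with a hypothetical cudatoolkit pin -- the real env.yaml
# may spell this dependency differently.
cat > env.yaml <<'EOF'
name: fastai_2022_v2
dependencies:
  - cudatoolkit=11.6
EOF

# Swap 11.6 for 11.3 (or 10.2) and verify:
sed -i 's/cudatoolkit=11\.6/cudatoolkit=11.3/' env.yaml
grep cudatoolkit env.yaml
```

After the edit, recreate the env with `conda env create -f env.yaml` as usual.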

Hope this saves at least one person some time setting up the environment locally.
Have a great day everyone!

Jeffrey

5 Likes

Thanks for sharing! This might come in handy when I want to do a clean rebuild locally. I have a 1070 Ti with 8 GB and it seems to be “ok” with the first few steps, but I haven't tried it with the deep dive notebook yet.

BTW, this is part 2 of the 2022 course. The V2 may throw people off, as V2 was the second offering of the course a few years back, if I'm not mistaken. So V2 had part 1 and part 2, or something like that. I'm not sure what “version” this course is … probably V5? :sweat_smile:

Great initiative! thank you!

I’d call it “Deep Learning from the Foundations v2” or v4 pt 2 :slight_smile:

1 Like

The naming convention seems to have moved from version number to year. So this year's course is course22 and course22p2, according to the GitHub repo names.

So the original name with a p2 (as in part 2) instead of v2 is probably perfect.

Ah ok! When I got the part 2 notification, the category it was under was v5 (https://forums.fast.ai/g/part2v5), which is why I guessed it was probably v5. But I agree Jeremy has probably moved away from the “version” nomenclature, and now it's years and parts, as it were.

1 Like

Honestly I use both kinda randomly. The forum category we’re in now is part2v5, but the repo is course22p2. :slight_smile:

2 Likes

Ok, what I take from this discussion is that no one had issues with the setup, which is great :grin:
But I also realized there was a naming issue when I tried to create a new conda env for v2/p2 and it said the env already exists :melting_face:

EDIT:

  • One more trick: set an alias on your system, which significantly speeds up activating the right conda env and starting Jupyter in the correct folder
  • Just edit your .bashrc with your favorite editor, e.g. nano ~/.bashrc
  • then append to the end:
    alias name_for_alias="conda activate name_of_your_env && jupyter notebook /absolute/path/to/work/dir" so for me it might be alias fast="conda activate fastai_2022_v2 && jupyter notebook /home/jeffrey/Documents/projects/fastai2022_SD" (make sure not to put any whitespace on either side of the equals sign)
  • Now I only have to type ‘fast’ in the terminal and Jupyter starts instantly with the correct env and folder
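The steps above can be sketched safely without touching your real ~/.bashrc, using a throwaway file instead (the env name and path are just the examples from this post):

```shell
# Append the alias to a throwaway file rather than the real ~/.bashrc,
# so this sketch is safe to run anywhere.
rc=alias_demo.sh
cat > "$rc" <<'EOF'
alias fast="conda activate fastai_2022_v2 && jupyter notebook /home/jeffrey/Documents/projects/fastai2022_SD"
EOF

# Source it and confirm the alias is registered:
source "$rc"
alias fast
```

Note that aliases from ~/.bashrc only take effect in new interactive shells (or after running `source ~/.bashrc` in the current one).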
1 Like