Getting Started with fastai-v2

When Jeremy tweeted about the new fastai v2, I wanted to jump in and start learning about the new version of fastai and contribute to it. But I did not know where to start. Luckily, Jeremy posted about code walk-thrus, and in his first walk-thru he gave suggestions on how to get started. This post describes how to get started based on Jeremy's suggestions.

  1. The code is available at https://github.com/fastai/fastai2.

  2. Clone the fastai2 repository: git clone https://github.com/fastai/fastai2.git

  3. Inside the fastai2 folder you will find the dev folder, which contains all the notebooks, such as 01_core.ipynb, 08_pets_tutorial.ipynb, etc.

  4. There are two approaches to installing the required packages for running fastai.
    One approach is to use the environment.yml from the root of the repository to create a separate environment using conda:
    conda env create -f environment.yml
    Another approach is to install everything with conda, following the readme in the fastai_dev branch:
    conda install -c fastai -c pytorch jupyter "pytorch>=1.2.0" torchvision matplotlib pandas requests pyyaml fastprogress pillow scipy
    pip install typeguard jupyter_nbextensions_configurator

  5. After you clone the repository, run nbdev_install_git_hooks in your terminal. This sets up git hooks that clean the notebooks of extraneous metadata (e.g. which cells you ran) that would otherwise cause unnecessary merge conflicts.

We are all set to explore fastai2.

We may be tempted to look into 01_core.ipynb, but Jeremy warns against doing so as it is very complicated: it sets up Python in a different way, starting with metaclasses, decorators, type checking, monkey patching, and context managers. If you are keen on learning advanced Python concepts, it would be the right place to look. Jeremy recommends starting from 08_pets_tutorial.ipynb, a tutorial notebook that shows how to use some of the low-level fastai functionality such as Transforms, fastai lists, and pipelines.
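As a taste of what the pets tutorial covers, here is a toy sketch of the reversible-transform idea behind fastai's Transforms and Pipelines. The class names deliberately mirror fastai's, but this is a simplified illustration of the concept, not the real fastai2 API (which adds type dispatch and much more):

```python
class Transform:
    """A reversible function: `encodes` goes forward, `decodes` goes back."""
    def __call__(self, x): return self.encodes(x)
    def encodes(self, x): return x
    def decodes(self, x): return x

class Pipeline:
    """Compose transforms; decoding runs them in reverse order."""
    def __init__(self, tfms): self.tfms = list(tfms)
    def __call__(self, x):
        for t in self.tfms: x = t(x)
        return x
    def decode(self, x):
        for t in reversed(self.tfms): x = t.decodes(x)
        return x

# Two toy transforms to exercise the round trip:
class Negate(Transform):
    def encodes(self, x): return -x
    def decodes(self, x): return -x

class AddOne(Transform):
    def encodes(self, x): return x + 1
    def decodes(self, x): return x - 1

pipe = Pipeline([Negate(), AddOne()])
print(pipe(3))               # -2  (negate, then add one)
print(pipe.decode(pipe(3)))  # 3   (undone in reverse order)
```

The encode/decode round trip is the core idea: it is what lets fastai show you decoded (human-readable) versions of transformed data.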

You can also watch the videos of Jeremy's code walk-thrus.

This should be made into a wiki so it can be edited as things change.

@sgugger can you make this a wiki post.

It’s a wiki now.

I’m new to fast.ai and coding in general. Right now I’m wondering where the ideal place to start would be. Is it the latest material, or the very beginning?

Start with learning Python.

Super dumb question - is there a preferred way to “install” fastai v2? Something similar to “python setup.py develop”?

I wanted to port some v1 work to see how it looks on the new API, and as a warm-up.

Maybe I could just softlink a subdirectory to fastai_dev/dev/local from my project folder?

See the FAQ here: Fastai-v2 - read this before posting please! 😊

That’s how I set it up in Colab, but it works everywhere 🙂

Oh man, so sorry - I missed the FAQ - thank you!

All good! If you run into trouble, feel free to ask or ping me 🙂

Perhaps symlink the local dir to a dir called fastai2 in your site-packages folder? Then import from fastai2?

@jeremy - symlink was my thought too. I ran into problems when I called it fai2, but if I named the symlink local it seemed OK. I forget which files I was importing that didn’t have relative imports; I’ll post more tomorrow when I’m in front of my computer.

Yeah, tell me what’s assuming local so I can fix it, @313V 🙂

in the file:

fastai_dev/dev/local/torch_basics.py

changing

from local.imports import *
from local.torch_imports import *
from local.core import *
from local.torch_core import *

to:

from .imports import *
from .torch_imports import *
from .core import *
from .torch_core import *

Thanks @313V, fixed now - my dumb fault! 😮

I must be missing something very basic in getting started: how do I import the fastai package in my project?

I followed the instructions to clone the repository and I’m able to run the notebooks and follow the walkthroughs, everything works there.

But now I want to use it in my project, which has other notebooks or code files, in another path.
How can I import “fastai” packages?
Do I need to set up a PYTHONPATH?
Do I need to deploy it to site-packages?
I don’t see a “fastai” folder, only a “local” one (as referred to in the pets tutorial).

> Perhaps symlink the local dir to a dir called fastai2 in your site_packages folder? Then import from fastai2?

Yeah, that seems to do the trick.

In commandline:
cd /home/user/anaconda3/envs/fastai/lib/python3.7/site-packages
ln -s /home/user/github/fastai_dev/dev/local/ fastai2

And then in Jupyter:
from fastai2.data.all import *
from fastai2.vision.core import *
(and following from pets tutorial)

But is this hack the intended way to use it?
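If you would rather not touch site-packages, another common Python pattern (my suggestion, not from this thread) is to put the repo's dev folder on sys.path so that import local works from any project. The fastai_dev path below is just the example path used earlier in the thread; the throwaway local package at the end exists only to demonstrate the mechanism without needing the real repo:

```python
import os
import sys
import tempfile

# Point this at your own clone; this path is only an example.
repo_dev = "/home/user/github/fastai_dev/dev"
sys.path.insert(0, repo_dev)
# from local.data.all import *   # would now resolve if the clone exists

# Demonstrate the mechanism with a throwaway `local` package:
tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "local")
os.makedirs(pkg)
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("greeting = 'hello from local'\n")

sys.path.insert(0, tmp)  # takes priority over the (possibly missing) repo path
import local

print(local.greeting)  # hello from local
```

The same effect can be had without editing code by setting the PYTHONPATH environment variable, e.g. export PYTHONPATH=/home/user/github/fastai_dev/dev before launching Jupyter.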

Launching Jupyter Notebook from WSL.

We see in the code walk-thrus that Jeremy seamlessly uses WSL to launch Jupyter Notebooks for development purposes.

I tried doing the same from my Windows 10 system -> created a fast_ai environment -> launched the Lec 08 notebook. But it keeps reporting “Connection failed (Cannot establish connection to kernel)”.

Did anyone else face the same issue before? If so, what’s the fix?

I’m always doing that inside an SSH connection to my university Linux computer.