Fastai2 Visual Guide

I’m starting a fastai2 Visual Guide.

:point_down: First video (about the DataBlock): Image Classification – Single Label – Oxford-IIIT-Pet

Screen capture summarizing the use-case.
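
For readers who want to follow along in code, here is a minimal DataBlock sketch for this use-case (the regex labeller, split, and transform sizes are typical choices, not necessarily the exact ones from the video):

```python
from fastai.vision.all import *  # on the pre-release package this was: from fastai2.vision.all import *

path = untar_data(URLs.PETS)  # Oxford-IIIT Pet dataset

pets = DataBlock(
    blocks=(ImageBlock, CategoryBlock),                         # input: image, target: single label
    get_items=get_image_files,
    splitter=RandomSplitter(valid_pct=0.2, seed=42),
    get_y=using_attr(RegexLabeller(r'(.+)_\d+.jpg$'), 'name'),  # breed parsed from the filename
    item_tfms=Resize(460),                                      # per-item transform (CPU)
    batch_tfms=aug_transforms(size=224),                        # per-batch transforms (GPU)
)

dls = pets.dataloaders(path/"images")
dls.show_batch(max_n=9)
```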

This is SO AMAZING!

The second video is on YouTube in HD:
:point_down: Image Classification – Multi-Label – Pascal

Screen capture summarizing the Image Classification – Multi-Label – Pascal use-case
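
And a matching DataBlock sketch for the multi-label case (the column names follow the Pascal 2007 CSV shipped with fastai; sizes are typical choices rather than the video's exact values):

```python
from fastai.vision.all import *

path = untar_data(URLs.PASCAL_2007)
df = pd.read_csv(path/'train.csv')  # columns: fname, labels (space-separated), is_valid

pascal = DataBlock(
    blocks=(ImageBlock, MultiCategoryBlock),      # MultiCategoryBlock -> one-hot encoded targets
    get_x=ColReader('fname', pref=path/'train'),
    get_y=ColReader('labels', label_delim=' '),   # split the label string into a list of labels
    splitter=ColSplitter('is_valid'),             # use the CSV's is_valid column for the split
    item_tfms=Resize(460),
    batch_tfms=aug_transforms(size=224),
)

dls = pascal.dataloaders(df)
dls.show_batch(max_n=9)
```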

I posted the 3rd video: End-to-End Training Workflow
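
A minimal sketch of what such an end-to-end workflow can look like with the high-level API, assuming the single-label dls built earlier (architecture, metric, and epoch count are placeholders):

```python
learn = cnn_learner(dls, resnet34, metrics=error_rate)  # pretrained backbone + new classification head
learn.fine_tune(2)                                      # train the head, then unfreeze and train everything

learn.show_results()
interp = ClassificationInterpretation.from_learner(learn)
interp.plot_confusion_matrix(figsize=(8, 8))
interp.plot_top_losses(9)
```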

Here is the full illustration:

I just released my Fastai Visual Guide GitHub Repo, and a related video tutorial:

https://twitter.com/ai_fast_track

Notebook: 01_data-block-pet-tutorial

Video:

This is super useful!

Thank you @rsomani95 :smile:

How deep do you plan on going?

I think it would be invaluable if you created such diagrams to show where exactly in the mid/low-level APIs or the callback system certain things are implemented.

If you’re not familiar with the internals of the library, it can get a bit tricky to find certain bits of code, especially since they’re usually split up into multiple different functions that call each other (not complaining).

For example, it took me quite some time to realise that opt.zero_grad() is called in the Learner's one_batch(). I spent the better part of an hour looking for it in multiple callbacks – this probably speaks more to my inefficiency, but I’m sure I’m not alone here :smiley:
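
One quick way to confirm this kind of thing without hunting through callbacks is to print the method's source directly (a small sketch; in a notebook, Learner.one_batch?? does the same):

```python
from fastai.vision.all import *
import inspect

# Print the training-step method where opt.step() / opt.zero_grad() are called
print(inspect.getsource(Learner.one_batch))
```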

If (when) done right, I think a visual guide like yours could help alleviate some of the issues brought up in the forum post below, and make it generally easier to hack around within the framework.

@rsomani95, the DataBlock API is my first playground for creating animations and related videos that make the material easier to understand, both for fastai users transitioning to fastai2 and for newcomers from outside the fastai ecosystem.

I’m planning to extend that to other areas and cover more ground, as long as it’s doable in a timely fashion. My plan is also to cover the mid-level API and show how it connects to the high-level API. I will probably also include some of the material I shared in the Tips & Tricks thread I created, where I post about both the high-level and low-level APIs, such as the post about inference.
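
For reference, inference with the high-level API boils down to something like this (a minimal sketch, assuming a trained learn object; the export filename and image path are placeholders):

```python
from fastai.vision.all import *

# After training a Learner called `learn`:
learn.export('export.pkl')                 # serialize the model together with its DataLoaders setup

# Later, possibly on another (CPU-only) machine:
learn_inf = load_learner('export.pkl')
pred, pred_idx, probs = learn_inf.predict('some_image.jpg')
```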

Or this one about split_idx:
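
In short, split_idx on a Transform controls which subset it runs on: 0 means training set only, 1 means validation set only, and None means both. A small illustration:

```python
from fastai.vision.all import *

# Built-in augmentations are train-only, while Resize runs on both splits:
print(Flip().split_idx)        # 0    -> applied on the training set only
print(Resize(224).split_idx)   # None -> applied on both train and valid

# A hypothetical transform restricted to the validation set:
class ValidOnly(Transform):
    split_idx = 1
    def encodes(self, x): return x
```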

Having said that, creating these animations and videos involves a lot of work, and I’m creating them alongside my own professional projects and my fastai2 project (timeseries). Therefore, my output will depend on many factors.

If other people are interested in sharing their knowledge in the Tips & Tricks thread and/or similar initiatives, where we could centralize knowledge in easy-to-find sections, that would be great, and the whole fastai community would benefit from it.

Sharing the 2nd Notebook: Image Classification – Multi-Label – Pascal Dataset
(with detailed steps and explanation)

Run it in Colab: https://colab.research.google.com/github/ai-fast-track/fastai-visual-guide/blob/master/02_image-classification-multi-label.ipynb…

Repo: https://github.com/ai-fast-track/fastai-visual-guide…

I’m sharing a new animation showing how a fastai2 DataBlock object is created using pipeline transforms. This video highlights the first part of a 3-step process. The example shown is Image Classification with multiple labels (Pascal dataset). The full 3-step process animation will land soon.
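
For anyone who wants to poke at those pipelines directly, the intermediate objects can be inspected like this, assuming the pascal DataBlock and df DataFrame from the notebook above (my rough reading of the steps; the animation's exact breakdown may differ):

```python
dsets = pascal.datasets(df)   # items -> Datasets, built from the blocks' type-transform pipelines
dls = pascal.dataloaders(df)  # Datasets -> DataLoaders, applying item_tfms then batch_tfms
pascal.summary(df)            # prints every transform applied to one sample, step by step
```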

If you like the content, please share, follow, and subscribe.

Have you tried setting a breakpoint with pdb.set_trace() in the function and then running a workload? I often find that's the quickest way to learn about the execution path leading to a function.

Statically, many IDEs have decent enough search for definitions and references of anything, but that analysis is static, and Python is very much a dynamically typed language, so the help there may be limited.
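
A tiny illustration of the breakpoint approach (the function here is a hypothetical stand-in for whatever library code you are tracing):

```python
import pdb

def some_library_function(x):
    pdb.set_trace()   # execution pauses here once a workload reaches this line
    return x * 2

some_library_function(3)
# At the (Pdb) prompt: `w` prints the call stack that led here,
# `u`/`d` move up and down the stack, `n`/`s` step, `c` continues.
```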

Here is the full illustration: