A walk with fastai2 - Vision - Study Group and Online Lectures Megathread

Hi everyone! Each semester I lecture at the University of West Florida through one of our clubs on fastai. Normally I keep it offline, but this year I have decided to live stream it and host a dedicated megathread for discussing the course, asking questions, and networking together! This semester’s version will use version 2 of the fastai library (thank you to @Jeremy and @sgugger for all the amazing work you have put in!). This course is designed to be intro-friendly (it’s geared towards undergraduate students in terms of pre-reqs). Now that the intro is done, on to the important bits! Below I have detailed key dates and a syllabus. I am working on the material as we speak, but in the meantime take a look at my Practical Deep Learning for Coders 2.0 repository for some v2 code to get caught up on v2 if you are unfamiliar and don’t want to wait!


Block 1: Computer Vision
Lesson 1, Part 1
Lesson 1, Part 2
Lesson 2
Lesson 3
Lesson 4
Lesson 5
Lesson 6
Lesson 7

Block 2: Tabular Neural Networks
Lesson 1
Lesson 2
Lesson 3
Lesson 4

Block 3: Natural Language Processing
Lesson 1
Lesson 2 (pending)
Lesson 3 (pending)

(Colab Links) Notebooks we have covered:


A walk with fastai2

How is this different from Practical Deep Learning for Coders?

This course proceeds subject by subject, exploring the DataBlock API to its full extent and applying various techniques that are not taught in either course (such as feature importance and k-fold validation).
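Since k-fold validation is one of those techniques, here is a minimal sketch of the splitting logic using only the standard library (a sketch only; in practice you would feed the resulting index lists to fastai's splitters, or just use scikit-learn's `KFold`):

```python
def k_fold_indices(n_items, k=5):
    # Split indices 0..n_items-1 into k (train, valid) pairs,
    # where each item lands in the validation set exactly once.
    fold_sizes = [n_items // k + (1 if i < n_items % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        valid = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n_items))
        folds.append((train, valid))
        start += size
    return folds
```

You would then train one model per fold and average the validation metrics to get a more robust estimate than a single split gives.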

What do I need to take part?

You need a Google account and a Paperspace account; we will use the free notebook tiers of both. We will mostly work out of Google Colaboratory, except for Natural Language Processing, where Paperspace’s data persistence makes it the nicer option.

When is this?

This course will run from January 15th until the end of April. This will coincide with Jeremy’s run of Practical Deep Learning, and I highly recommend doing both (if it winds up being available online shortly afterwards and/or live-streamed). The livestreams will be from 5pm to 7:30pm Central Standard Time on Saturdays.

How is this structured?

Lecture will run for an hour to an hour and fifteen minutes, with the rest of the time dedicated to debugging and working through the lecture material together, along with individual project time.

How do I make the most out of this?

Spend an hour or two a day going through the notebooks and playing around with everything, learning how it all works together. Also, get yourself a mini-project to do! If you can’t come up with one yourself, we can all brainstorm together to find a few!


Here is the overall schedule. The format differs from Jeremy’s course in that we will move from data type to data type, starting with Computer Vision and ending with NLP.

This schedule is subject to change.


  • Block 1: Computer Vision
  • Block 2: Tabular Neural Networks
  • Block 3: Natural Language Processing

Here is the overall schedule broken down by week:
This schedule is subject to change

Block 1 (January 15th - March 4th):

  • Lesson 1: PETs and Custom Datasets (a warm introduction to the DataBlock API)
  • Lesson 2: Image Classification Models from Scratch, Stochastic Gradient Descent, Deployment, Exploring the Documentation and Source Code
  • Lesson 3: Multi-Label Classification, Dealing with Unknown Labels, and K-Fold Validation
  • Lesson 4: Image Segmentation, Weighted Loss Functions, State-of-the-Art in Computer Vision
  • Lesson 5: Style Transfer, nbdev, and Deployment
  • Lesson 6: Keypoint Regression and Object Detection, More Pose
  • Lesson 7: Image Generation, Audio, Other DataBlocks

Block 2 (March 18th - April 8th):

  • Lesson 1: Pandas Workshop and Tabular Classification, SHAP
  • Lesson 2: Feature Engineering and Tabular Regression, Permutation Importance
  • Lesson 3: Bayesian Optimization, Cross-Validation, and Labeled Test Sets
  • Lesson 4: TabNet, DeepGBM

Block 3 (April 15th - May 6th):

  • Lesson 1: Introduction to NLP and the LSTM
  • Lesson 2: Full Sentiment Classification, Tokenizers, and Ensembling
  • Lesson 3: Other State-of-the-Art NLP Models
  • Lesson 4: Multi-Lingual Data, DeViSe

Closing notes

This will be my first time live-streaming, so this will be an experiment for everyone, but I have high hopes that this will turn out to be a successful study group with your help! Please use this thread for any questions and for starting discussions about the material; we’re all learning fastai (and especially the second version) together! I will update this post with YouTube links to the livestreams and post them in this thread as well. Looking forward to seeing everyone next month!!!

(Also, minor PSA: this is in no way for any credit whatsoever. I am just an undergraduate student wanting to help others learn how to use this amazing library to its fullest potential. Instead of worrying about credit, try putting what you’ve learned into a project or two and some blog posts; in some cases this provides much better evidence that you know the material than a slip of paper can :slight_smile: )


Awesome @muellerzr! Can’t wait to join. A couple of queries, though:

  1. Are the lectures done daily?
  2. Will they be recorded as well? I am from India and the time is around 4.30 am IST, so it would be good to catch up on recordings if I am not able to make it to the livestream on some days.
  3. How many hours per week need to be invested to reap the benefits of the course?



Thank you for doing this, all the best, and very much looking forward to attending the course.


@pnvijay the lectures are done weekly on Wednesday nights (I’ll make that clearer!). They will absolutely be recorded and put onto YouTube as soon as possible (I need to see if YouTube has special rules about waiting x minutes before posting your live stream). On your last question, I’m a firm believer in doing an hour a day outside of the lecture; that or more will serve you well. I always tell my students to just find some project you can get lost in, and the hours and work will come along with it!


YouTube Live recordings are available immediately after they’re complete.


Awesome! That’s great to hear :slight_smile:

This is awesome. It is so application-driven. I am looking forward to it.


Awesome! I’m definitely interested.


Definitely interested! How can I sign up?


@sairam6087 Just keep an eye here on this forum! We’ll use it for discussion, and when the lectures are live I will provide links to watch and interact :slight_smile:


Oooowww yeeeess!! I’m super hyped for this!!! :partying_face: :partying_face: :partying_face:


I’ve just tried it out on my Colab and it crashed on the first try.

@vanh It will :slight_smile: That’s your instance rebooting, which needs to happen to use the updated libraries!

What about the error message?

It’s just saying the runtime crashed (which is more or less what we did on purpose). I figured out we can force a reboot from code so the notebook starts using the updated library. Easier than telling people to reboot their instances manually, IMO.

The source of the crash message is that os._exit() call.

AKA, it can safely be ignored :slight_smile:
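For reference, the forced-reboot pattern being described can be sketched like this (a sketch only: the helper names are mine, and it only shows the restart mechanics, not the pip install cell that precedes it):

```python
import os
import sys

def in_colab() -> bool:
    # Colab injects the google.colab package into the runtime,
    # so its presence in sys.modules is a reasonable detector.
    return "google.colab" in sys.modules

def restart_runtime() -> None:
    # Hard-exit the Python process. Colab notices the dead kernel,
    # shows the "runtime crashed" banner, and starts a fresh kernel
    # that picks up any freshly pip-installed libraries.
    os._exit(0)

# Run this right after an install cell, only when actually in Colab:
if in_colab():
    restart_runtime()
```

The crash banner is expected: the kernel really does die, and the fresh one it comes back with is the whole point.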


I am interested!!!


It looks really promising. The only thing I’d want for this is transcript-like notes.


I can’t promise transcript-like notes, but I’m doing something a bit more with the course notebooks that should help out non-English speakers :slight_smile: I know that’s a big concern for you, so if it’s okay I’d like to DM you an early notebook or two and see if that format is suitable for you? @dsfsffefsdfdfdsdfd


Yes. Please! I would be really grateful if you would do that.
