A hard and uncool style of learning fastai-v2

I am sure there must be many better or more efficient ways of learning the fastai library, but somehow I keep coming back to this hard and uncool approach. In case you are like me, what I have been doing may be of some help to you on the same path. But first, let me explain what I mean by uncool and hard.

What does uncool mean?

  • not applying fastai to interesting and fancy applications
  • not even working through the notebook demos
  • but focusing only on the source code notebook tests (fast, short examples)

What does hard mean?

  • taking each func, class, and method out of the source code notebooks
  • wrapping it up into a single .py file to experiment with and ponder its meaning and usage
  • it is hard, because you force yourself to walk through every line of source code
  • keep revising every .py file, like this PrePostInitMeta.py, to keep enhancing your understanding of its usage on its own and in the big picture, in relation to the other classes of the library
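To give a flavor of what one of these little experiment files can look like, here is a minimal sketch in the spirit of a PrePostInitMeta.py: a simplified re-implementation of the idea behind fastai's PrePostInitMeta (a metaclass that runs `__pre_init__` before `__init__` and `__post_init__` after it), written from scratch for experimenting. This is my own toy version, not the library's actual source.

```python
import functools

class PrePostInitMeta(type):
    "Toy metaclass: wrap __init__ so __pre_init__/__post_init__ run around it."
    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        old_init = cls.__init__

        @functools.wraps(old_init)
        def new_init(self, *args, **kwargs):
            if hasattr(self, '__pre_init__'):  self.__pre_init__()
            old_init(self, *args, **kwargs)
            if hasattr(self, '__post_init__'): self.__post_init__()

        cls.__init__ = new_init
        return cls

class Trace(metaclass=PrePostInitMeta):
    "Records the order in which the three hooks fire."
    def __pre_init__(self):  self.log = ['pre']
    def __init__(self):      self.log.append('init')
    def __post_init__(self): self.log.append('post')

t = Trace()
print(t.log)  # ['pre', 'init', 'post']
```

Poking at a file like this (add prints, break it, subclass it) is exactly the kind of pondering each .py file is for.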

Why do I do this?

  • mainly because I am a boring person (I hate to admit it): I don’t have any project that is interesting enough and feasible enough for me to focus on fully
  • also because I want to see progress, even just a little, every day. A func per .py file is doable!
  • because if I manage to persist to the end, I expect fastai-v2 to enable me to be much, much more …

If you are interested, here are the little .py files I have created and have been exploring every day.

This is an ongoing project, and I have only just started. You are welcome to join and discuss the usage and understanding of fastai-v2 here with me (though Jeremy recommended jumping in when it is officially released, because currently v2 is adding and removing classes/funcs every day). I will try to update my experience with this approach here.


This is a good idea, although I don’t recommend it to beginners, as it involves too many details. However, it’s also possible to make the procedure more accessible: the basic idea is to find patterns, for example ‘monkey patch’: what it is, why it exists, what the alternatives are, etc. There are not that many patterns out there.
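As a concrete illustration of the 'monkey patch' pattern mentioned above: attaching a new method to an already-defined class. fastai-v2 uses this style heavily in its source; the tiny `patch` decorator below is my own simplified sketch of the idea, not the library's actual implementation.

```python
class Tensor:
    "Stand-in for some class we don't control and want to extend."
    def __init__(self, data):
        self.data = data

def patch(cls):
    "Decorator: attach the decorated function to `cls` as a method (monkey patching)."
    def _inner(fn):
        setattr(cls, fn.__name__, fn)
        return fn
    return _inner

@patch(Tensor)
def double(self):
    "New method added to Tensor after its definition."
    return Tensor([x * 2 for x in self.data])

t = Tensor([1, 2, 3])
print(t.double().data)  # [2, 4, 6]
```

Once you recognize this pattern, many otherwise puzzling pieces of the source (methods defined far from their class) become easy to read.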

Did you check out the new course? Part 2: Deep Learning from the Foundations rebuilds the fastai library from scratch. https://course.fast.ai/part2.html

That’s not the same. The new Foundations course covers the principles of everything, which is great, but not each line of code. So I guess it still makes sense to try to understand the full source code rather than just knowing the basic ideas. I spent two days understanding the DataBlock API from fastai v1. Several things missing from the documentation only become apparent when you try to use it to process other kinds of text, for example NER.