[Mind Map] Deep learning Research Progress in past few decades

Hello fast.ai community,

With all the awesome content taught in our course, I became increasingly interested in getting a holistic view of DL research and progress.

Click here for the complete mind map

In this mind map, I am focusing on three things:

  1. Algorithms and NN history: a timeline of the important milestones that led to the current DL revolution, plus the pre-DL state of the art and which parts of it are still relevant.
  2. A timeline of hardware and associated software milestones.
  3. Releases of datasets that significantly pushed research and the state of the art forward.

All three pieces are color coded. I created this one a few weeks back but had been waiting to “complete” it (which, in retrospect, would never really happen :slight_smile: ). I realized it would be best to share the work in progress, take early feedback, and possibly crowd-source the missing links :slight_smile:

Apart from this, I am also making another mind map for application-specific progress: CV, NLP, and speech.

Please let me know if you think I have missed any major milestones. I am sure I have.



While composing this message, it occurred to me that it might be interesting to add a fourth dimension: DL frameworks. What do you think?

Here are two terrific resources on DL history:

Hopefully they provide some useful extra material for you :slight_smile:


Thanks @jeremy. Will look into these two articles.

The timeline after batch norm is pretty empty. What do you think are the equivalent milestones in terms of architectural breakthroughs?