Meta: Moving beyond what is taught in this class

This question is specifically targeted at @jeremy. Since Jeremy and the fast.ai blog discuss the teaching philosophy at length, I thought it might be relevant to post this on the forums.

I have been following the course from outside and I am thoroughly enjoying the material. From your background, it seems you are mostly self-taught.

Even after covering a significant chunk of the material in the course, I am still not able to make much progress reading deep learning papers.

I would love it if you could elaborate on your personal learning path and the takeaways from it that might be useful to the rest of us. I would be even more interested in the process you follow for learning a complex topic (not necessarily deep learning) and the role your math knowledge plays in it. As an example: I found the original Dropout paper really hard to read even after I understood the concept as explained in the class.
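For concreteness, the dropout concept as I understood it from the class is roughly the sketch below (inverted dropout; the function name, shapes, and drop probability are just for illustration, not from the paper):

```python
import numpy as np

def dropout(activations, p=0.5, training=True):
    # At training time, zero each unit with probability p and scale the survivors
    # by 1/(1-p) so the expected activation matches test time (inverted dropout).
    if not training or p == 0.0:
        return activations
    mask = (np.random.rand(*activations.shape) >= p) / (1.0 - p)
    return activations * mask

x = np.ones((4, 3))           # a toy batch of activations
print(dropout(x, p=0.5))      # roughly half the entries are zeroed, the rest become 2.0
```

Going from that one-paragraph idea to the paper's notation and analysis is exactly where I get stuck.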

TL;DR - How do we learn more about deep learning on our own, without your help in simplifying each and every topic?

Thanks.

I have been asking myself the same question, and here is my take, specifically for deep learning…

I am realizing that there are many better venues for learning the state of the art than reading the latest papers, unless you are already an expert with a good grasp of where the field stands today, which I certainly am not.
Participating in Kaggle competitions is one such venue. Some advantages:

  • You can apply whatever you know so far, and it is a litmus test of where you stand relative to the state of the art.
  • Make your deep learning pipeline efficient. I am realizing that there are so many hyperparameters that can be tuned that it is impossible to just keep trying things at random. It makes sense to invest some time in optimizing your workflow for running experiments, evaluating them, and making tuning decisions based on those evaluations (see the sketch after this list). In most cases, engineering is the real bottleneck.
  • Read through the Kaggle forums. I found some very novel strategies mentioned there, as well as some less well-known pre-trained networks.
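By "optimizing your workflow" I mean something as simple as the sketch below: every run is defined by a config, evaluated the same way, and logged, so tuning decisions come from the log rather than from memory. The names here (`train_and_score`, `results.csv`, the grid values) are placeholders for whatever your own pipeline looks like:

```python
import csv
import itertools

def train_and_score(lr, batch_size, dropout):
    # Stand-in for a real training + validation run; replace with your own pipeline.
    # It just returns a dummy number so the loop runs end to end.
    return 0.0

grid = {
    "lr": [1e-2, 1e-3],
    "batch_size": [32, 64],
    "dropout": [0.25, 0.5],
}

with open("results.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(list(grid) + ["val_score"])
    for values in itertools.product(*grid.values()):
        config = dict(zip(grid, values))
        score = train_and_score(**config)
        writer.writerow(list(values) + [score])
```

Once every experiment lands in one file, comparing runs and deciding what to tune next takes minutes instead of guesswork.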

Having said that, I think it is useful to keep an eye on the latest papers to get a sense of where the field is moving and to know which paper to read when you have a specific problem. But I personally find that reading papers thoroughly pays off more once you are comfortable with the state of the art.


Thanks for the tips. I am already learning a lot from the Kaggle competitions. Still, I am really curious how Jeremy learned so much, given that DL was even less accessible (no CS231n) a couple of years back.