Just finished it - would recommend it to people who want a bottom-up approach to learning deep learning algorithms. It definitely complements Jeremy’s top-down approach and helped me understand a couple of “why” questions.
NB - There was supposed to be a miscellaneous thread where I could post this, but couldn’t find it - sorry for that.
I loved these courses but have been waiting forever for the last part on Sequence models. I took those courses first before coming to fast.ai. I like that progression, as Andrew Ng does a great job on concepts and basics but the implementation of real code is missing, whereas fast.ai is amazing at showing how deep learning experts actually do things.
I’m waiting for the Sequence models course too! I agree about the missing real-code implementation, though from what I’ve seen so far (only on week 2) I’d love to have a detailed walkthrough of the fast.ai code, how it calls PyTorch, etc.
Well, there were a couple of things explained in a more granular format using numpy -
e.g. as someone already mentioned, the inner workings of a convolutional network in numpy.
I was interested in knowing how some of the weight update algorithms work - RMSProp, Adam, and weight updates using momentum (roughly sketched in numpy below). My favorite was actually face detection and face recognition using a conv net - it was an application of one-shot learning.
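For anyone curious, this is roughly what that numpy-level material looks like: a naive 2D convolution and the standard momentum / RMSProp / Adam update rules. It’s a minimal sketch based on the textbook formulas, not the actual assignment code, and all function and variable names here are my own:

```python
import numpy as np

def conv2d_naive(image, kernel):
    # Valid-mode 2D cross-correlation (what DL frameworks call "convolution")
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def momentum_update(w, grad, v, lr=0.01, beta=0.9):
    # Exponentially weighted average of past gradients ("velocity")
    v = beta * v + (1 - beta) * grad
    return w - lr * v, v

def rmsprop_update(w, grad, s, lr=0.01, beta=0.9, eps=1e-8):
    # Running average of squared gradients scales each step
    s = beta * s + (1 - beta) * grad ** 2
    return w - lr * grad / (np.sqrt(s) + eps), s

def adam_update(w, grad, v, s, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam combines momentum (first moment) and RMSProp (second moment)
    v = beta1 * v + (1 - beta1) * grad
    s = beta2 * s + (1 - beta2) * grad ** 2
    v_hat = v / (1 - beta1 ** t)  # bias correction for early steps (t starts at 1)
    s_hat = s / (1 - beta2 ** t)
    return w - lr * v_hat / (np.sqrt(s_hat) + eps), v, s
```

In a typical training loop you would call one of these per parameter each step, initializing the running averages (v, s) to zeros and carrying them forward between iterations.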
But I still maintain: take the best of both worlds - the lessons from fast.ai Part 2 as well as deeplearning.ai - and you are good to go in deep learning. (…that’s my subjective view)
What I found is that, if you have done the courses here, you should be able to read scientific papers and implement them yourself without much effort. I didn’t get a ton of value from Coursera. I found that the assignments were too trivial and too much was set up for you. There were one or two things that I liked, like some things in pure numpy. If you do want completeness, you can read the deeplearningbook and watch the lectures for it; it’s covered on a chapter-by-chapter basis.
Most of the math in papers is just difficult notation, but logically it’s easy to grasp.
I still don’t understand a paper 100% the first time I read it, but after a few passes it gets fairly easy to grasp.