New MOOC: deeplearning.ai

As far as I know, no. There was a rumor saying late August, but I’ve no idea where that came from.


My best guess is that they’re stuck on sorting out access to GPUs. There was some talk about partnering with Nvidia, and I’m guessing that arranging that kind of partnership and integrating it with Coursera’s hub is non-trivial.

Edit: Maybe that’s just wishful thinking, though. :P


That’s what I was thinking too: hoping to get the entire thing done within a month to save some cash.

I’ve started taking the course, and so far (two weeks in), it’s Python + NumPy. There is a lot of math, but he emphasizes that it’s not as important as the ability to implement the code. You don’t really need to understand the math to complete the programming assignments, as long as you know how to translate the equations he provides into NumPy.
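To give a concrete idea of what that translation looks like, here’s a minimal sketch (my own, not the actual assignment code) of the kind of thing you end up writing: a vectorized forward/backward pass for logistic regression, following the shape conventions the course uses.

```python
import numpy as np

def sigmoid(z):
    # sigma(z) = 1 / (1 + exp(-z))
    return 1 / (1 + np.exp(-z))

def propagate(w, b, X, Y):
    # Vectorized forward/backward pass for logistic regression.
    # Course-style shapes: X is (n_x, m), Y is (1, m), w is (n_x, 1).
    m = X.shape[1]
    A = sigmoid(w.T @ X + b)                                    # predictions, shape (1, m)
    cost = -np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)) / m
    dw = (X @ (A - Y).T) / m                                    # dJ/dw, shape (n_x, 1)
    db = np.sum(A - Y) / m                                      # dJ/db, a scalar
    return dw, db, cost

# Tiny smoke test with random data
w, b = np.zeros((3, 1)), 0.0
X = np.random.randn(3, 5)
Y = (np.random.rand(1, 5) > 0.5).astype(float)
dw, db, cost = propagate(w, b, X, Y)
```

Once you see how the matrix equations map onto `@`, broadcasting, and `np.sum`, the assignments are mostly fill-in-the-blanks.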


The new official dates for the new courses are early October for Course 4, Convolutional Neural Networks, and “soon after” for Course 5, Sequence Models. The info came via email.


Apply for financial aid, it’s easy.

OK, here we go: https://www.nvidia.com/en-us/gpu-cloud/ … maybe the next courses will be launched now.


I audited the course, but I can only see the Week 1 preview; I’m not able to see any other weeks.

I’ve completed the Coursera Deep Learning Specialization (5 courses) and simply wish to inform everyone that all courses are now available: the three listed above, as well as Convolutional Neural Networks and the final course, Sequence Models. I have not taken the fast.ai course and thus cannot comment on any comparison.


I just completed the Deep Learning Specialization on Coursera, and it is fantastic! It turns out it’s actually part of CS230, a course being taught right now at Stanford by Andrew Ng. I am now starting the fast.ai courses.


I did this in Aug 2018. In the 7 months since, I often find myself returning to the deeplearning.ai materials to 1) understand the math, 2) understand the intuition behind the math, and 3) refer to the notebook assignments as barebones templates.

The first two points have really accelerated my understanding of fast.ai Part 2 (2019). By remembering the math and the intuition, Jeremy’s notebooks became easier to understand, since in a way I could guess where he was going with them. Because Jeremy doesn’t state the motivation at the start, it takes real patience and focus to step through his notebooks, observing the changes without knowing what he’s getting at until the very end, when he’s done demonstrating and goes back to explain the motivation. Andrew Ng tells you the motivation upfront, but you have to be patient about working through the math.

Some concepts are easier to learn with Andrew’s approach and others with Jeremy’s. For example, Andrew’s notebooks use NumPy arrays to build the NNs from scratch, and I find NumPy easier than Jeremy’s torch tensors in Part 2. On the other hand, when Jeremy demonstrates the impact of certain parameters or routines in his notebooks, seeing is believing, and it’s faster than listening through Andrew’s explanations, which build on certain prerequisites, e.g. solid linear algebra and regularization. It’s super fast if you are already familiar with those concepts, but if not, you can end up in the deep end on Wikipedia.
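For anyone curious about the NumPy-vs-torch gap I mentioned, here’s a minimal sketch (my own toy example, not from either course) of the same dense layer written both ways. The tensor code reads almost identically; it’s mostly the autograd machinery that’s new:

```python
import numpy as np
import torch

# NumPy version: one dense layer + ReLU, everything explicit
W = np.random.randn(4, 3) * 0.01     # weights, shape (n_out, n_in)
b = np.zeros((4, 1))
x = np.random.randn(3, 1)
z = W @ x + b
a = np.maximum(0, z)                 # ReLU activation

# PyTorch version: same computation, but the tensors carry gradients
Wt = (torch.randn(4, 3) * 0.01).requires_grad_()
bt = torch.zeros(4, 1, requires_grad=True)
xt = torch.randn(3, 1)
zt = Wt @ xt + bt
at = zt.clamp(min=0)                 # ReLU
at.sum().backward()                  # autograd fills Wt.grad and bt.grad for you
```

In Andrew’s course you’d write the backward pass by hand, like the NumPy half above; in Jeremy’s Part 2 notebooks, `backward()` does it for you.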

I strongly advocate doing both if you have the time. Whichever one you start with, the one you do next will be way easier. I would love to form a discussion group if there’s interest in revisiting both sets of materials.
