Deep Learning Certificate Part II plans

Thanks Jeremy. I actually received my acceptance from Rachel just before your reply, looking forward to the class!

Having just run across them, I’d love to see a lesson on deep compression. SqueezeNet and QuickNet seem incredibly promising for accelerating deep learning and for making the jump to mobile platforms.

Part II seems pretty packed though so this may be a topic for Part III :slight_smile:

1 Like

Love the idea of part III :smiley:

or at least somehow continuing the journey together even past part II in one shape or other :wink: Maybe the forums can be such a place? Let’s see :slight_smile:

2 Likes

Great topics. Looking forward to the course eagerly. Since debugging neural networks is something we’ll be doing often, could you please include visualization of the networks, and perhaps include the resulting picture in the notebook, so that we don’t have to run model.summary() and mentally build the picture? This is even more important for RNNs since they have loops. https://www.tensorflow.org/get_started/graph_viz has more details on graph visualization.
Thanks
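
For what it’s worth, here’s a minimal sketch of one way to get such a picture (assuming Keras 2 with pydot and graphviz installed; the toy model and file name are purely illustrative):

```python
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense
from keras.utils import plot_model

# A toy recurrent model, purely for illustration.
model = Sequential([
    Embedding(input_dim=5000, output_dim=32, input_length=100),
    LSTM(64),
    Dense(1, activation='sigmoid'),
])

# Writes a PNG of the layer graph with tensor shapes on the edges;
# in a notebook it can then be shown with IPython.display.Image('model.png').
plot_model(model, to_file='model.png', show_shapes=True)
```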

I really like this idea. One of my current goals is to study enough math to understand research papers and implement their algorithms. I created a roadmap for myself to follow.

1 Like

Will part 2 be a MOOC as well?

I am most interested in time series and in applying deep learning to structured (tabular) data.

1 Like

That’s an interesting list, although I doubt all that math is going to be that helpful, in particular:

  • Real Analysis
  • Algebra
  • Topology
  • Functional Analysis

Instead, try just learning the math you need as it turns up in papers. And see if you can implement the paper just using the pseudo-code they provide - often the math adds little if anything to that.

6 Likes

Thanks Jeremy! I confess I haven’t actually looked specifically at any deep learning papers yet, so I’ll go ahead and dive right in to see how I get along. I am also a bit embarrassed to admit that I registered an account on Kaggle 3 years ago but have not made a single submission to date, because I’ve simply been too intimidated.

I should add, though, that my motivation for studying those additional topics in math (beyond multivariable calculus and linear algebra) is that I find them intrinsically interesting, and I hope to someday return to school to study math. I am certainly convinced that none of it is necessary for studying machine learning, so I apologize if it seemed like I was propagating that myth!

1 Like

That’s very (very!) common - but it’s easily fixed: just try submitting one of the pre-created benchmarks provided by Kaggle. Then the next day, try to make it just a tiny bit better (e.g. if the benchmark was all zeros, try replacing it with all 0.5’s). Do that every day for the next 3 months, and you’ll be amazed at your progress.
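
Concretely, a minimal sketch of that first tweak in pandas (the column name below is hypothetical; check the competition’s actual sample_submission.csv for the real one):

```python
import pandas as pd

# Start from the benchmark file Kaggle provides for the competition.
sub = pd.read_csv('sample_submission.csv')

# Replace the all-zeros predictions with a constant 0.5.
sub['label'] = 0.5

# Upload this as the next day's submission.
sub.to_csv('constant_half.csv', index=False)
```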

3 Likes

I signed up for Kaggle 4 years ago and completed 3 other MOOCs on ML, but it wasn’t until I started doing Part I of this course that I made my first submission :slight_smile:

I must say that my understanding of what it means to do deep learning has evolved quite dramatically over the last month, taking my conception of what it means to learn in general along for quite a ride :smile:

1 Like

I like the deep reinforcement learning (AlphaGo) idea. It would also be great to cover video recognition or super-resolution. How computationally (and intellectually) feasible these are is another matter…

When you say pre-created benchmark by Kaggle, do you mean the bits contributed in the Kernels & Discussion pages of the competitions? I find those two pages can give lots of guidance.

Actually I just meant the sample submission that is provided for each competition. Generally they’re just all zeros, or something similarly basic. Start with the simplest possible submission!

I have also completed 3 MOOCs in ML (The Analytics Edge, Learning from Data, and of course, Professor Ng’s Machine Learning course). I am currently enrolled in Columbia University’s ML course on edX, but the learning experience there seems a bit sub-par because of the lack of feedback from the autograder and the rather inattentive staff. Hopefully they’ll be able to work out the kinks in the next iteration.

Anyway, I left and joined up here instead, and finally made my first Kaggle submission after watching Lessons 0 and 1! It’s interesting that the learning approach is in a sense the opposite of the highly academic nature of the Columbia course, but considering how long I’ve been putting Kaggle off, I think it was the right decision to sign up here :slight_smile:

I think what is going to help me to keep the ball rolling is the amazing community here. For me, the forums have always been a huge part of the learning experience, so I look forward to participating here as I go through the course.

2 Likes

I love the idea of a submission a day for three months. With cats-and-dogs ending soon, I’ll have to turn my attention to something else. Perhaps I’ll check to see how good I can get at classifying fish!

I hope Part II will become available in roughly the same format as Part I (video + wiki + forum).
As for subjects, anything time-series related will definitely be interesting.

In my short career in data science, I’ve found that there is a lot of information on tools and technologies, but not many guidelines on what to use when. In trying to get to grips with deep learning in particular, I struggle with how to select, create and evaluate architectures. Of course, a lot of this is experience, but it takes a very long time to build experience if you’re just muddling about without any guidance :slight_smile:

Tim

2 Likes

Roger Peng of Johns Hopkins authored this:

https://leanpub.com/exdata

It’s an R resource taken from their Coursera Data Science Specialisation. It’s cheap. I read it a few months back. There is also an R swirl package for learning exploratory data analysis. I’m pretty sure you don’t have to be signed up for the specialisation to get access.

Thanks for the tip. Chapters 5-7 look good. For the R stuff I think I prefer http://r4ds.had.co.nz/ .

I would say Hadley Wickham is one of the ‘Geoffrey Hintons’ of the R world and has contributed many very useful packages.

2 Likes

Hi @jeremy,
I’m a Part 2 v2 international fellow. I’m trying to get access to the MIMIC-III dataset and hopefully replicate a paper (https://arxiv.org/abs/1802.02311) using the NLP techniques taught in the course. I’ve completed the requirement of taking a training course, but it seems I also need to provide a reference to get access to the database: “information required of students and postdocs, and of anyone—regardless of rank or experience—who is not listed in a directory or other easy-to-find page of his or her organization’s website. Do not list yourself as reference! If you do so, your request may be discarded without notice.”
I’m not sure what kind of profile qualifies as a good reference, and I’d like my request not to get discarded. Could you please advise?
Update: I managed to get access. It was mentioned that access, if approved, would be granted after several business days, but somehow I received the approval confirmation within a few hours :slight_smile: