Ask Jeremy anything

Hold on. Did I just hear Jeremy say he didn’t go to university?

If I heard right, I'd be interested in how you learned what you teach today.

4 Likes

What are you going to do next? Are you going to start another company? Are you going to keep focusing on teaching deep learning and developing the fast.ai library?

3 Likes

@rachel Missed this one too.

Study as much as you want, since more knowledge is always beneficial…
Plus, doing mini projects and adapting them in your own way is the general notion on these forums…

2 Likes

Will fast.ai implement a version of the AutoML method in the future?

3 Likes

Here’s something interesting: Train DNN to play Atari on a personal computer

When are @Rachel and/or @Jeremy coming to Houston?
I would like to invite you to our Data Science Meetup.

Best advice for a 20-year-old undergrad aspiring to be a DL Researcher?

3 Likes

Are there any projects you need help with?

Just wondering how quickly you can train on ImageNet and CIFAR10 on a Google TPU? :thinking:

Hello,

What is the difference between Cyclical Learning Rates and Stochastic Gradient Descent with Restarts if both of them adjust the learning rate? When should one be used over the other?
Thank you.
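
In case it helps frame the question, here is a rough sketch of how I understand the two schedules (this is my own illustration, not fastai's actual implementation; the function names and hyperparameters are made up for clarity):

```python
import math

def clr_triangular(it, step_size=1000, lr_min=1e-4, lr_max=1e-2):
    """Cyclical Learning Rate (Leslie Smith): the lr oscillates linearly
    between lr_min and lr_max, rising for step_size iterations and then
    falling back for step_size iterations."""
    cycle = math.floor(1 + it / (2 * step_size))
    x = abs(it / step_size - 2 * cycle + 1)
    return lr_min + (lr_max - lr_min) * max(0.0, 1 - x)

def sgdr_cosine(it, cycle_len=1000, lr_min=1e-4, lr_max=1e-2):
    """SGD with Warm Restarts (Loshchilov & Hutter): within each cycle the lr
    follows a cosine anneal from lr_max down to lr_min, then jumps
    ("restarts") straight back to lr_max at the start of the next cycle."""
    t = (it % cycle_len) / cycle_len
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(math.pi * t))
```

Plotting both over a few cycles shows the difference I'm asking about: CLR ramps up and down symmetrically, while SGDR decays smoothly and then restarts abruptly at the top of the range.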

3 Likes

I see that there is a new repo https://github.com/fastai/fastai_v1 containing notebooks which cover some of the library concepts from scratch. Is it for an upcoming course? :slight_smile:

1 Like

I just checked the link, and it's quite interesting! Maybe you should start a new thread so it becomes more visible. Maybe we can all help.

Cheers!

@sgugger could you tell us, please?

I think it is. In the fastai GitHub thread, someone from Jeremy’s team mentioned that they are planning a new version of the course starting next October, with a new library rewritten from scratch and a change in the API. I don’t know the details, but as I remember they are still using pytorch.

Yes, it’s the beginning of the new version of the library, though it’s at a very early stage for now.
I think Jeremy will create a topic dedicated to this once we have something a bit more consistent to show, to ask for feedback and help.

1 Like

Thanks

There’s a forum for it: #fastai-dev

1 Like

What are the future plans for fast.ai and this course? Will there be a fast.ai Part 3 :smiley: ?

If there is a part 3 I would really really love to take it.

I just saw this question being addressed in the lesson 14 video, so I wanted to chime in with enthusiastic support for more! @jeremy, @rachel - Just in case you haven’t heard it enough yet.

Though, perhaps, this doesn’t necessarily have to be literally in the same format or even as a class at all. Part 2 is very different from part 1 in that it’s leaving a lot more up to the students and seems to be more of a “teach how to fish” approach (read the papers!).

What has occurred to me is that this community is the perfect platform to run with new papers, discuss them, and get them implemented. There are talented students, an established forum, and a great code base with actual users, along with great leadership. But Jeremy and Rachel can’t do it all. And there are already great contributors doing this. It seems rather informal at this point, though.

I wonder if there’s any benefit to having a dedicated space that tackles new papers (and maybe overlooked old ones) on an ongoing basis, with some of that work making it into the fast.ai codebase if it proves useful enough. To me this would be a great “continuing education” (a part 3, essentially), and would almost certainly benefit the community in many ways. I was planning on going out on my own to implement papers in isolation, with a blog to go along with it, after I finished parts 1 and 2, but it seems to me it’d be much more effective to have a dedicated effort here.

1 Like

@jsa169 the #deep-learning forum is a great place to have this continuing discussion. Or you can join the #fastai-dev effort, which naturally incorporates a lot of research etc. Happy to hear other specific suggestions, or feel free to try things yourself that you think might help.

1 Like