As the title says, I’m extremely, extremely excited to interview our teacher and ML Hero!
I’ll be interviewing Jeremy on the Chai Time Data Science Podcast; the interview will be released in both video and audio.
Jeremy has very kindly agreed to take questions from this AMA, so please feel free to leave your questions as replies here. I’ll try my best to include them during the interview.
Note: I’ve listened to all (yes, every single one) of the interviews that Jeremy has done, so I’ll try to ask questions that weren’t asked in those.
Here are the themes/questions that I’m interested in asking about:
- Jeremy’s suggestions on how to approach the fast.ai material
- Deep Learning and startups
- Deep Learning and Medical Sciences.
- Spaced repetition applied to DL: how does Jeremy learn DL?
- Coding advice!
- Jeremy and Sylvain’s methods of collaboration and research, both on projects inside and outside of fastai
- How, and on what, does Jeremy spend his 12-hour learning days?
- The bike in Jeremy’s Twitter profile
I’m sharing these points so that everyone can frame questions around the themes above. However, please feel free to include any other questions here too.
Thanks in advance, and a huge thanks to Jeremy for agreeing to this!
Meanwhile, please feel free to check out the podcast:
Congrats Sanyam! I can’t wait to hear all about it. A few I’ve had:
- Thoughts on tabular NNs? Have they hit a ‘limit’? Any crazy theories/ideas to further improve them besides the word2vec ‘hybrid’ we have been using?
- Academia vs. self-taught: should you still go to graduate school, or are the fastai courses plus a few other ML courses out there enough? How is that outlook changing?
Excited to hear about it! Thank you for doing this, and thank you Jeremy for giving us your time (and everything!!!)
Would be great if you could ask any tips/tricks/suggestions on getting started doing independent research in this field.
Great to hear! It would be worth asking him about the challenges and best approaches for adapting deep learning models to healthcare workflows and achieving real-world clinical impact.
Yes, this is what I wanted to ask as well; please don’t forget to ask it.
Interested in that one, too.
Now that we have a fast.ai MOOC on NLP, when will Jeremy launch a course on computer vision?
Would be nice to hear about the status and future plans for the v2 library and SwiftAI, and also about the part 1 and part 2 courses next year. The last course, especially part 2, was quite different; will it follow the same format next year? Thanks!
This is great! Would love to hear his thoughts on what he has heard from industry about the need for broader engineering skills beyond a data scientist’s typical Python-based skillset, i.e. not only having the skills to train a model but also being able to put it into production…
There are already good questions to ask Jeremy. I have two more, please:
- Jeremy’s view on how to get better at programming, especially Python: how does a data scientist go from novice to expert in Python?
- Jeremy’s view on what other technologies a data scientist should work on besides the core data science libraries.
Awesome news! Some questions:
- How does the fastai team find these “obscure” papers like Leslie Smith’s learning rate finder (worth implementing)?
- The fastai library seems to be the answer to the fastai (company) mission of democratizing AI. So far the focus of the library has been core DL and applications, but the fastai (company) mission is also about ethics, bias, etc. Are there plans to incorporate tools that are often talked about but not readily available, like de-biasing, explainability, etc.?
- I struggle a lot with fastai naming, and I know the official position re: abbr.md, but I find the code difficult to read when, for example, `trn` is used instead of `train`. My question is: how do you balance conciseness with self-explanatory naming, or “don’t burn my brain to decipher what’s in a name”?
Thanks a lot for the library and being a true inspiration in so many ways.
Re: tabular NNs, this might be worth checking: https://github.com/Qwicen/node
I tried NODE for a bit but couldn’t quite get it to work better when I tried to integrate it into fastai. I could certainly revisit it soon.
Revisiting it tonight, let me see what I can do.
AFAIK, Jeremy uses spaced repetition only for learning foreign languages. He has mentioned that he learns DL and programming simply by using the new things he picks up.
A question from my side: what is the importance of having a mentor, and how do you find one?
Are there any plans to integrate working with videos into the library?
We’ve only started to see impressive pretrained models in the past two years or so; it’s still early days, but a good time to get a head start. I think there’s lots of room for fastai-style improvements with such models.
@init_27 When will you stop accepting questions? i.e. when will you be interviewing Jeremy?
This is excellent @init_27! I can only imagine how excited you must be! And big thanks to Jeremy for taking the time out.
If possible could you please ask these questions:
It seems like Jeremy has time for everything: Twitter, forums, dev! How does he manage his time? This is really the key question I want to ask. I remember him once saying that he doesn’t do any meetings, doesn’t watch TV, etc., and does what he likes; but specifically, does he follow a strict routine?
How should one prioritize what to learn? With new research papers coming out every day, and new technology and changes happening across vision, NLP, tabular, optimizers, etc., what, according to him, is the best way for us to move forward?
Is it important to get a firm grip on one thing first, for example mastering vision applications before moving on to NLP, or would he recommend working on different things during the day?
- Are there any resources that he recommends that we should look at daily?
Yes please! Very similar to my question on resources.
Perhaps the most important question ever!
This is fantastic! Thanks for the opportunity, @init_27
- We all know and appreciate Jeremy’s perspective on the DL ecosystem from a software point of view. What does he think of the changes happening on the hardware side, like custom NN chips, FPGAs, etc., for both training and inference?
- How does he feel about running inference on the edge as opposed to the cloud?
I’ve been wanting to know about the bike thing as well. Also cricket.
Anyway, there’s also one more thing.
One of the things that has really interested me about fast.ai is the way they went about constructing the library, then doing it again for version 2.0, as well as BaseMath, the numeric programming library Jeremy wrote in Swift. I’ve been approaching the notebooks, but I’ve been going about them linearly, and I’m really interested in how to build a library from scratch. How would he suggest going through the fastai material, in particular the walk-thru videos? It’s something I’ve been wanting to know and would love to hear his thoughts on.