Note: This is a wiki post - feel free to edit to add links from the lesson or other useful info.
Links from lesson
- fill me with interesting stuff
Other useful links
My timezone is somewhat incompatible with the live stream, so I hope it’s OK to post a question here in advance.
Since we learned about tabular data last time and NLP in this lesson, I’d like to ask how to combine the two. If I had a table containing continuous/categorical features as well as a text column, could I create a fastai model that uses both types as combined input? Ideally I’d like to use a pretrained language model for the text part.
This post previously listed chapters 13 and 14. Will those not be discussed in this lesson?
IIRC, last year those topics were also covered in the final class, so I was wondering if it would be the same case today.
Also, could you please convert this into a wiki so we could add resources?
No, they won’t. This lesson is NLP only, with a deep dive into RNNs and LSTMs at the end.
And I converted the top post to a wiki.
Sad to know this is the last class. This has been the highlight of my quarantine. Thank you all for the hard work, dedication, and awesome content.
You absolutely can! One way to do so is to use the so-called “Siamese network” architecture.
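To make the idea concrete, here is a minimal sketch (not fastai’s actual implementation) of one common pattern for mixed tabular + text input: encode the text, embed the categorical columns, and concatenate everything with the continuous features before a classification head. The `MeanPoolEncoder` below is a hypothetical stand-in for a real pretrained text encoder such as a fine-tuned language model’s encoder; all names and dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn

class TabularTextModel(nn.Module):
    """Combine a text encoder with continuous and categorical tabular features.

    `text_encoder` stands in for any pretrained encoder (e.g. the encoder
    saved from a fine-tuned language model); here it just needs to map a
    batch of token ids to a (batch, text_dim) feature tensor.
    """
    def __init__(self, text_encoder, text_dim, n_cont, cat_sizes,
                 emb_dim=8, n_classes=2):
        super().__init__()
        self.text_encoder = text_encoder
        # One embedding table per categorical column.
        self.embeds = nn.ModuleList([nn.Embedding(sz, emb_dim)
                                     for sz in cat_sizes])
        combined = text_dim + n_cont + emb_dim * len(cat_sizes)
        self.head = nn.Sequential(
            nn.Linear(combined, 64), nn.ReLU(), nn.Linear(64, n_classes)
        )

    def forward(self, text, cont, cats):
        t = self.text_encoder(text)                       # (batch, text_dim)
        e = [emb(cats[:, i]) for i, emb in enumerate(self.embeds)]
        x = torch.cat([t, cont] + e, dim=1)               # concatenate all features
        return self.head(x)

# Dummy stand-in for a pretrained text encoder: mean-pooled token embeddings.
class MeanPoolEncoder(nn.Module):
    def __init__(self, vocab_size=100, dim=16):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, dim)
    def forward(self, tokens):
        return self.emb(tokens).mean(dim=1)

model = TabularTextModel(MeanPoolEncoder(dim=16), text_dim=16,
                         n_cont=3, cat_sizes=[5, 7])
text = torch.randint(0, 100, (4, 10))    # batch of token ids
cont = torch.randn(4, 3)                 # continuous features
cats = torch.stack([torch.randint(0, 5, (4,)),
                    torch.randint(0, 7, (4,))], dim=1)
out = model(text, cont, cats)
print(out.shape)  # torch.Size([4, 2])
```

In a Siamese-style setup the two "branches" (text encoder and tabular part) are trained jointly, and you can freeze the pretrained text branch at first, just as with other transfer learning.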
Are there any resources here on the forum for interview preparation for ML/AI engineer positions? It would be great to get some insight into the expectations for these kinds of roles.
Hi all, are you guys going to offer Part 2 of this course?
Given that there was a lecture on CNNs and ResNets in the previous Part 1 (2019), could an extra lecture on those topics be recorded?
Jeremy mentioned that he is planning to make a part two. Will it be online/open like this course version?
fast.ai alum Sravya Tirukkovalur @sravya8 put together this helpful slide show with info after her interview process:
Answered in the video: yes, but the details (time/location) are unknown at this point.
Jeremy announced at the beginning of class that yes, we intend to do a part 2, most likely covering the rest of the book (and potentially other topics that become relevant in the meantime), although we don’t know any specifics of when this will be. I’m sure we will continue in our commitment to making the material freely available online.
I wanted to see if anyone had interview resources (sites, books, etc.) as well. I’m currently applying to jobs and find that studying for the ML portion is my bottleneck.
The only one I know of is:
@Raymond-Wu I’ll create a wiki after the lecture so that we can community-source it.
@jeremy, for transfer learning to be useful, how well does the pretrained model need to do its task? For example, can you classify sentiment well if you started with a poor language model? Or is time well spent perfecting the language model first?
Yes, it is important to have a very good language model first. In fact, all research seems to show that the better your language model, the better classifier you will get at the end.
(In the same way, the better the model you have pretrained on ImageNet for computer vision, the better your final model on another task will be.)
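The mechanics behind this are the two-stage workflow: train (or fine-tune) the language model first, then reuse its encoder weights as the starting point for the classifier. The toy PyTorch sketch below illustrates only that weight hand-off (in fastai this corresponds to saving and loading the encoder between the two learners); the dimensions and the untrained modules are illustrative assumptions, not a real training run.

```python
import torch
import torch.nn as nn

vocab, dim, hidden = 50, 16, 32

# Stage 1: the "language model" = embedding + recurrent encoder
# (plus a next-token decoder, omitted here). The better this model
# is trained, the better the features the classifier inherits.
lm_emb = nn.Embedding(vocab, dim)
lm_encoder = nn.LSTM(dim, hidden, batch_first=True)
# ... imagine language-model training happens here ...

# Stage 2: the classifier reuses the *same* embedding and encoder
# weights, and only the small classification head starts from scratch.
clf_emb = nn.Embedding(vocab, dim)
clf_encoder = nn.LSTM(dim, hidden, batch_first=True)
clf_emb.load_state_dict(lm_emb.state_dict())        # transfer weights
clf_encoder.load_state_dict(lm_encoder.state_dict())
head = nn.Linear(hidden, 2)                         # e.g. pos/neg sentiment

tokens = torch.randint(0, vocab, (4, 12))           # batch of token ids
_, (h, _) = clf_encoder(clf_emb(tokens))
logits = head(h[-1])                                # use final hidden state
print(logits.shape)  # torch.Size([4, 2])

# The encoder weights really were transferred from the language model:
assert torch.equal(clf_emb.weight, lm_emb.weight)
```

A weak language model hands over weak features, which is why time spent improving the language model pays off in the downstream classifier.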
Tag me in it once you do please!