End of part 1
Thank you for pulling off an amazing class under historic circumstances! Thank you, Rachel, Sylvain, and Jeremy! Thank you, Thank you, Thank you!! And thank you for making us wear masks!
Hanson’s paper seems to be about adding stochastic noise to improve convergence.
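In case it helps make the idea concrete, here is a minimal, hedged sketch of one common form of that idea: adding Gaussian noise to the gradients before the optimizer step. The noise scale `sigma` and the toy model are illustrative assumptions, not taken from Hanson's paper.

```python
import torch

torch.manual_seed(0)
model = torch.nn.Linear(4, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(16, 4), torch.randn(16, 1)

loss = torch.nn.functional.mse_loss(model(x), y)
loss.backward()

# Illustrative noise scale; in practice this is often annealed over training.
sigma = 0.01
with torch.no_grad():
    for p in model.parameters():
        # Perturb each gradient with zero-mean Gaussian noise
        p.grad += sigma * torch.randn_like(p.grad)
opt.step()
```

The intuition is that the perturbation can help the optimizer escape sharp minima and saddle points.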
You'd have to change the dimensionality of the CNN to a 1D CNN: https://pytorch.org/docs/stable/nn.html. 1D CNNs capture the order of the text as well. I have seen them used as a way of preprocessing.
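A minimal sketch of what that looks like with `nn.Conv1d` over token embeddings; the batch size, sequence length, and embedding dimension here are illustrative assumptions:

```python
import torch
import torch.nn as nn

batch, seq_len, emb_dim = 8, 20, 64
x = torch.randn(batch, seq_len, emb_dim)  # token embeddings

# nn.Conv1d expects (batch, channels, length), so the embedding
# dimension becomes the channel axis.
conv = nn.Conv1d(in_channels=emb_dim, out_channels=32,
                 kernel_size=3, padding=1)
out = conv(x.transpose(1, 2))
print(out.shape)  # torch.Size([8, 32, 20])
```

With `kernel_size=3` and `padding=1` the sequence length is preserved, so each output position summarizes a 3-token window.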
Are there pointers to easier research papers for DL beginners to implement?
I’ve also created a Wiki: 2020 ML Interviews Resources & Advice(s), please contribute!
Definitely suggest Kaggle competitions. They can have data separated in other ways besides a random split.
Good to get in project groups as well.
Will a certificate be released, or something to remember this great journey by, signed by Jeremy?
Applying CNNs to NLP is apparently a thing, at least before transformers!
End of the class
Thanks for a great course amid this pandemic!
I learned a lot even though I went through last year’s course.
I hope Part 2 is coming up soon!
Mikaela from the USF Data Institute will distribute certificates to those registered for the course (although unfortunately they are not signed).
Especially the big “Browse State of the Art” button.
I got the book. Do you think it will be possible to have your autographs (all three of you) on it in the future? Thank you so much, amazing journey!
Thank you so much for the best weeks of Deep Learning even under these stressful times!
Thank you Jeremy, Rachel & Sylvain for all your untiring efforts. This will be a memorable experience.
Thank you Jeremy, Rachel and Sylvain. This has been an amazing journey. I'm sad it's finished but excited about everything you taught us; it was amazing to be part of it and to build things thanks to your teachings. Very grateful for being part of the course.
Thank you for a great course!! Wonderful learning experience!! Looking forward to meeting you again in Part II!!