Lesson 5 In-Class Discussion


(Charin) #85

Ah, so basically we took the user and item embedding vectors, concatenated them with other variables, and put them through a neural network.
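
A minimal sketch of that idea (not the exact fastai implementation; `n_users`, `n_items`, `n_factors`, `n_extra`, and the layer sizes are placeholders):

```python
import torch
import torch.nn as nn

class CollabNet(nn.Module):
    """Embed users and items, concatenate (plus any extra variables), run through an MLP."""
    def __init__(self, n_users, n_items, n_factors=50, n_extra=0):
        super().__init__()
        self.u = nn.Embedding(n_users, n_factors)   # user embedding
        self.i = nn.Embedding(n_items, n_factors)   # item (movie) embedding
        self.layers = nn.Sequential(
            nn.Linear(2 * n_factors + n_extra, 70),  # concatenated input
            nn.ReLU(),
            nn.Linear(70, 1),                        # predicted rating
        )

    def forward(self, users, items, extra=None):
        x = torch.cat([self.u(users), self.i(items)], dim=1)
        if extra is not None:                        # other variables, e.g. genre, timestamp
            x = torch.cat([x, extra], dim=1)
        return self.layers(x)
```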


#86

“How many times are deprecating comments made toward the main character”

Personal preference: High
Average user preference: Neutral


(Pramod) #87

People have asked this question earlier and have built NN-based movie recommenders, but matrix factorization techniques are still used in production at places such as Netflix, Amazon, and Microsoft.
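
For context, classic matrix factorization predicts a rating as a dot product of learned user and item factors plus biases (standard notation, not from the lecture):

$$\hat{r}_{ui} = b + b_u + b_i + \mathbf{p}_u \cdot \mathbf{q}_i$$

The NN approach replaces the dot product with learned layers on top of the concatenated $\mathbf{p}_u$ and $\mathbf{q}_i$.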


(Vikrant Behal) #88

Jeremy and his Excel skills are exceptional :smiley:


(Ravi Sekar Vijayakumar) #89

Thanks!


(ecdrid) #90

Wonder if he can teach us that also…


(Ankit Goila) #91

Are there any good collab filtering datasets that we can work on/explore?


(Pierre Dueck) #92

Right. I wonder how NNs compare to the classical matrix factorization methods.


(yinterian) #93

The Netflix dataset.


(Pete Condon) #94

https://www.netflixprize.com/


(Travis) #95

The Amazon review dataset is an interesting one. You could even do some NLP on the written reviews to determine sentiment and add it in as a feature.

https://snap.stanford.edu/data/web-Amazon.html
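
A rough sketch of the sentiment-as-a-feature idea, assuming the reviews have been parsed into a pandas DataFrame with `reviewerID`, `asin`, `overall`, and `reviewText` columns (the JSON versions of this dataset are usually laid out that way; TextBlob is just one possible sentiment scorer, and the file name is a placeholder):

```python
import pandas as pd
from textblob import TextBlob

# assumed layout: one JSON object per line with reviewerID, asin,
# overall (star rating), and reviewText fields
reviews = pd.read_json('reviews.json', lines=True)

# polarity in [-1, 1] gives a cheap per-review sentiment score
reviews['sentiment'] = reviews['reviewText'].fillna('').map(
    lambda text: TextBlob(text).sentiment.polarity)

# keep the columns a collaborative filtering model would use,
# with sentiment tacked on as an extra continuous feature
ratings = reviews[['reviewerID', 'asin', 'overall', 'sentiment']]
```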


(Pramod) #96

Please also check MovieLens, Last.fm, and Jester.

PS: Jester is a joke recommendation dataset.


(Louis Guthmann) #97

In the case of NN collaborative filtering, how would you do the exploration phase?


(Pavel Surmenok) #98

Movies dataset https://www.kaggle.com/rounakbanik/the-movies-dataset


(Travis) #99

That’s a good question. I’d probably spend a little time getting familiar with the data itself (missing values, what the variables are, etc.), then try to wrangle it into a format very similar to today’s lecture. Then I’d try to follow along with Jeremy’s example (I’d use the fastai abstractions!)
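
For the getting-familiar part, even something this simple goes a long way (a sketch; the file name and column names depend on the dataset):

```python
import pandas as pd

ratings = pd.read_csv('ratings.csv')    # e.g. userId, movieId, rating, timestamp

# get familiar with the data first
print(ratings.head())
print(ratings.isnull().sum())           # missing values per column
print(ratings.describe())               # ranges, means, outliers
print(ratings['userId'].nunique(), ratings['movieId'].nunique())

# then wrangle it into the (user, item, rating) shape from the lecture
ratings = ratings[['userId', 'movieId', 'rating']].dropna()
```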

How would you approach it?


(ecdrid) #100

I prefer these ones (short and concise).


(Pavel Surmenok) #101

Regarding Jacobian and Hessian: I like how they are described in the Deep Learning book. See section 4.3.1 (page 84) here: http://www.deeplearningbook.org/contents/numerical.html
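
The short version (same definitions as that section): for $f : \mathbb{R}^n \to \mathbb{R}^m$, the Jacobian collects all the first partial derivatives, and for a scalar-valued $f$ the Hessian collects all the second ones:

$$J_{ij} = \frac{\partial f_i}{\partial x_j}, \qquad H_{ij} = \frac{\partial^2 f}{\partial x_i \, \partial x_j}$$

So the Hessian is the Jacobian of the gradient.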


(Pavel Surmenok) #102

Some good resources on backpropagation:

Backpropagation as a chain rule by Chris Olah: http://colah.github.io/posts/2015-08-Backprop/
Another explanation about the chain rule from Andrej Karpathy: http://cs231n.github.io/optimization-2/
Why you should understand backpropagation: https://medium.com/@karpathy/yes-you-should-understand-backprop-e2f06eab496b
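
The one-line idea all three build on is the chain rule applied through each intermediate value: if the loss $E$ depends on a weight $w$ only through an activation $a$, then

$$\frac{\partial E}{\partial w} = \frac{\partial E}{\partial a} \cdot \frac{\partial a}{\partial w}$$

and backpropagation is just applying this layer by layer, reusing the $\partial E / \partial a$ terms already computed for the later layers.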


(Ravi Sekar Vijayakumar) #103

The lr is provided by us. Can we control da in de/da, or does PyTorch figure that out?
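
To make the question concrete, this is the kind of thing I mean (made-up tensors, not course code; autograd is what computes de/da here):

```python
import torch

a = torch.randn(3, requires_grad=True)   # activations we want gradients for
e = (a ** 2).sum()                        # some error depending on a

e.backward()                              # autograd computes de/da
print(a.grad)                             # equals 2*a here; lr only enters when
                                          # we (or the optimizer) apply the update
```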


(Kerem Turgutlu) #104

Can someone please share an intuitive explanation of the Jacobian and Hessian matrices, and how they are calculated? Thanks
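
A tiny worked example (my own, not from the lecture) in case it helps: for $f(x, y) = x^2 y$,

$$\nabla f = \begin{pmatrix} 2xy \\ x^2 \end{pmatrix}, \qquad H = \begin{pmatrix} 2y & 2x \\ 2x & 0 \end{pmatrix}$$

i.e. the gradient stacks the first partials, and each Hessian entry is one more partial derivative of the gradient ($H_{ij} = \partial^2 f / \partial x_i \, \partial x_j$); the Jacobian is the same construction for a vector-valued function, one row per output component.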