Lesson 6 In-Class Discussion

This thread contains the in-class discussion from Lesson 6. The wiki links have been moved to the new wiki thread; please ask any questions about Lesson 6 there.

15 Likes

Waiting for the link to lecture 6.

2 Likes

Me too… it's about RNNs today, an interesting topic.

2 Likes

stream working great!

Can you republish the link, please?

1 Like

This is better than NIPS!

2 Likes

It is at the top of the topic.

The sound cuts out for a second every 20 seconds or so.

Is this the last lecture?

It's getting better.

penultimate

1 Like

I love the jokes Jeremy cracks! :smiley:

1 Like

Why do we need the extra column instead of just adding another embedding?

What’s the difference between having length-51 vectors and having a length-50 embedding with one extra value alongside it?

Jeremy asked us to remind him to talk more about using Dropout in training and testing in PyTorch.

4 Likes
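
For reference, here's a minimal sketch (not Jeremy's code) of how a plain `nn.Dropout` layer behaves differently in training and evaluation mode in PyTorch:

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)
x = torch.ones(8)

drop.train()    # training mode: elements are zeroed at random,
print(drop(x))  # and survivors are scaled by 1/(1-p) to keep the expected value

drop.eval()     # evaluation mode: dropout is a no-op,
print(drop(x))  # the input passes through unchanged
```

Calling `model.train()` / `model.eval()` on a network switches all of its dropout (and batchnorm) layers between these two modes.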

Thanks!

Do the embedding matrix sizes match up properly, or is broadcasting used?

EmbeddingDotBias (
  (u): Embedding(671, 50)
  (i): Embedding(9066, 50)
  (ub): Embedding(671, 1)
  (ib): Embedding(9066, 1)
)

I guess fastai still doesn’t support CPU?

Do you mean that the number of users and the number of items (movies) are different, or that the user/item embeddings and the bias embeddings have a different number of factors?

The model runs on a single user/movie pair and uses dense layers to combine the embedding and bias values into a single prediction.

1 Like
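
For illustration, here's a minimal sketch of a dot-product-plus-bias model matching the printout above (the layer names and sizes come from the printout; the forward pass is my guess at the idea, not fastai's exact code):

```python
import torch
import torch.nn as nn

class EmbeddingDotBias(nn.Module):
    """Dot-product collaborative filtering with per-user and per-item bias terms."""
    def __init__(self, n_users=671, n_items=9066, n_factors=50):
        super().__init__()
        self.u  = nn.Embedding(n_users, n_factors)  # user factors
        self.i  = nn.Embedding(n_items, n_factors)  # item (movie) factors
        self.ub = nn.Embedding(n_users, 1)          # user bias
        self.ib = nn.Embedding(n_items, 1)          # item bias

    def forward(self, users, items):
        # Element-wise product of the matching user/item factor rows,
        # summed over the factor dimension, plus the two bias terms.
        dot = (self.u(users) * self.i(items)).sum(dim=1)
        return dot + self.ub(users).squeeze(1) + self.ib(items).squeeze(1)

# Usage: one prediction per (user, item) index pair in the batch.
model = EmbeddingDotBias()
preds = model(torch.tensor([0, 1]), torch.tensor([10, 20]))
```

The (671, 1) and (9066, 1) bias embeddings each contribute a single scalar per row, which is simply added to the dot product, so no shape mismatch arises.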

It checks out.

10 Likes