Lesson 6 In-Class Discussion

(yinterian) #1

This thread contains the in-class discussion from Lesson 6. The wiki links have been moved to a separate thread; please ask any questions about Lesson 6 in that wiki thread.

(Vikrant Behal) #2

Waiting for the link to lecture 6.

(Ibrahim El-Fayoumi) #3

Me too… it's about RNNs today, an interesting topic.

(Vitaly Bushaev) #5

stream working great!

(Parthasarathy Mohan) #6

Can you republish the link please?

(James Dietle) #7

This is better than NIPS!

(yinterian) #8

It is at the top of the topic.

(Eric Perbos-Brinck) #9

The sound cuts out every 20 seconds or so, for about a second.

(Nafiz Hamid) #10

Is this the last lecture?

(Pramod) #11

It's getting better.

(Vikrant Behal) #13

I love the jokes Jeremy cracks! :smiley:

(Kevin Bird) #14

Why do we have to have the extra bias column instead of just adding another embedding dimension?

What's the difference between having length-51 vectors and having a length-50 embedding plus a separate length-1 bias?

(Vitaly Bushaev) #15

Jeremy asked to be reminded to talk more about how Dropout behaves during training and testing in PyTorch.
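In PyTorch, `model.train()` and `model.eval()` are what toggle dropout on and off. As a minimal NumPy sketch of the idea (the function name and signature here are illustrative, not the fastai or PyTorch API), "inverted" dropout rescales the surviving activations at train time so that evaluation can be a plain pass-through:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each activation with probability p at
    train time and rescale survivors by 1/(1-p), so that at test time
    the layer can simply return x unchanged."""
    if not training or p == 0.0:
        return x  # evaluation mode: identity, no rescaling needed
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(x.shape) >= p  # keep each unit with prob 1-p
    return x * mask / (1.0 - p)
```

Because of the 1/(1-p) rescaling, the expected value of each activation is the same in training and evaluation mode, which is why switching a PyTorch model to `eval()` needs no extra adjustment.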

(ecdrid) #17

Don't the embedding matrix sizes need to match, or is broadcasting used?

EmbeddingDotBias (
  (u): Embedding(671, 50)
  (i): Embedding(9066, 50)
  (ub): Embedding(671, 1)
  (ib): Embedding(9066, 1)
)

(Vikrant Behal) #18

I guess fastai still doesn’t support CPU?

(Pete Condon) #19

Do you mean that the number of users and the number of items (movies) differ, or that the number of factors differs between the user/item embeddings and the bias embeddings?

The model runs on a single user/movie pair, and uses dense layers to map the embedding and bias values into a single prediction.
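As a rough NumPy sketch of what a model like `EmbeddingDotBias` computes (an assumption based on the printed layer sizes, not fastai's actual code): each user/movie pair looks up one row from each table, so the 671 and 9066 dimensions never interact directly and no broadcasting between them is needed — only the 50-factor dimension has to match.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_movies, n_factors = 671, 9066, 50

# factor embeddings plus width-1 bias "embeddings", matching the printout
u  = rng.normal(size=(n_users, n_factors))   # Embedding(671, 50)
i  = rng.normal(size=(n_movies, n_factors))  # Embedding(9066, 50)
ub = rng.normal(size=(n_users, 1))           # Embedding(671, 1)
ib = rng.normal(size=(n_movies, 1))          # Embedding(9066, 1)

def predict(user_ids, movie_ids):
    """Dot product over the 50 factors, plus both scalar biases."""
    dot = (u[user_ids] * i[movie_ids]).sum(axis=1)
    return dot + ub[user_ids, 0] + ib[movie_ids, 0]
```

Each lookup produces a length-50 row per pair, so the elementwise product and sum are always between same-shaped vectors.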

(Kevin Bird) #20

It checks out.
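The equivalence being checked can be shown with a tiny NumPy example (illustrative numbers only): folding a movie bias into a 51st column, with a constant 1 appended to the user vector, gives exactly "length-50 embedding plus bias".

```python
import numpy as np

rng = np.random.default_rng(1)
user, movie = rng.normal(size=50), rng.normal(size=50)
movie_bias = 0.7  # illustrative bias value

# length-50 embeddings with a separate bias term
with_bias = user @ movie + movie_bias

# length-51 vectors: constant 1 on the user side, bias on the movie side
as_extra_column = np.append(user, 1.0) @ np.append(movie, movie_bias)

assert np.isclose(with_bias, as_extra_column)
```

Swapping the roles absorbs the user bias the same way; the practical difference is that the extra-column version requires pinning one "embedding" entry to a constant 1 that the optimizer would otherwise be free to change.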

(ecdrid) #21