Lesson 6 discussion


Hi all! I have a doubt that may sound silly, but I have been struggling with it for way more than half an hour. When training the character predictor, while processing the data,

x1 = np.stack(c1_dat[:-2])
x2 = np.stack(c2_dat[:-2])
x3 = np.stack(c3_dat[:-2])

why do you use -2? I have also seen it in the char-rnn:

sentences = np.concatenate([[np.array(o)] for o in sentences[:-2]])
next_chars = np.concatenate([[np.array(o)] for o in next_chars[:-2]])

but I cannot seem to find the answer. Thanks a lot for your amazing course!
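Not an answer to the *why*, but in case the mechanics are part of the confusion: [:-2] is plain Python slicing that drops the last two elements of each list before stacking, so all the stacked arrays end up the same length. A toy sketch with made-up stand-in lists (c1_dat etc. here are hypothetical, not the course's actual data):

```python
import numpy as np

# Hypothetical stand-ins: character indices taken at offsets 0, 1, 2
# with a stride of 3, as in the lesson's 3-char model.
c1_dat = [0, 3, 6, 9]
c2_dat = [1, 4, 7, 10]
c3_dat = [2, 5, 8]

# [:-2] drops the last two elements of a list.
x1 = np.stack(c1_dat[:-2])  # array([0, 3])
x2 = np.stack(c2_dat[:-2])  # array([1, 4])
print(x1, x2, len(x1) == len(x2))
```

So whatever the exact reason in the notebook, the effect is that the trailing elements (which would otherwise leave the lists misaligned or reference characters past the end of the text) are discarded.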

Hi, I’m working through this notebook, specifically the Pure Python RNN! section. I’m having trouble understanding the forward pass function. It looks like this:

def one_fwd(n):
    return scan(one_char, (0, 0, 0, np.zeros(n_hidden), 0), get_chars(n))

Specifically, the confusing part is the (0, 0, 0, np.zeros(n_hidden), 0). I looked at Theano's docs, and this appears to be the "sequences" parameter, but I'm not sure that's correct. The Theano docs say "sequences is the list of Theano variables or dictionaries describing the sequences". Those items aren't Theano variables, so the tuple must represent something else.
So what are those things doing? Why do we have three leading zeros and one trailing zero around np.zeros(n_hidden) (a vector of 256 zeros)? What's going on here?
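One possibility (I may be wrong): since this is the "Pure Python RNN" section, that scan may be a small helper defined earlier in the notebook rather than theano.scan, in which case the tuple would be the *initial state* handed to the step function on the first call, not a "sequences" parameter at all. A sketch of what such a pure-Python scan typically looks like (my reconstruction, not necessarily the notebook's exact code):

```python
def scan(fn, start, seq):
    """Fold fn over seq, threading the previous result through.

    start is the initial state: it is passed to fn on the very first
    step, then each step's output becomes the next step's input.
    """
    res = []
    prev = start
    for s in seq:
        prev = fn(prev, s)
        res.append(prev)
    return res

# Example: a running sum over [1, 2, 3] starting from state 0.
print(scan(lambda prev, x: prev + x, 0, [1, 2, 3]))  # [1, 3, 6]
```

Under that reading, the zeros would just be initial values for whatever the step function returns each iteration, with np.zeros(n_hidden) being the initial hidden state.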

Hi, could anyone explain the meaning of "kernel" in this lesson? At 46:38 Jeremy says "kernel" while showing a plot on screen.
And why is a piece of code also called a kernel? I'm confused. :face_with_raised_eyebrow:
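Not an authoritative answer, but in the CNN context a "kernel" (also called a filter) usually means the small matrix of weights that gets slid across the image, so the plot is most likely a visualization of such a matrix. A toy sketch with a hypothetical 3x3 edge-detecting kernel and a hand-rolled sliding window (for illustration only; real code would use a library convolution):

```python
import numpy as np

def slide_kernel(img, kernel):
    """Slide a small weight matrix over an image ("valid" positions only),
    taking an elementwise product and sum at each position."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (img[i:i + kh, j:j + kw] * kernel).sum()
    return out

img = np.arange(16.0).reshape(4, 4)   # tiny fake "image"
top_edge = np.array([[ 1,  1,  1],    # a kernel that responds strongly
                     [ 0,  0,  0],    # where bright rows sit above
                     [-1, -1, -1]])   # dark rows, i.e. horizontal edges
print(slide_kernel(img, top_edge).shape)  # (2, 2)
```

As for the other sense of the word: a "kernel" as a piece of code (e.g. a CUDA kernel, or a Jupyter kernel) is an unrelated use of the same term; in the GPU case it means a function launched to run once per output element. Same word, different concept.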