Hi all! I have a doubt that may sound silly, but I have been struggling with it for well over half an hour. When training the character predictor, while processing the data, you do:
x1 = np.stack(c1_dat[:-2])
x2 = np.stack(c2_dat[:-2])
x3 = np.stack(c3_dat[:-2])
Why do you use -2? I have also seen it in the char-rnn notebook:
sentences = np.concatenate([[np.array(o)] for o in sentences[:-2]])
next_chars = np.concatenate([[np.array(o)] for o in next_chars[:-2]])
but I cannot seem to find the answer. Thanks a lot for your amazing course!
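For reference, here is a minimal sketch of how I understand the three-character setup (this is my own reconstruction, not copied from the notebook; the variable `idx` stands in for the encoded text, and the exact `range` bounds in the lesson may differ):

```python
import numpy as np

# Stand-in for the text encoded as character indices.
idx = list(range(20))

cs = 3  # sequence length: predict the 4th char from the previous 3
c1_dat = [idx[i]     for i in range(0, len(idx) - cs, cs)]
c2_dat = [idx[i + 1] for i in range(0, len(idx) - cs, cs)]
c3_dat = [idx[i + 2] for i in range(0, len(idx) - cs, cs)]
c4_dat = [idx[i + 3] for i in range(0, len(idx) - cs, cs)]  # targets

# My guess at the [:-2]: it trims the tail so none of the offset lists
# runs past the end of the text and all four stay the same length.
x1 = np.stack(c1_dat[:-2])
x2 = np.stack(c2_dat[:-2])
x3 = np.stack(c3_dat[:-2])
y  = np.stack(c4_dat[:-2])

print(x1.shape, y.shape)
```

With this toy `idx` the offsets line up as 0/1/2 for the inputs and 3 for the target, which at least shows why the last couple of entries are the risky ones.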
Hi, I'm working through this notebook, specifically the "Pure Python RNN!" section. I'm having trouble understanding the forward pass function. It looks like this:
def one_fwd(n):
    return scan(one_char, (0, 0, 0, np.zeros(n_hidden), 0), get_chars(n))
Specifically, the confusing part is the (0, 0, 0, np.zeros(n_hidden), 0). I looked at Theano's docs, and this appears to be the "sequences" parameter, but I'm not sure that's correct. The Theano docs say "sequences is the list of Theano variables or dictionaries describing the sequences". Those aren't Theano variables, though, so the tuple must represent something else.
So what are those values doing, and why do we have three leading zeros and one trailing zero around a vector of 256 zeros (n_hidden)? What's going on here?
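My current guess, sketched below: this section defines its own pure-Python scan (so Theano's docs don't apply), and the second argument is the initial carried state, not "sequences". The scan here is written from memory and may differ in detail from the notebook's version:

```python
# A minimal pure-Python scan (my reconstruction, not the notebook's
# exact code): apply fn over seq, feeding each result back in as the
# carried state. The second argument is the *initial* state.
def scan(fn, start, seq):
    res = []
    prev = start
    for s in seq:
        prev = fn(prev, s)
        res.append(prev)
    return res

# Tiny demo: carry a running sum as the state, starting from 0.
out = scan(lambda prev, s: prev + s, 0, [1, 2, 3])
print(out)
```

On that reading, (0, 0, 0, np.zeros(n_hidden), 0) would just be the starting value of whatever tuple of state one_char threads through each step, with np.zeros(n_hidden) being the initial hidden activations.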
Hi, could anyone explain the meaning of "kernel" in this lesson? At 46:38 Jeremy says "kernel" while showing a plot on screen.
And why is a piece of code also called a kernel? I'm confused.
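To make the question concrete, here is my own toy illustration (not from the lesson) of the first sense: in a CNN, a "kernel" is the small matrix of weights slid over the image, multiplying and summing at each position:

```python
import numpy as np

# Naive 2D "valid" convolution: slide the kernel over the image and
# take an elementwise product-and-sum at each position.
def conv2d(img, kernel):
    kh, kw = kernel.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# A classic top-edge-detection kernel.
top_edge = np.array([[-1, -1, -1],
                     [ 0,  0,  0],
                     [ 1,  1,  1]])

img = np.zeros((5, 5))
img[2:] = 1.0  # bottom half bright: there is a horizontal edge
print(conv2d(img, top_edge))
```

The output lights up where the edge is. Whether the "piece of code called a kernel" in the lesson is this same weight matrix or something else (e.g. a Kaggle kernel), I'm not sure.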