Nietzsche notes! lesson 6

Hi all, I wanted to share my lesson 6 notes. As @timlee mentioned, there's a lot to cover in this lesson. I hope they'll be of benefit to some. :grin:

lesson6-rnn_notes.pdf (1.4 MB)


Very nice! Maybe add a link here from the wiki thread?


Done! :grinning:

It was cool… Thanks @amritv for your effort!


If you don’t mind an ignorant question, how do you make these? Do you take notes during the live feed? Do you re-watch the video and make notes? Do you make your own notes like this? Is there a special/specific app you use for this type of notes?

@bhollan, I make the notes after re-watching the videos. Making these notes helps me better understand the code, and I also use this process for other stuff. No special apps: I use Acrobat Pro, save the .ipynb as HTML, and then use Acrobat Pro to save it as a .pdf. Acrobat allows the use of a pen or finger to annotate the notes. Hope that helps.
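For the notebook-to-HTML step, a one-liner with `nbconvert` (which ships with Jupyter) does the job; the filename here is just a placeholder for whatever notebook you're exporting:

```shell
# Convert a notebook (placeholder name) to standalone HTML,
# which Acrobat Pro can then open and save as a PDF.
jupyter nbconvert --to html lesson6-rnn.ipynb
```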


Yup! Thanks!

You might want to update your notes for the multi-output model loss function evaluator. At 2:03:20 Jeremy says that nh is 256. However, at 2:05:20 he says that nh is 84. I checked, and the correct size is 84 (which makes sense, since you’re evaluating the loss on your prediction of 1 of 84 classes).


@DavidBressler thanks for the info. I'll update this as soon as I can. :+1:


My notes, if someone is interested in reading the same thing again, a little bit differently.

Hey, I have a question on the part where we are creating the input.

c1_dat = [idx[i] for i in range(0, len(idx)-1-cs, cs)]
c2_dat = [idx[i+1] for i in range(0, len(idx)-1-cs, cs)]
c3_dat = [idx[i+2] for i in range(0, len(idx)-1-cs, cs)]
c4_dat = [idx[i+3] for i in range(0, len(idx)-1-cs, cs)]

When we specify the range, why do we need to subtract cs from len(idx)?
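A small self-contained sketch of the indexing may make the bound visible (cs = 4 and a 10-element idx are toy values chosen for illustration, not the lesson's actual data): for each starting index i we read offsets up to i + 3, and the character at i + cs is also needed as the next-character label, so the last valid start must keep i + cs within the list.

```python
cs = 4                    # toy sequence length for illustration
idx = list(range(10))     # toy stand-in for the encoded text

# Stop starts early enough that idx[i + 3] and the label idx[i + cs]
# both stay in bounds: i must satisfy i + cs <= len(idx) - 1.
starts = list(range(0, len(idx) - 1 - cs, cs))

c1_dat = [idx[i]     for i in starts]
c2_dat = [idx[i + 1] for i in starts]
c3_dat = [idx[i + 2] for i in starts]
c4_dat = [idx[i + 3] for i in starts]

print(starts)   # [0, 4] -- without "- cs" the range would include 8,
                # and idx[8 + 3] would raise an IndexError
```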