Lesson 6 In-Class Discussion

It may be easier to try a bit from scratch.

Look at an example here in which you use the library’s fit function but specify everything else yourself.
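To make that concrete, here is a minimal sketch of the pattern: define the model and optimizer yourself in plain PyTorch and only hand the training off to the library. The fit(model, data, n_epochs, opt, crit) call and the md ModelData object are assumptions based on the course library, not something specified in this thread, and the layer sizes are made up.

```python
import torch.nn as nn
import torch.nn.functional as F
from torch import optim
# from fastai.model import fit  # assumed import for the course library's fit function

class TinyCharModel(nn.Module):
    """Everything about the model is specified by hand; only fit() comes from the library."""
    def __init__(self, vocab_size, n_fac, n_hidden):
        super().__init__()
        self.e = nn.Embedding(vocab_size, n_fac)       # character embeddings
        self.l_in = nn.Linear(n_fac, n_hidden)         # embedding -> hidden
        self.l_out = nn.Linear(n_hidden, vocab_size)   # hidden -> next-character scores

    def forward(self, c):
        h = F.relu(self.l_in(self.e(c)))
        return F.log_softmax(self.l_out(h), dim=-1)

m = TinyCharModel(vocab_size=85, n_fac=42, n_hidden=256)
opt = optim.Adam(m.parameters(), 1e-2)
# fit(m, md, 1, opt, F.nll_loss)  # md: a ModelData object wrapping the dataloaders (assumed)
```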


Can we get a link for the blog post he’s talking about now?


@jeremy I’m hoping you really did go to Nepal in 2009. \o// :sunny:


The notation reminds me of The Deathly Hallows


Why not concatenate them and have a bigger FC2?
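For reference, the “concatenate and use a bigger FC2” idea in the question would look roughly like this; the ConcatThenFC2 class and its sizes are illustrative, not the lesson’s actual model.

```python
import torch
import torch.nn as nn

class ConcatThenFC2(nn.Module):
    """Sketch: concatenate two hidden activations instead of summing them,
    so FC2 has to be twice as wide. Sizes are illustrative only."""
    def __init__(self, n_hidden=256, n_out=85):
        super().__init__()
        self.fc2 = nn.Linear(2 * n_hidden, n_out)    # bigger FC2: takes the concatenation

    def forward(self, h1, h2):                       # h1, h2: (batch, n_hidden)
        return self.fc2(torch.cat([h1, h2], dim=-1))
```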

Is there any advantage to predicting a character instead of a word? It seems like that would always be less accurate.

Which one? The one by Andrej? http://karpathy.github.io/2015/05/21/rnn-effectiveness/

That looks like it. Thank you <3


We are covering RNNs for text, but could you also use them to capture temporal features from an image sequence (i.e. video)?


Absolutely (an LSTM is a variant of an RNN):
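One common pattern (a sketch in plain PyTorch, not something shown in the lesson) is to run a small CNN over each frame and feed the per-frame features to an LSTM; the VideoRNN class and all sizes below are made up for illustration.

```python
import torch
import torch.nn as nn

class VideoRNN(nn.Module):
    """Sketch: per-frame CNN features fed to an LSTM to capture temporal structure."""
    def __init__(self, n_feat=64, n_hidden=256, n_classes=10):
        super().__init__()
        self.conv = nn.Sequential(                   # tiny per-frame feature extractor
            nn.Conv2d(3, n_feat, 3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1))                 # -> (batch*time, n_feat, 1, 1)
        self.rnn = nn.LSTM(n_feat, n_hidden, batch_first=True)
        self.out = nn.Linear(n_hidden, n_classes)

    def forward(self, x):                            # x: (batch, time, 3, H, W)
        b, t = x.shape[:2]
        feats = self.conv(x.view(b * t, *x.shape[2:])).view(b, t, -1)
        _, (h, _) = self.rnn(feats)                  # h[-1]: final hidden state per clip
        return self.out(h[-1])

# Example: a batch of 2 clips, 8 frames each, of 3x32x32 frames.
video = torch.randn(2, 8, 3, 32, 32)
print(VideoRNN()(video).shape)                       # torch.Size([2, 10])
```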




In cell 13, why do we use x = np.stack(c1_data[:-2]) and not np.stack(c1_data)?

@pete.condon Awesome, thanks for the link! Would be cool to implement something like this in pytorch/fastai :slight_smile:


I love Andrej, he’s the best; I just met him recently. =) I totally fangirled him.


Andrej = Best drej


For a start, it avoids out-of-vocabulary issues, and it gives a much more robust model for non-classical NLP problems such as the LaTeX generation one.
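A tiny sketch of the out-of-vocabulary point (the corpus and vocabularies below are made up): a character vocabulary is small and essentially closed, so a word never seen in training still maps onto known tokens, whereas a word vocabulary has to fall back to an unknown token.

```python
# Toy corpus; the vocabularies here are made up for illustration.
corpus = "the model fits the data"
word_vocab = set(corpus.split())
char_vocab = set(corpus)

new_text = "the detailed model fits"
# Word level: "detailed" was never seen, so it has to become an unknown token.
print([w if w in word_vocab else "<unk>" for w in new_text.split()])
# -> ['the', '<unk>', 'model', 'fits']
# Character level: every character of the new word is already in the (tiny) vocabulary.
print(all(c in char_vocab for c in new_text))
# -> True
```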


My bad, I replied on the wrong thread; I meant to send it to @jenna. Heh.

@yinterian are for loops on the GPU a good idea?

What’s the link for Jeremy’s presentation?
@yinterian (sorry)

Where is F being defined? I’m assuming it’s the F in F.softmax, but where does that come from?

import torch.nn.functional as F
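For context, F is just the conventional alias for torch.nn.functional, and softmax is one of the functions it exposes. A small usage sketch:

```python
import torch
import torch.nn.functional as F

logits = torch.randn(2, 5)          # raw scores for 2 samples over 5 classes
probs = F.softmax(logits, dim=-1)   # softmax lives in torch.nn.functional
print(probs.sum(dim=-1))            # each row sums to 1
```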
