Lesson 6 In-Class Discussion


(yinterian) #62

It may be easier to try a bit from scratch.

Take a look at an example here in which you get to use the fit function from the library but specify everything else yourself.
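Roughly, the pattern looks like this (a minimal sketch of the idea, not the exact notebook code; the call at the end assumes the old fastai-0.7-style fit(model, data, epochs, opt, crit) signature, and md stands for a ModelData wrapper around your DataLoaders):

import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim

# You define the architecture and optimizer yourself...
class CharModel(nn.Module):
    def __init__(self, vocab_size, n_fac, n_hidden):
        super().__init__()
        self.e = nn.Embedding(vocab_size, n_fac)
        self.l_in = nn.Linear(n_fac, n_hidden)
        self.l_out = nn.Linear(n_hidden, vocab_size)
    def forward(self, c):
        h = F.relu(self.l_in(self.e(c)))
        return F.log_softmax(self.l_out(h), dim=-1)

m = CharModel(vocab_size=85, n_fac=42, n_hidden=256)
opt = optim.Adam(m.parameters(), 1e-3)

# ...and only the training loop comes from the library
# (assumed fastai-0.7-style call, as described above):
# fit(m, md, 1, opt, F.nll_loss)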


#63

Can we get a link for the blog post he’s talking about now?


(Suvash Thapaliya) #64

@jeremy I’m hoping you really did go to Nepal in 2009. \o// :sunny:


(ecdrid) #65

The notation reminds me of The Deathly Hallows


(Vitaly Bushaev) #66

Why not concatenate them and have a bigger FC2?


(Kevin Bird) #67

Is there any advantage to predicting a character instead of a word? It seems like that would always be less accurate.


(Erin Pangilinan) #68

Which one? The one by Andrej? http://karpathy.github.io/2015/05/21/rnn-effectiveness/


#69

That looks like it. Thank you <3


(James Requa) #70

We are covering RNNs for text, but could you also use them to capture temporal features from an image sequence (i.e. video)?


(Pete Condon) #71

Absolutely (an LSTM is a variant of RNN):


(layla.tadjpour) #72

In cell 13, why do we do x = np.stack(c1_data[:-2]) and not np.stack(c1_data)?


(James Requa) #73

@pete.condon Awesome, thanks for the link! Would be cool to implement something like this in pytorch/fastai :slight_smile:


(Erin Pangilinan) #74

I love Andrej, he is the best; just met him recently. =) I totally fangirled over him.


(Charin) #75

Andrej = Best drej


(Louis Guthmann) #76

For a start, it avoids out-of-vocabulary issues, and it gives a much more robust model for non-classical NLP problems such as the LaTeX generation one.
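A toy illustration of the out-of-vocabulary point (hypothetical vocabularies, just to show the failure mode that a character-level model sidesteps):

# Word-level: any token unseen at training time collapses to <unk>.
word_vocab = {'<unk>': 0, 'the': 1, 'gradient': 2, 'descends': 3}
sentence = 'the eigenvalue descends'.split()
print([word_vocab.get(w, 0) for w in sentence])   # [1, 0, 3] -- 'eigenvalue' is lost

# Character-level: a small closed vocabulary covers any string,
# including LaTeX-ish text like '\frac{a}{b}'.
chars = sorted(set('\\frac{a}{b} the eigenvalue descends'))
char_vocab = {c: i for i, c in enumerate(chars)}
print([char_vocab[c] for c in '\\frac{a}{b}'])    # every character is in-vocabulary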


(Erin Pangilinan) #77

My bad, I replied on the wrong thread; I meant to send it to @jenna. Heh.


(Karthik Ramesh) #78

@yinterian are for loops on the GPU a good idea?


(ecdrid) #79

What’s the link for Jeremy’s presentation?
@yinterian (sorry)


(Kevin Bird) #80

Where is F defined? I’m assuming this is the same F used for softmax, but where does it come from?


(ecdrid) #81

import torch.nn.functional as F
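So F is just the conventional alias for torch.nn.functional, which holds the stateless ops (softmax, relu, the loss functions, etc.). A tiny sketch of how it gets used (illustrative, not the notebook’s exact code):

import torch
import torch.nn.functional as F

logits = torch.randn(4, 85)                 # e.g. a batch of 4, vocab of 85 chars
probs = F.softmax(logits, dim=-1)           # F.<op> calls are plain functions, no module state
loss = F.nll_loss(F.log_softmax(logits, dim=-1), torch.tensor([3, 0, 7, 42]))
print(probs.shape, loss.item())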