It may be easier to try a bit from scratch.
Look at an example here in which you get to use the fit function from the library but you can specify everything else.
Can we get a link for the blog post he’s talking about now?
The notation reminds me of The Deathly Hallows
…
Why not concatenate them and use a bigger FC2?
Is there any advantage to predicting a character instead of a word? It seems like that would always be less accurate.
That looks like it. Thank you <3
We are covering RNNs for text, but could you also use them to capture temporal features from an image sequence (i.e. video)?
Absolutely (an LSTM is a variant of RNN):
in cell 13, why do x = np.stack(c1_data[:-2]) and not np.stack(c1_data)?
@pete.condon Awesome, thanks for the link! Would be cool to implement something like this in pytorch/fastai
I love Andrej he is the best, just met him recently. =) I totally fangirl’ed him.
Andrej = Best drej
For a start, it avoids out-of-vocabulary issues, and it gives a much more robust model for non-classical NLP tasks such as the LaTeX generation one.
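A minimal sketch of the out-of-vocabulary point (the vocabularies and sentences here are made up for illustration): a word-level model can only look up tokens it saw during training, while a character-level model can encode any string built from known characters.

```python
# Hypothetical toy vocabularies -- not from the course notebooks.
word_vocab = {"the", "cat", "sat"}                 # word-level: fixed word list
char_vocab = set("abcdefghijklmnopqrstuvwxyz ")    # char-level: fixed character list

def word_oov(sentence):
    """Words the word-level model has never seen (out of vocabulary)."""
    return [w for w in sentence.split() if w not in word_vocab]

def char_oov(sentence):
    """Characters the char-level model has never seen."""
    return [c for c in sentence if c not in char_vocab]

print(word_oov("the frobnicator sat"))  # ['frobnicator'] -- OOV at word level
print(char_oov("the frobnicator sat"))  # [] -- every character is known
```

So a character model trades per-step accuracy for coverage: it never hits an unknown token, which matters for open vocabularies like LaTeX source.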
Where is F being defined? I’m assuming this is the same F as softmax, but where does that come from?
import torch.nn.functional as F
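To expand on that answer: `F` is just the conventional alias for `torch.nn.functional`, which provides stateless versions of operations like softmax. A quick sketch of how it gets used (the tensor values here are arbitrary):

```python
import torch
import torch.nn.functional as F  # F is an alias, not a class defined in the notebook

# Arbitrary example logits for a batch of one item with three classes.
logits = torch.tensor([[1.0, 2.0, 3.0]])

# F.softmax normalizes each row into a probability distribution.
probs = F.softmax(logits, dim=1)
print(probs)  # each row sums to 1
```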