@anamariapopescug, yes, and so are academics LOL
Is this a word model or a character model? So if Jeremy gave it an incomplete word like "toward", would it complete it?
and while we are at it, another neural network will proof-read it and another shall peer-review it.
Okay last one, the List of Sequentially Torched Matrices (LSTM) committee shall decide whether to accept it or reject it.
Neural Net book/article Editor maybe?
because the cost of computing can be reduced if we learn the basics first
like the way we showed smaller images first…then bigger in CV
language model?
Can someone provide a brief, intuitive explanation?
My best guess is it's equivalent to CNN's architectures?
it's a probability distribution over sequences of words. you're basically learning how likely/unlikely sequences of words are from training data. So you'll learn that "convolutional neural network" has high prob but "convolutional neural algorithm" is an unlikely sequence
Words are like pixels: their relations carry meaning, the way pixels together express an image.
You model those relationships to then be able to classify/predict words in the same space/domain
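The "probability distribution over sequences of words" idea above can be sketched with a toy bigram model. This is a minimal sketch: the three-sentence corpus here is made up for illustration, and real language models use far more context than a single previous word.

```python
from collections import Counter

# Made-up toy corpus: the model only knows sequences it has seen.
corpus = [
    "convolutional neural network".split(),
    "recurrent neural network".split(),
    "neural network training".split(),
]

# Count bigrams and how often each word appears as a context.
bigram = Counter()
context = Counter()
for sent in corpus:
    for w1, w2 in zip(sent, sent[1:]):
        bigram[(w1, w2)] += 1
        context[w1] += 1

def seq_prob(words):
    """P(sequence) under the bigram model (zero if any bigram was never seen)."""
    p = 1.0
    for w1, w2 in zip(words, words[1:]):
        if context[w1] == 0:
            return 0.0
        p *= bigram[(w1, w2)] / context[w1]
    return p

print(seq_prob("convolutional neural network".split()))    # seen sequence: high prob
print(seq_prob("convolutional neural algorithm".split()))  # unseen bigram: prob 0.0
```

So "convolutional neural network" scores high because its bigrams are frequent in the training data, while "convolutional neural algorithm" gets probability zero (a real model would smooth this rather than assign exactly zero).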
This blog intros it well, by one of the thought leaders:
http://karpathy.github.io/2015/05/21/rnn-effectiveness/
hopefully, fake news classification too
What is the difference between this and word2vec from Google
If anyone needs understanding around "Word Embeddings" without diving deeper into RNNs
The first half of this post talks about it.
Only RNNs take IMDB reviews so seriously!!
The way you learn the embeddings is different.
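A rough sketch of that difference: both word2vec and a language model end up with a lookup table of word vectors, but they train it against different (input, target) pairs: skip-gram word2vec predicts a nearby context word, while a language model predicts the next word. The tiny vocab, dimensions, and single-step trainer below are all made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["neural", "network", "deep", "learning"]
V, D = len(vocab), 4
idx = {w: i for i, w in enumerate(vocab)}

E = rng.normal(0, 0.1, (V, D))   # embedding matrix: the word vectors being learned
W = rng.normal(0, 0.1, (D, V))   # output projection to vocab logits

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def sgd_step(inp, target, lr=0.1):
    """One cross-entropy SGD step: nudge E[inp] so `target` becomes more probable."""
    h = E[idx[inp]]
    p = softmax(h @ W)
    grad_logits = p.copy()
    grad_logits[idx[target]] -= 1.0          # dLoss/dlogits = p - one_hot(target)
    E[idx[inp]] -= lr * (W @ grad_logits)    # update the word vector
    W[:, :] -= lr * np.outer(h, grad_logits) # update the projection

# word2vec (skip-gram) style pair: predict a *context* word from the centre word.
sgd_step("neural", "deep")
# language-model style pair: predict the *next* word in the running text.
sgd_step("neural", "network")
```

Same mechanics, different objective; that difference in training signal is why the resulting vector spaces differ.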
Is vocab similar to bag of words?
what about removing stopwords?
why would we not use word vectors?
Stopwords are important in this kind of problem
Yeah, can't say "the" is all that important in classification.