# Lesson 4 In-Class Discussion

**karthikramesh**(Karthik Ramesh) #184

Is this a word model or a character model? So if Jeremy gave it an incomplete word like "toward", would it complete it?

**pramod.srinivasan**(Pramod) #186

and while we are at it, another neural network will proof-read it and another shall peer-review it.

Okay last one, the List of Sequentially Torched Matrices (LSTM) committee shall decide whether to accept it or reject it.

**neovaldivia**(Rafael Valdivia) #188

Because the cost of computing can be improved if we learn the basics first,

like the way we showed smaller images first… then bigger ones in CV.

**ecdrid**(Aditya) #189

language model?

Can someone provide a brief, intuitive explanation?

My best guess is it's the equivalent of CNN architectures?

**anamariapopescug**(anamariapopescug) #190

It's a probability distribution over sequences of words. You're basically learning how likely/unlikely sequences of words are from training data. So you'll learn that "convolutional neural network" has high probability but "convolutional neural algorithm" is an unlikely sequence.
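A minimal sketch of that idea: a toy bigram language model over a tiny, made-up corpus (the corpus and smoothing parameters here are illustrative assumptions, not anything from the course), showing how a sequence seen in training gets a higher probability than an unseen one.

```python
from collections import Counter

# Hypothetical tiny training corpus, just for intuition.
corpus = [
    "convolutional neural network",
    "recurrent neural network",
    "neural network training",
]

# Count unigrams and bigrams over the corpus.
bigrams = Counter()
unigrams = Counter()
for sentence in corpus:
    words = sentence.split()
    unigrams.update(words)
    for a, b in zip(words, words[1:]):
        bigrams[(a, b)] += 1

def prob(sequence, alpha=0.1, vocab_size=10):
    """P(sequence) under an add-alpha smoothed bigram model."""
    words = sequence.split()
    p = 1.0
    for a, b in zip(words, words[1:]):
        # Smoothing gives unseen bigrams a small but nonzero probability.
        p *= (bigrams[(a, b)] + alpha) / (unigrams[a] + alpha * vocab_size)
    return p

# The seen sequence scores higher than the unseen one.
print(prob("convolutional neural network") > prob("convolutional neural algorithm"))  # True
```

Real language models (RNNs, and the neural LMs in this lesson) replace the count table with a learned network, but the object being learned is the same: a probability over word sequences.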

**neovaldivia**(Rafael Valdivia) #191

Words are like pixels: their relationships carry meaning, the way pixels express images.

You model those relationships so you can then classify/predict words in the same space/domain.

**beacrett**(Ben Eacrett) #193

This blog introduces it well - by one of the thought leaders:

http://karpathy.github.io/2015/05/21/rnn-effectiveness/

**PranY**(Pranjal Yadav) #196

If anyone needs understanding around "Word Embeddings" without diving deeper into RNNs:

The first half of this post talks about it.