Cool little text generation example:

sample_model(m, 'cognitive science is important because ')
wt103RNN returns: it is a good way to understand the nature of the mind .



@matterhart, the sample output from wt103RNN looks awesome.

How do you produce that?
Do you feed wt103RNN the sentence and then greedily select the token with the highest next-token probability?

Or do you use beam search?
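For reference, the greedy strategy described above can be sketched in plain Python. This is only an illustration with a hypothetical toy vocabulary and made-up transition probabilities standing in for a real language model like wt103RNN; it is not the actual fastai sample_model implementation:

```python
def greedy_decode(next_probs, seed, max_len=5, eos="."):
    """Greedy decoding sketch: at each step pick the token with the
    highest next-token probability.

    `next_probs` maps the most recent token to a dict of next-token
    probabilities -- a stand-in for a real language model's output.
    """
    tokens = list(seed)
    for _ in range(max_len):
        probs = next_probs.get(tokens[-1], {})
        if not probs:
            break
        # Greedy step: argmax over the next-token distribution.
        nxt = max(probs, key=probs.get)
        tokens.append(nxt)
        if nxt == eos:
            break
    return " ".join(tokens)

# Toy transition probabilities (hypothetical, for illustration only).
probs = {
    "because": {"it": 0.6, "the": 0.4},
    "it": {"is": 0.9, "was": 0.1},
    "is": {"important": 0.7, ".": 0.3},
    "important": {".": 1.0},
}
print(greedy_decode(probs, ["because"]))  # → because it is important .
```

Beam search would instead keep the top-k partial sentences at every step and expand each of them, which usually gives more fluent completions than picking the single best token each time.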

That being said, where is this wt103RNN model? I could only find WT103_1 in the IMDB text example.