Word embedding for Keras Reuters news topic classification

In lesson 5, Jeremy achieved a good result on IMDB review classification using a simple conv net.

There is another built-in Keras dataset for Reuters news topic classification. When I tried the same word embedding approach on this task, I got terrible results, and tuning hyperparameters (enlarging the embedding size, switching the last layer to softmax with sparse categorical cross-entropy loss) didn't seem to help.
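For reference, here is a minimal sketch of the kind of embedding + conv net setup I mean. This is not my exact code; the vocabulary size, sequence length, embedding size, and filter counts are illustrative placeholders:

```python
# Sketch of an embedding + 1D conv net for Reuters topic classification.
# All hyperparameters here are illustrative, not my actual values.
from keras.models import Sequential
from keras.layers import Input, Embedding, Conv1D, GlobalMaxPooling1D, Dense

vocab_size = 10000  # e.g. reuters.load_data(num_words=10000)
maxlen = 400        # articles padded/truncated to this length with pad_sequences
embed_dim = 50      # one of the knobs I tried enlarging

model = Sequential([
    Input(shape=(maxlen,)),
    Embedding(vocab_size, embed_dim),   # learned word embeddings
    Conv1D(64, 5, activation="relu"),   # 1D convolution over the word sequence
    GlobalMaxPooling1D(),               # collapse the sequence dimension
    Dense(46, activation="softmax"),    # the Reuters dataset has 46 topic classes
])
model.compile(loss="sparse_categorical_crossentropy",
              optimizer="adam", metrics=["accuracy"])
# model.fit(x_train, y_train, ...) after padding the raw word-index sequences
```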

I saw a one-hot encoding approach that produced a decent result: https://github.com/fchollet/keras/blob/master/examples/reuters_mlp.py
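For comparison, the linked example is roughly the following (a condensed sketch, not a verbatim copy of reuters_mlp.py): each article becomes a fixed-size binary bag-of-words vector, fed to a plain MLP with no embedding layer at all.

```python
# Condensed sketch of the one-hot (binary bag-of-words) MLP from the
# linked reuters_mlp.py example; details may differ from the original.
from keras.models import Sequential
from keras.layers import Input, Dense, Dropout

max_words = 1000  # each article -> 1000-dim binary word-presence vector

model = Sequential([
    Input(shape=(max_words,)),
    Dense(512, activation="relu"),
    Dropout(0.5),
    Dense(46, activation="softmax"),  # 46 Reuters topic classes
])
model.compile(loss="categorical_crossentropy",
              optimizer="adam", metrics=["accuracy"])
# Inputs come from something like
# Tokenizer(num_words=max_words).sequences_to_matrix(x, mode="binary")
```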

Is word embedding really a poor choice for this task? Or is my approach wrong?

Thank you.