I’m trying to build up a portfolio of deep learning projects and turn them into blog posts. I recently built a Seq2Seq translation model with attention and wrote a fairly detailed blog post walking through the conceptual details. There’s also a Jupyter notebook that walks through building and training the model.
This is the first in-depth post I’ve written, so I’d really appreciate some feedback. I spent some time making some cool illustrations for it, so it’s worth checking out just for those.
You can see it here: