Notation-Sympathetic bAbI MemN2N Implementation


I have written a “notation sympathetic” version of the [babi network implementation]. The idea is to use variable names and operations that stay as close to the paper's description as possible. I hope that reading the paper alongside a notebook that uses familiar variable names will cut down the time it takes to understand it.

So if you:

a) have not started looking into the bAbI MemN2N networks and want to catch up quickly,

b) are unable to map the code to the paper, or

c) want to refresh some NLP + DL concepts,

you may find the notebook useful.

The notebook starts by describing the input and the preprocessing steps taken (standard tokenization, word-to-id mapping and such). Since the primary goal of this implementation is pedagogical, terseness has given way to verbosity wherever needed.
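To make the preprocessing concrete, here is a minimal sketch of the tokenization and word-to-id mapping steps described above. The helper names (`tokenize`, `build_vocab`) are my own for illustration; the notebook's actual code may differ.

```python
import re

def tokenize(sentence):
    # Standard bAbI-style tokenization: lowercase, split into words
    # (keeping apostrophes) and sentence-ending punctuation.
    return re.findall(r"[\w']+|[.,!?]", sentence.lower())

def build_vocab(stories):
    # Map every token to an integer id, reserving 0 for padding.
    vocab = sorted({tok for story in stories for tok in tokenize(story)})
    return {tok: i + 1 for i, tok in enumerate(vocab)}

stories = ["Mary moved to the bathroom.", "Where is Mary?"]
word_to_id = build_vocab(stories)

# Encode one sentence as a sequence of ids.
encoded = [word_to_id[tok] for tok in tokenize(stories[0])]
```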

If you are familiar with standard NLP preprocessing, you may prefer to skip directly to the model section, where the code is presented alongside the relevant snippets from the paper.

Thanks @jeremy, a lot of what’s in this notebook has been derived from the one you had provided.


EDIT: The above is only for single hop networks.


Thanks @amanmadaan! We should incorporate your ideas into the ‘official’ notebook I suspect before the MOOC…

Did you get the same results as me? In particular, how did you go with the “2 supporting facts” task and the multi-hop network? I found it really hard to train!

Hi Jeremy,

This is only for the single-hop network; I got 100% accuracy on the test sentences as well. I am planning to ship a version with a multi-hop network too, but while staying true to the paper I haven't been able to reproduce the error rates it reports (perhaps due to gaps in my understanding and/or bugs).
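For readers catching up, the single-hop forward pass can be sketched directly in the paper's notation (m_i = A x_i, p = softmax(u·m), o = Σ p_i c_i, â = softmax(W(o + u))). The bag-of-words encoding, shapes, and random weights below are assumptions for illustration, not the notebook's exact code.

```python
import numpy as np

rng = np.random.default_rng(0)
V, d, n = 20, 8, 5           # vocab size, embedding dim, number of sentences

A = rng.normal(size=(d, V))  # input memory embedding
C = rng.normal(size=(d, V))  # output memory embedding
B = rng.normal(size=(d, V))  # question embedding
W = rng.normal(size=(V, d))  # final output matrix

x = rng.integers(0, 2, size=(n, V)).astype(float)  # sentences as BoW vectors
q = rng.integers(0, 2, size=V).astype(float)       # question as a BoW vector

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

m = x @ A.T                   # m_i = A x_i        (memory vectors)
c = x @ C.T                   # c_i = C x_i        (output vectors)
u = B @ q                     # u = B q            (question state)
p = softmax(m @ u)            # p_i = softmax(u^T m_i)  (attention weights)
o = p @ c                     # o = sum_i p_i c_i  (response vector)
a_hat = softmax(W @ (o + u))  # predicted answer distribution
```

The point of the notation-sympathetic style is that each line above maps one-to-one onto an equation in the paper.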


I was able to get N-hop networks to reach the order of accuracy shown in the paper, but not without the following hack:

`u^(k+1) = u^1 + o^k` instead of `u^(k+1) = u^k + o^k`.
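The hack amounts to feeding the original question state u^1 into every hop instead of the previous hop's state u^k. A sketch of the K-hop loop (shapes and random memories are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
d, n, K = 8, 5, 3
m = rng.normal(size=(n, d))  # memory vectors m_i (shared across hops)
c = rng.normal(size=(n, d))  # output vectors c_i (shared across hops)
u1 = rng.normal(size=d)      # initial question state u^1

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

u = u1
for k in range(K):
    p = softmax(m @ u)       # attention over memories at hop k
    o = p @ c                # response vector o^k
    # paper: u = u + o       # u^(k+1) = u^k + o^k
    u = u1 + o               # hack:  u^(k+1) = u^1 + o^k
```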



How interesting! Thanks for working on this :slight_smile: