Yes.
Remember that the actual flipping of the tokens happens during mini-batch creation (see the pad_collate function here). So one way of verifying this is to create a batch and inspect the tokens, for example:
b = next(iter(data.train_dl))
print(b[0])
I followed this and then used the following to confirm.
vocab = data.vocab
b = next(iter(data.train_dl))
print(b[0][0])
then passed the printed token ids into:
vocab.textify(b[0][0])
This confirmed for me that the text was indeed being presented backwards.
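For anyone without a fastai environment handy, the idea behind this check can be sketched in plain Python. This is not fastai's actual pad_collate or Vocab implementation, just a toy illustration of the flip: token ids are reversed at batch time, and "textifying" them shows the text backwards.

```python
# Toy illustration of the backwards flip done at batch-creation time.
# The vocab, stoi, and helper names here are made up for the sketch.
vocab = ["xxpad", "the", "cat", "sat", "down"]
stoi = {tok: i for i, tok in enumerate(vocab)}

def numericalize(tokens):
    """Map tokens to integer ids (forward order)."""
    return [stoi[t] for t in tokens]

def textify(ids):
    """Map ids back to a space-joined string, like Vocab.textify."""
    return " ".join(vocab[i] for i in ids)

tokens = ["the", "cat", "sat", "down"]
ids = numericalize(tokens)       # [1, 2, 3, 4]
flipped = ids[::-1]              # the flip applied during collation
print(textify(flipped))          # -> "down sat cat the"
```

Textifying the flipped ids prints the sentence reversed, which is exactly the kind of output that confirms the batch is being presented backwards.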