Did sequence length and batch size switch dimensions for text classification models?

I'm running my text classification models and looking at the outputs and raw_outputs document vectors, and it now looks like the shape is (batch_size, seq_len, n_hid), whereas before they came back as (seq_len, batch_size, n_hid).

Same thing when I run a single item through my model. I used to create a one-item minibatch with shape (seq_len, 1), but I notice that it now has to be shaped (1, seq_len).
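For reference, here's a quick sketch of the two layouts and how I'd convert between them (just plain arrays shown with NumPy; the seq_len value is made up for illustration):

```python
import numpy as np

seq_len = 70  # hypothetical sequence length

# Old layout: (seq_len, batch_size) -- what I used to build for a single item
old_style = np.zeros((seq_len, 1), dtype=np.int64)

# New layout: (batch_size, seq_len) -- what the model now seems to expect
new_style = old_style.T

print(old_style.shape)  # (70, 1)
print(new_style.shape)  # (1, 70)
```

So converting an existing one-item batch is just a transpose, but I'd like to know if this batch-first change is intentional before I rewrite my pipeline around it.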

Did something change?