Lesson 5 - Why don't we shuffle the training set (IMDB)

In lesson 5, the IMDB training set of 25,000 examples is ordered so that the first 12,500 examples in the array are positive reviews and the remaining 12,500 are negative reviews.

When we train the model batch by batch, does this bias the weights towards predicting everything as positive, since the first 12,500 examples (i.e. the first 195 full batches with batch_size=64) are all positive?
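
For context, here is a minimal sketch (assuming the labels are laid out as a NumPy array with the first half positive, as described above) showing that the unshuffled batches are single-class, and that permuting the indices before batching mixes both classes into every batch:

```python
import numpy as np

n = 25000
batch_size = 64

# Labels laid out as described: first 12500 positive (1), then 12500 negative (0)
labels = np.concatenate([np.ones(12500, dtype=int), np.zeros(12500, dtype=int)])

# Without shuffling, the first 12500 // 64 = 195 full batches are all positive
print(labels[:batch_size].mean())   # 1.0 -> every example in the first batch is positive

# Shuffling the index order once per epoch mixes both classes into each batch
rng = np.random.default_rng(42)
idx = rng.permutation(n)
shuffled = labels[idx]
print(shuffled[:batch_size].mean()) # ~0.5 -> roughly balanced batch
```

In practice a PyTorch DataLoader with shuffle=True does this per-epoch index permutation for you, so each batch sees a roughly balanced mix of positive and negative reviews.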