Help understanding calls to pred_batch

I am trying to understand under what circumstances pred_batch shuffles data that is passed to it. I am currently running fastai version 1.0.58, and am performing the following:

# data_features is a DataFrame containing the test data
learn = load_learner(path, test=ItemList.from_df(data_features))
learn.data.batch_size = data_features.shape[0]
preds = learn.pred_batch(ds_type=DatasetType.Test)

I have run this a couple of times and am convinced there is no shuffling of the test data, as the preds are identical each run. The call returns predictions on my entire test set, since I set the batch size equal to the number of rows in data_features. Identical predictions are exactly the behavior I want, but I cannot reconcile it with the fastai code. Since I am not passing a batch to pred_batch, the function should call one_batch, which performs a next(iter(dl)). I therefore expected the data to be shuffled and different predictions to be returned each time. Or is the answer that the test dl does not shuffle, so next(iter(dl)) merely steps through the data in order? If so, could someone point out where this is specified in the fastai code? Perhaps my understanding of the DataLoader is flawed.
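For what it's worth, the determinism you're seeing can be reproduced with a plain PyTorch DataLoader (which is what fastai v1 wraps) — this is just a toy sketch, not your actual test set:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-in for the test data
data = torch.arange(10).float().unsqueeze(1)
ds = TensorDataset(data)

# With shuffle=False, next(iter(dl)) always yields the same first batch:
# each fresh iterator just walks the dataset in index order.
dl = DataLoader(ds, batch_size=4, shuffle=False)
first = next(iter(dl))[0]
again = next(iter(dl))[0]
print(torch.equal(first, again))  # True every time
```

So if the test DataLoader was built with shuffle=False, calling next(iter(dl)) repeatedly will keep returning the identical batch.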

load_learner calls add_test, which creates a DataLoader with shuffle=False. You can see in the DataLoader __init__ that a SequentialSampler is used when shuffle=False, so next(iter(dl)) simply steps through the test set in order.
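You can confirm the sampler choice directly on a plain PyTorch DataLoader (a minimal sketch, independent of fastai):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.sampler import SequentialSampler, RandomSampler

ds = TensorDataset(torch.arange(6).float())

# shuffle=False (the default) makes DataLoader build a SequentialSampler,
# which yields indices 0, 1, 2, ... in order.
dl_seq = DataLoader(ds, shuffle=False)
print(type(dl_seq.sampler).__name__)   # SequentialSampler
print(list(dl_seq.sampler))            # [0, 1, 2, 3, 4, 5]

# shuffle=True swaps in a RandomSampler instead.
dl_rand = DataLoader(ds, shuffle=True)
print(type(dl_rand.sampler).__name__)  # RandomSampler
```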