Oversampling with WeightedRandomSampler and DeviceDataLoader

I am trying to understand how oversampling a DeviceDataLoader works with WeightedRandomSampler.

I expect that calling new on a 100-sample DeviceDataLoader with oversampling (to 1000 samples) should return a DeviceDataLoader with 1000 samples, but it returns a DeviceDataLoader with only 100 samples. Below, data.train_dl is the input data (data is an ImageDataBunch):

from random import randint
from torch.utils.data import WeightedRandomSampler


# data.train_dl is a DeviceDataLoader with 100 samples
weights = [randint(1, 10) for _ in range(100)]
nsamples = 1000
sampler = WeightedRandomSampler(weights, nsamples)
dl2 = data.train_dl.new(shuffle=False, sampler=sampler)
print((len(data.train_dl.y), len(dl2.y)))
# outputs (100, 100); I expected (100, 1000)
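For comparison, here is what I believe the equivalent setup does with a plain torch DataLoader (this is my own minimal sketch using a hypothetical TensorDataset of 100 items, not fastai code): the dataset length stays 100, but one pass over the loader yields 1000 samples, which is the kind of oversampling I was expecting.

```python
from random import randint
import torch
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

# A toy 100-item dataset standing in for my real data
dataset = TensorDataset(torch.arange(100))
weights = [randint(1, 10) for _ in range(100)]
sampler = WeightedRandomSampler(weights, num_samples=1000)
dl = DataLoader(dataset, batch_size=10, sampler=sampler)

print(len(dataset))                             # dataset length is still 100
print(sum(batch[0].shape[0] for batch in dl))   # one epoch draws 1000 samples
```

So in plain PyTorch the sampler does not change the dataset size; it changes how many samples are drawn per epoch. My question is whether DeviceDataLoader.new is supposed to reflect that anywhere, since dl2.y still reports 100.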

By the way, I had mistakenly assumed that ImageDataBunch.train_dl was a torch DataLoader, so I first posted this in the PyTorch forums. I learnt there that the new method exists only on fastai's DeviceDataLoader, not on the torch DataLoader, so I am reposting here.