Question on DataBlock from Pytorch Dataset

I hope this question isn’t too annoying. I would like to use torchvision.datasets with fastai. In particular, I am looking to convert a torchvision dataset such as FashionMNIST into a fastai DataBlock. I am aware of some options, including posts on going from PyTorch datasets to fastai DataLoaders (such as Create DataLoader from Pytorch Dataset). These partially work, but each has one problem or another for me. Suppose I load the datasets like this:

import torchvision

train_pt_dataset = torchvision.datasets.FashionMNIST(root='./images', train=True, download=True)
test_pt_dataset = torchvision.datasets.FashionMNIST(root='./images', train=False, download=True)

I can turn them into torch.utils.data.DataLoader objects and then into fastai.data.core.DataLoaders. Unfortunately, that requires me to skip making a DataBlock and forgo the benefits of fastai’s DataBlock API, such as access to show_batch, as shown here: Zach Mueller - Pytorch to fastai, Bridging the Gap
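For reference, the DataLoader route I mean looks roughly like this. This is a minimal sketch that substitutes a small synthetic TensorDataset for FashionMNIST so it stands alone; the final fastai wrapping step is shown as a comment and assumes fastai’s DataLoaders API:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Stand-in for FashionMNIST so the sketch is self-contained:
# 8 fake 1x28x28 images with integer class labels (hypothetical data).
xs = torch.randn(8, 1, 28, 28)
ys = torch.randint(0, 10, (8,))
train_pt_dataset = TensorDataset(xs, ys)
test_pt_dataset = TensorDataset(xs.clone(), ys.clone())

# Plain PyTorch loaders:
train_dl = DataLoader(train_pt_dataset, batch_size=4, shuffle=True)
test_dl = DataLoader(test_pt_dataset, batch_size=4)

# With fastai installed, these can then be wrapped directly
# (assumes fastai's API; not run in this sketch):
# from fastai.data.core import DataLoaders
# dls = DataLoaders(train_dl, test_dl)

xb, yb = next(iter(train_dl))  # one batch of images and labels
```

This works, but as noted it bypasses the DataBlock entirely.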

I can use the DataBlock.datasets method on, for example, train_pt_dataset to get a Datasets object. However, if I do that, the splitter splits the training data into training and validation sets. If I want test_pt_dataset as the validation data, that doesn’t work. I wanted to find a way to concatenate the two into one dataset and then write a splitter that grabs the trailing samples (the appended test set) as the validation split. I don’t see a good way to do this.
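The splitter I have in mind would look something like this. It is a plain-Python sketch: after concatenating the two datasets (e.g. with torch.utils.data.ConcatDataset), a splitter is just a callable that returns train and validation index lists, which is the shape fastai’s DataBlock expects (IndexSplitter being the built-in equivalent, if I understand it correctly):

```python
# Sketch of a splitter: given the concatenated items, put the first
# n_train indices in the training split and everything after them
# (the appended test set) in the validation split.
def trailing_splitter(n_train):
    def _split(items):
        train_idxs = list(range(n_train))
        valid_idxs = list(range(n_train, len(items)))
        return train_idxs, valid_idxs
    return _split

# Toy check with 5 items, the first 3 treated as training data:
splits = trailing_splitter(3)(list(range(5)))
# splits == ([0, 1, 2], [3, 4])
```

For FashionMNIST, n_train would be len(train_pt_dataset), so the 10,000 test samples appended after it land in the validation split. My question is whether there is a cleaner, idiomatic way to do this with the DataBlock API.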