Error when using PyTorch Dataset with fastai2

Hi all,

I am trying to use fastai2 (0.17.0) with PyTorch Datasets, as mentioned in
https://dev.fast.ai/learner#PyTorch-interrop

It was not clear when to use the Dataset and DataLoader classes from fastai2 versus the ones from PyTorch. I ended up using a PyTorch Dataset with the fastai2 DataLoader to work around several missing-function errors. These may not be the right choices, so please let me know.
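Roughly what my setup looks like (the dataset here is just a stand-in for my real one, and the import paths are what I am using; any map-style PyTorch Dataset behaves the same way):

```python
import torch
from fastai2.data.all import *   # fastai2 DataLoader / DataLoaders
from torch.utils.data import Dataset

# Stand-in dataset: random 10-feature inputs with binary labels
class RandomDataset(Dataset):
    def __init__(self, n=1000):
        self.x = torch.randn(n, 10)
        self.y = torch.randint(0, 2, (n,))
    def __len__(self): return len(self.x)
    def __getitem__(self, i): return self.x[i], self.y[i]

# PyTorch Dataset wrapped in fastai2 DataLoaders
trainDL = DataLoader(RandomDataset(1000), bs=64, shuffle=True)
validDL = DataLoader(RandomDataset(200), bs=64)
dls = DataLoaders(trainDL, validDL)
```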

All inputs and the model are on the GPU. The error appears inside the Adam optimizer after a few calls to fit_one_cycle and lr_find: a CPU/GPU device conflict when it updates its internal moving averages. The error does not appear with the SGD optimizer.

A demo notebook is attached.

Thanks for helping.

Your GitHub link is dead, FYI.

Oops. I had put the notebook into a private repo. Original post edited.

Your error is a device problem. It is a bit odd that the first lr_find works at all, since by default the DataLoader uses the CPU. You should do dls = DataLoaders(trainDL, validDL).cuda(), otherwise your data won’t be on the GPU.
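Something along these lines, reusing the toy names from the snippet above (the model and loss function here are just placeholders for whatever your notebook uses):

```python
from fastai2.basics import *             # Learner, Adam, DataLoaders, ...
from fastai2.callback.schedule import *  # patches lr_find / fit_one_cycle onto Learner
from torch import nn

# Batches from both DataLoaders now come back on the GPU
dls = DataLoaders(trainDL, validDL).cuda()

# Placeholder model matching the toy dataset above, moved to the GPU as well
model = nn.Sequential(nn.Linear(10, 50), nn.ReLU(), nn.Linear(50, 2)).cuda()

learn = Learner(dls, model, loss_func=nn.CrossEntropyLoss(), opt_func=Adam)
learn.lr_find()
learn.fit_one_cycle(1, 1e-3)
```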


Thanks, that works.