Forcing CPU usage when a compatible GPU is present


I’m trying to run a computer vision task on the CPU instead of the GPU (an Nvidia Tesla P100), because another task is currently using the GPU. I’m creating an ImageDataBunch and passing it to a CNN learner with the ResNet-34 architecture.
I’ve gone through the forums and have used
defaults.device = torch.device('cpu')
but it doesn’t seem to be working: fastai still tries to allocate GPU memory, and CUDA throws an out-of-memory error.

How can I fix this?
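One common workaround, independent of fastai, is to hide the GPU from PyTorch entirely by setting the `CUDA_VISIBLE_DEVICES` environment variable *before* `torch` is imported; with no visible devices, everything falls back to CPU. A minimal sketch:

```python
import os

# Hide all CUDA devices from this process. This must happen before
# torch is imported (or before CUDA is first initialized), otherwise
# it has no effect.
os.environ["CUDA_VISIBLE_DEVICES"] = ""

import torch

# With no visible devices, tensors are created on the CPU.
device = torch.device("cpu")
x = torch.randn(2, 3, device=device)
print(x.device)  # cpu
```

Setting the variable in the shell (`CUDA_VISIBLE_DEVICES="" python train.py`) has the same effect and avoids any import-order pitfalls.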

I had the same problem. I got it to work on fastai2 with:
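The exact snippet from this reply is missing above. As a sketch of what forcing CPU looks like at the torch level (fastai2’s `DataLoaders` also accept a `device=` argument when they are built, which is the usual fastai2 route), both the model and every batch must live on the CPU:

```python
import torch
import torch.nn as nn

# Keep model parameters and input batches on the same (CPU) device;
# a mismatch is what triggers CUDA allocations and OOM errors.
device = torch.device("cpu")
model = nn.Linear(4, 2).to(device)        # move model parameters to CPU
batch = torch.randn(8, 4, device=device)  # create the batch on CPU too
out = model(batch)
print(out.device)  # cpu
```

This is only a torch-level illustration, not the original poster’s code.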
