fastai always uses the GPU for inference

(damien ng) #1

Hello,

I’m trying to set up fastai for inference on the CPU. After reading the recommendations on the forums, I have already tried the following methods:
Method 1:

   fastai.torch_core.defaults.device = 'cpu'

Method 2:

 defaults.device = torch.device('cpu')

Method 3:

 fastai.torch_core.defaults.device = torch.device('cpu')

However, I always observe that the GPU is used when I run my fastai code. Could you help me find a solution to this problem?

My configuration is Windows 10, fastai 1.x, PyTorch 1.x.


#2

Inference runs on the CPU by default and training on the GPU. Are you sure there isn’t something else running in the background that is using the GPU?
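One more approach worth trying (a sketch based on a general PyTorch mechanism, not something specific to fastai): hide the GPU from PyTorch entirely with the `CUDA_VISIBLE_DEVICES` environment variable, set before the first `import torch` (and therefore before `import fastai`). With no visible devices, `torch.cuda.is_available()` returns `False`, so fastai has no GPU to pick up regardless of when `defaults.device` is assigned.

```python
import os

# Hide every CUDA device from PyTorch. This must be set BEFORE the first
# `import torch` (and thus before `import fastai`), otherwise the CUDA
# context may already have been initialised with the GPU visible.
os.environ["CUDA_VISIBLE_DEVICES"] = ""

# Only now import torch / fastai; torch.cuda.is_available() will report
# False and fastai will fall back to the CPU for inference.
```

Setting the variable to `"-1"` (or any value naming no valid device) works equivalently, and you can also set it in the shell before launching Python.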


(damien ng) #3

Yes, I opened Task Manager and ran my code multiple times. The GPU was used only when I launched my fastai code, even though I had already set the device as described in my post above. That’s why I am confused.
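A quick way to narrow this down (a minimal diagnostic sketch; `cuda_status` is a hypothetical helper, not part of fastai): check what PyTorch itself reports right before the learner is created. If this prints `cuda` at that point, the GPU is still visible and the `defaults.device` assignment probably happened too late — in fastai v1 the device is captured when the `DataBunch`/`Learner` is built, so the assignment must come first.

```python
def cuda_status() -> str:
    """Hypothetical diagnostic helper, safe to run without a GPU.

    Returns "cuda" if PyTorch can see a CUDA device, "cpu" if PyTorch
    is installed but sees no GPU, and "torch-missing" if PyTorch is not
    installed in this environment at all.
    """
    try:
        import torch
    except ImportError:
        return "torch-missing"
    return "cuda" if torch.cuda.is_available() else "cpu"

print(cuda_status())
```

Calling `cuda_status()` immediately before building the `DataBunch`/`Learner` shows whether `defaults.device = torch.device('cpu')` took effect in time.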
