Is my GPU being used?

Oh ok, gosh there are so many screws indeed xD.
I also saw people using oil to cool down their computers. Maybe you want to try that at some point? At least you won’t have to remove any screws xD


Another useful command is nvidia-smi dmon, which periodically prints various GPU parameters (utilization, memory, power, temperature, etc.). Here's a link to the nvidia-smi documentation; just search the doc for the word "dmon". (I saw this command suggested by @Jeremy in this post a few days ago - I didn't realize it at the time, sorry for the duplicate.)
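For example, something like the following should print utilization and memory stats every couple of seconds (the -s and -d flags are described in that doc; adjust them to whatever you want to watch):

nvidia-smi dmon -s um -d 2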

My lesson1 fit was running slow, and I found this post through search. After running the command you provided, I realized I had not installed CUDA. Thank you! Now that I have installed CUDA, is it possible to validate that my Jupyter notebook is using the GPU?

Run:

import torch
print(torch.cuda.is_available())

That should help you check whether PyTorch can see CUDA.
Also run nvidia-smi in the terminal to see how much of your GPU's memory is being used by your Python process.
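If you want a slightly more thorough check from inside the notebook, something along these lines should work (all standard PyTorch calls; the device index 0 just assumes a single-GPU machine):

import torch
print(torch.cuda.is_available())      # True if PyTorch can see CUDA
print(torch.cuda.get_device_name(0))  # name of the GPU PyTorch will use
x = torch.randn(1000, 1000).cuda()    # move a small tensor onto the GPU
print(x.is_cuda)                      # should print True

After running that cell, nvidia-smi should also show your Python process holding some GPU memory.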

Thanks Ramesh! That helped me confirm I am using the GPU.

I am running the line below from the lesson1-rxt50 example notebook. It takes about 8 minutes per epoch. Is that the normal time on the GPU on Paperspace?

learn.fit(lr, 3, cycle_len=1)

I have the same question. My lesson 3 epochs take a long time to run. I checked, and PyTorch is able to find CUDA and my GPU is detected. Everything seems to indicate that fastai is using the GPU, but I am not confident about that given the execution time.
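One quick sanity check, assuming a fastai learn object as in the lesson notebooks: the model's weights should report that they live on the GPU, and GPU utilization in nvidia-smi should climb while fit is running.

import torch
print(torch.cuda.is_available())               # PyTorch can see CUDA
print(next(learn.model.parameters()).is_cuda)  # True if the model's weights are on the GPU

If both print True but epochs are still slow, watch nvidia-smi dmon while training; low GPU utilization usually points to data loading being the bottleneck rather than the GPU not being used.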