Zero GPU utilization

I have a Quadro K2200 GPU, but when I run the VGG training, both GPU utilization and GPU memory usage show 0. Why? Does that mean the training is not using the GPU?

Hello Mihar,

You need to provide more information about your setup (OS, CUDA/cuDNN/GPU driver versions, etc.) and also copy/paste the output that shows “GPU … utilization is 0”; then it’s easier to help and point you to the right solution.

Eric

You can try
nvidia-smi -l 1
on the command line. This gives you an overview of GPU utilization and refreshes every second.
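
If you prefer to watch it from Python (for example, to log utilization while a notebook is training), here is a minimal sketch that just shells out to nvidia-smi. It assumes nvidia-smi is on your PATH and that your driver supports these query flags:

    # Poll GPU utilization and memory use once per second via nvidia-smi.
    # Run this in a separate terminal/process while your training job is running.
    import subprocess
    import time

    for _ in range(10):
        out = subprocess.check_output([
            "nvidia-smi",
            "--query-gpu=utilization.gpu,memory.used",
            "--format=csv,noheader",
        ]).decode()
        print(out.strip())  # prints something like "85 %, 3512 MiB" when the GPU is busy
        time.sleep(1)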

Are you using Theano or TensorFlow? And which Python version? I recently built my DL machine and the CUDA sample tests passed. But after installing Python 3.6, Theano, and TensorFlow, I tried to run the Lesson 1 Vgg16 model and, like in your case, my GPU usage was 0-3%.
I then went to the TensorFlow and Theano tutorial/installation pages and ran their simple scripts to test whether they were using the GPU. The TensorFlow test passed but the Theano one did not. I could run the Part 2 lessons, which use TensorFlow, and my GPU usage was 80-90%.
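
For reference, this is roughly the kind of check I mean; it is a sketch assuming a TensorFlow 1.x-era install and a working Theano install (the TensorFlow API has changed in newer releases):

    # Check whether TensorFlow can see the GPU (TF 1.x-era API).
    from tensorflow.python.client import device_lib

    devices = [d.name for d in device_lib.list_local_devices()]
    print(devices)  # look for an entry like '/device:GPU:0' (or '/gpu:0')

    # Check which device Theano is configured to use; 'cpu' here means
    # it will not touch the GPU at all.
    import theano
    print(theano.config.device)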

I found out that I had installed Theano using conda, but the conda package does not currently support Python 3.6. I have not tried installing it with pip or from source since I don’t need to run Theano right now.
I thought my experience might help you resolve this issue.

It is a bit of a late reply, but I just removed the driver and reinstalled it, and then it was able to pick up the GPU.

I noticed that I am having problems with installing CUDA and NVIDIA drivers using PPAs (i.e. using the .deb packages from NVIDIA). In theory it is nice, because the drivers pull in updates automatically. In practice, the updates tend to break things within the DL stack because of version incompatibilities between the driver, CUDA, and TensorFlow; these libraries are just moving too quickly. So I ended up using the standalone CUDA installer (i.e. the .run files from NVIDIA) and the TF Python wheel through pip install. This way my DL stack doesn’t get broken at random when I run apt-get upgrade to install security patches.
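
As a quick sanity check after (re)installing this way, something along these lines tells you whether the pip-installed TF build actually links against CUDA and can grab the GPU (again assuming a TF 1.x-era API):

    # Verify the installed TensorFlow build was compiled with CUDA support
    # and that it can actually initialize a GPU device (TF 1.x-era API).
    import tensorflow as tf

    print(tf.test.is_built_with_cuda())  # False means you installed the CPU-only wheel
    print(tf.test.is_gpu_available())    # False suggests a driver/CUDA/cuDNN mismatch or no visible GPU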