How to make sure that the code is using your GPU?

When I try to run the lesson 1 code, my laptop completely freezes. I have a 2 GB NVIDIA card, but when I run 'nvidia-smi' in a new terminal while the code is running in Jupyter, I can see that the GPU is not being kept busy at all. Only about 80 MB of memory is in use, which is the same as when I am doing something else on my laptop.
I have CUDA, cuDNN, and all the other necessary pieces installed, but I still can't figure out what I need to do to make the code run on my GPU.

PS: I am on an Ubuntu 16.04 machine.

Maybe try more layers in your network? What network are you using?

I’m using the code given in the first lesson, the VGG16 one.

I don’t remember the details from lesson 1, but if you’re using the TensorFlow backend, make sure you installed tensorflow-gpu (not plain tensorflow); it should then use the GPU automatically. If you’re using the Theano backend, see http://deeplearning.net/software/theano/tutorial/using_gpu.html for how to point it at the GPU.
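
As a quick sanity check, here is a minimal sketch, assuming the TensorFlow backend with tensorflow-gpu installed (this uses a TF 1.x helper, which is what these lessons used at the time). It lists the devices TensorFlow can see, so you can confirm a GPU entry actually shows up:

```python
# Minimal sketch, assuming tensorflow-gpu (TF 1.x) is installed.
from tensorflow.python.client import device_lib

# Prints every device TensorFlow can use; a working GPU setup should
# include an entry like "/device:GPU:0" with your card's name,
# in addition to the CPU.
print(device_lib.list_local_devices())
```

If only the CPU shows up, the GPU build isn’t being picked up (for example, you’re in a different virtualenv, or plain tensorflow is shadowing tensorflow-gpu). For the Theano backend, the equivalent is setting the device to the GPU in your ~/.theanorc, as described in the link above.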


Thanks, that worked! 🙂