GPU on laptop

Hi. I have a GeForce GTX 960M graphics card in my laptop, but it seems like I am using the CPU during model training etc. Can I do something about it?

It depends on which course you are doing (ML1, DL1, or DL2), on which environment file you used to set up your environment (environment.txt or environment-cpu.txt), and finally also on which lesson.
Some lessons, at least in the Machine Learning course, do not use the GPU all the time.

Running nvidia-smi during the workload will show you whether the GPU is being used or not.
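As a quick sanity check you can also ask the framework itself whether it sees the GPU at all. A minimal sketch, assuming you have PyTorch and a TensorFlow 2.x install (adapt to whichever framework you actually use):

```python
# If these report no GPU, you likely installed the CPU-only package
# or the drivers/CUDA toolkit are not set up correctly.
# (You can also run `watch -n 1 nvidia-smi` in a terminal while training.)

import torch
print("PyTorch sees CUDA:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))

import tensorflow as tf
print("TensorFlow GPUs:", tf.config.list_physical_devices("GPU"))
```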

I was thinking about a more general case. For example, when I am in Jupyter using TensorFlow, it took 20s for one epoch, while the same thing on a PC took about 3s.

Ahhh sorry… Now I get it… but this is the way you can think about it:

To know where the impact is, you need to calculate the “computations” for each layer, and that depends on several properties: the number of layers, the batch size, and the number of epochs. Do you have this information at hand?

  • How many layers does your algorithm have?
  • What is the size of the dataset, and how many batches or what batch size (one determines the other)?
  • What are the properties of each layer (e.g. for a convolution: number of filters, kernel size, input and output dimensions)?

Only by knowing how the model attacks the data can you predict how intense the workload is.
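As a rough illustration, a framework's model summary already gives you most of these numbers. Here is a sketch with a hypothetical toy Keras convnet (the layer sizes are made up, just to show where the per-layer parameter counts come from):

```python
import tensorflow as tf

# Hypothetical toy convnet, only to illustrate where the workload comes from.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu", input_shape=(224, 224, 3)),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(64, (3, 3), activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Prints every layer with its output shape and parameter count;
# combined with the batch size and the number of epochs, this gives
# a rough idea of how heavy one training run will be.
model.summary()
```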

Also, do you have the same GPU in the notebook as in the desktop PC?
CUDA core counts differ between boards, and your notebook GPU does not exist as a desktop part (Nvidia M series).
You mentioned earlier that you have a GTX 960M in the notebook, which has only 640 CUDA cores (roughly like 640 threads, though not quite the same thing).

What is the model of your GPU in the desktop, then?

Note: TensorFlow is a totally different product from PyTorch. To compare them, you need to implement the same algorithmic problem in both and benchmark them on the same machine.
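A minimal sketch of what such a benchmark could look like on the TensorFlow side (the matrix size and repetition count here are arbitrary; you would reimplement exactly the same operation in PyTorch and time it on the same machine):

```python
import time
import tensorflow as tf

# Arbitrary workload: repeated large matrix multiplications.
N, REPS = 2048, 50
a = tf.random.normal((N, N))
b = tf.random.normal((N, N))

# Warm-up run so one-time setup (kernel launch, transfers) is not timed.
_ = tf.matmul(a, b)

start = time.perf_counter()
for _ in range(REPS):
    c = tf.matmul(a, b)
_ = c.numpy()  # pull the result back to host so all work has finished
elapsed = time.perf_counter() - start
print(f"{REPS} matmuls of {N}x{N}: {elapsed:.2f}s")
```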

It’s way too low-end now…
I also have the same one, so I know…
(Better off in the junkyard, it’s hardly a GPU at all…)

Hi Bart,
IMO, if you don’t have any GPU PC for training your model, you can use Google Colab instead. You get a K80 graphics card in the cloud for free, and it’s better than your GTX 960M.