Old GPU useless?


I have quite an old laptop, but with a separate GPU.
I managed to install CUDA and everything, but when I run the first training, the GPU runs out of memory.
Is there any way I can change the parameters to get it to run at all or is it not worth bothering?
The GPU is GeForce GT 730M with 1 GB of memory.


IMHO, it is not worth bothering… you want a minimum of 8 GB of GPU memory for this course…

agreed - nothing below 8GB is worth your time. Depending on your resources and requirements, it might be worth doing everything that doesn’t involve model training on your local machine first, and only moving up to the cloud for the training itself.
To give you an idea, my laptop has a built-in Quadro M1000M with 4GB of memory. I was able to train the language model from the imdb example, with each cycle taking 5hr 40min. When I got to the classification part, the GPU memory simply could not handle the fully unfrozen model, even with a batch size of 2.
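One trick worth knowing when batch size is the bottleneck: gradient accumulation lets a small-memory GPU approximate a larger batch by summing (size-weighted) micro-batch gradients before each weight update. This is not part of the original discussion, just a framework-agnostic sketch of the idea using plain NumPy and a toy least-squares model; all names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 4))   # the "full batch" of 32 samples
y = rng.normal(size=32)
w = np.zeros(4)

def grad(Xb, yb, w):
    # gradient of the mean squared error 0.5 * mean((Xb @ w - yb)**2)
    return Xb.T @ (Xb @ w - yb) / len(yb)

# full-batch gradient: what a large-memory card computes in one go
g_full = grad(X, y, w)

# accumulated gradient over micro-batches of 8: what a small card can fit.
# Each micro-batch gradient is weighted by its share of the full batch,
# so the summed result matches the full-batch gradient exactly.
micro = 8
g_acc = np.zeros_like(w)
for i in range(0, len(y), micro):
    g_acc += grad(X[i:i + micro], y[i:i + micro], w) * (micro / len(y))

assert np.allclose(g_full, g_acc)  # same update, far less memory per step
```

The update is mathematically identical to the full-batch one; the only cost is more forward/backward passes per optimizer step, so training is slower but fits in less memory. Most frameworks support this pattern directly (in PyTorch, by calling `backward()` on several micro-batches before a single `optimizer.step()`).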
My company has just committed to Azure, so I have to operate in that environment, and it is quite expensive. As a result, I do all my coding and pre-processing on my local machine and only switch to Azure when I actually train my model. I’m finding this to be the most cost-efficient way for my setup.