I am trying to run the notebook from the second lesson on the sample data. I have a GTX 960M with 1 GB of memory and 8 GB of main RAM.
This code gives me a MemoryError:
from vgg16 import Vgg16
vgg = Vgg16()
model = vgg.model
Error allocating 411041792 bytes of device memory (out of memory).
Is this error about GPU memory or main memory?
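The phrase "device memory" in the error points at the GPU rather than main RAM: in CUDA terminology, device memory is the card's own VRAM. A quick bit of arithmetic (assuming the "1 Gb" in the post means 1 GiB of VRAM, and treating the reported byte count at face value) shows why a 1 GB card struggles with this single allocation:

```python
# Size of the single buffer the framework tried to allocate on the GPU,
# taken from the error message.
alloc_bytes = 411041792

# Convert to MiB (1 MiB = 2**20 bytes).
alloc_mib = alloc_bytes / 2**20
print(alloc_mib)  # 392.0 MiB

# Assumption: the card has 1 GiB of VRAM total. The display and the CUDA
# context already consume a slice of that, so the free amount is smaller.
gpu_total_mib = 1024
print(alloc_mib / gpu_total_mib)  # this one buffer alone is ~38% of total VRAM
```

So one 392 MiB allocation, on top of whatever the desktop and CUDA runtime already hold, can plausibly exhaust a 1 GB card before any image data is even loaded.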
I couldn’t find the nvidia-smi command on macOS, though I have CUDA installed and working. What is the way to check GPU memory consumption on macOS?
And the last question: will it be possible to do the course exercises on the sample data with this machine? I am OK with running a p2 instance to fit the whole dataset, but currently I spend a lot of time just figuring out how something works in Python.
@jeremy, I will try cuda-smi today, thank you for the link
@maral, thank you for the note on the external monitor - I indeed use one. batch_size has no effect because I encounter the error when instantiating the model, before ever loading data into it. I guess the pretrained weights don’t fit in the memory I have. I will verify with cuda-smi.
The batch_size modification will not take effect until you restart the kernel in Jupyter. I’m running the notebooks locally on both Windows and Mac machines and was able to run things fine with batch_size = 8.
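A smaller batch_size helping makes sense: the model's weights are a fixed cost, but activation memory scales roughly linearly with batch size. A minimal sketch of that scaling, counting only VGG16's 224×224×3 float32 input tensor (the real per-batch cost is larger, since intermediate layer activations dominate; the function name here is illustrative, not from the course code):

```python
def input_batch_mib(batch_size):
    """Rough size in MiB of one input batch for VGG16:
    batch_size x 3 channels x 224 x 224 pixels, 4 bytes per float32."""
    return batch_size * 3 * 224 * 224 * 4 / 2**20

# Shrinking the batch from the default 64 down to 8 cuts this
# per-batch cost by the same factor of 8.
print(round(input_batch_mib(64), 2))  # 36.75 MiB
print(round(input_batch_mib(8), 2))   # 4.59 MiB
```

This is why restarting the kernel matters: until the old process is gone, the GPU memory claimed under the previous batch size is not released.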