Hello all, I’m going through this MOOC on a laptop with a GTX 960 GPU and 16GB of RAM. I got through lesson 1 by dropping the batch size to 10, but hit a wall in lesson 2 when running the get_data helper function on the training set. get_data works on the validation set, but the result takes up about 5GB of RAM, so I saved the validation data with bcolz and deleted the variable to free some memory. The training set, however, still eats through all 16GB of RAM plus 8GB of swap before my machine hangs and I have to REISUB and restart.
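In case it's useful context, here is the kind of workaround I've been considering instead of building the whole training array in memory: writing the preprocessed data to a disk-backed numpy memmap in chunks, so peak RAM stays at one chunk. This is just a sketch with made-up names and a toy image shape, not the course's actual get_data:

```python
import numpy as np

def save_in_chunks(fname, n_images, chunk_iter, shape=(224, 224, 3)):
    """Write preprocessed images to a disk-backed array chunk by chunk.

    `chunk_iter` yields (start_index, ndarray_of_images) pairs, so only
    one chunk is ever resident in RAM at a time.
    """
    mm = np.memmap(fname, dtype='float32', mode='w+',
                   shape=(n_images,) + shape)
    for start, chunk in chunk_iter:
        mm[start:start + len(chunk)] = chunk
    mm.flush()
    return mm

# Toy usage: 10 fake "images" written in chunks of 4
def chunks():
    data = np.ones((10, 224, 224, 3), dtype='float32')
    for i in range(0, 10, 4):
        yield i, data[i:i + 4]

arr = save_in_chunks('/tmp/train_data.dat', 10, chunks())
```

Reading it back with `np.memmap(fname, dtype='float32', mode='r', shape=...)` also avoids pulling the full set into RAM at once.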
Watching htop, I see that only one of my four cores runs at 100% while get_data is running. Based on this thread, Numpy core affinity, I thought importing NumPy and SciPy might be messing with the core affinity, but that issue was supposedly resolved in newer versions, and I have the latest NumPy and SciPy installed.
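For anyone who wants to rule the affinity bug in or out, here is the quick check I ran (Linux-only, since `os.sched_getaffinity` isn't available elsewhere):

```python
import os
import multiprocessing

# Import numpy first, then inspect affinity: the old bug from the
# linked thread would shrink the allowed-core set at import time.
import numpy as np  # noqa: F401

allowed = os.sched_getaffinity(0)   # cores this process may run on
total = multiprocessing.cpu_count()
print(f'{len(allowed)} of {total} cores usable')

if len(allowed) < total:
    # Restore affinity so all cores are usable again
    os.sched_setaffinity(0, range(total))
```

In my case all four cores show as usable after importing NumPy, so the single pegged core seems to be get_data itself being single-threaded rather than an affinity problem.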