Notebook kernel dies getting MNIST std deviation

I’m trying to get part 2 running on my local machine, which is cheap and weedy and doesn’t even have a GPU (I was hoping to get a basic library actually running locally before I push it to Colab and give it a beating).

The notebook kernel dies every time I try to compute x_train.std(), and I can’t see anything that gives me a clue why, either.

Any ideas?

Maybe n=0?
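For what it’s worth, an empty array on its own wouldn’t normally kill a kernel: in numpy at least, std() over zero elements just emits a RuntimeWarning and returns nan. A minimal stand-alone check (not Joe’s actual data):

```python
import numpy as np

# std of an empty array doesn't crash anything -- it warns
# and returns nan, so n=0 by itself shouldn't be fatal
empty = np.array([])
print(empty.std())  # nan (with a RuntimeWarning)
```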

Does x_train.std?? show what it should?

Infs or NaNs in x_train?
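That suggestion is easy to test directly. A sketch, using a random stand-in array for the real x_train:

```python
import numpy as np

# hypothetical stand-in for the real x_train, just to show the checks
x_train = np.random.rand(5000, 784).astype(np.float32)

print(np.isnan(x_train).any())   # any NaNs?
print(np.isinf(x_train).any())   # any infs?
print(np.isfinite(x_train).all())  # covers both in one pass
```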

n = 5000

All the sanity checks passed until I tried to normalise, so I think the data is fine. Also, x_train.mean() works, which (I think) rules out infs or NaNs.

I wonder if it’s running out of memory, but it’s not telling me anything useful and there don’t seem to be any logs.
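One way to sanity-check the memory theory without any logs is to just do the arithmetic. A sketch with an MNIST-ish shape as a stand-in (adjust to the real n and dtype):

```python
import numpy as np

# MNIST-ish stand-in shape; swap in the real n and dtype
x_train = np.zeros((50_000, 784), dtype=np.float32)

print(f"x_train itself: {x_train.nbytes / 2**20:.0f} MiB")
# std() has to form the deviations (x - mean) internally, so peak
# memory during the call can be a small multiple of nbytes -- and
# if the array is float64 rather than float32, double everything
```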

I’ve hacked around it for now by “normalising” with /= 255, but that’s poop and I’d like to fix it properly.
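If it really is memory, one way to normalise properly without ever materialising a full-size temporary is a two-pass mean/std over row chunks. A sketch, untested against the actual notebook:

```python
import numpy as np

def chunked_std(x, chunk=10_000):
    """Two-pass population std (ddof=0, like np.std's default),
    computed over row chunks so the largest temporary is one chunk
    rather than a whole copy of the array."""
    n_elems = x.size
    total = sum(x[i:i + chunk].sum(dtype=np.float64)
                for i in range(0, len(x), chunk))
    mean = total / n_elems
    ss = sum(((x[i:i + chunk].astype(np.float64) - mean) ** 2).sum()
             for i in range(0, len(x), chunk))
    return np.sqrt(ss / n_elems)
```

Accumulating in float64 also sidesteps the precision loss you can get from summing many float32 values.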

Hey Joe,

We can’t tell without seeing the code and the machine specs, but I tend to agree with your suspicion that it’s memory-related. Could you try it again with 10 images, then keep increasing the count while watching RAM usage, and see whether that’s the case?
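Something like this sketch would show whether peak memory grows with n (resource is Unix-only, and ru_maxrss units differ between Linux and macOS; the data here is a random stand-in for the real x_train):

```python
import resource

import numpy as np

x_full = np.random.rand(50_000, 784).astype(np.float32)  # stand-in for x_train

for n in (10, 100, 1_000, 10_000, 50_000):
    s = x_full[:n].std()
    # peak resident set size of this process so far (KiB on Linux)
    peak_kib = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    print(f"n={n:>6}  std={s:.4f}  peak RSS so far ~ {peak_kib} KiB")
```

If the kernel dies somewhere on the way up, the last printed n tells you roughly where the limit is.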