Why does my computer sometimes crash while running code in Jupyter Notebook?

I have a 1080 Ti in my computer and it still crashes sometimes when I’m going through a big dataset. Once it ran a cell properly, and after that the mouse started moving slower and the machine just crashed. By crashing I mean my mouse starts moving very slowly and eventually stops moving at all. My OS is Ubuntu Linux. Is this normal, or am I doing something wrong? One time I noticed that the cell I ran was only using one core and it still crashed.

If you’re reading a large dataset into memory (RAM) then you could definitely experience the issues you’re seeing. Try running the htop command in your terminal the next time it happens and see if the RAM is full.
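If you’d rather check from inside the notebook instead of a terminal, a quick sketch using the third-party psutil package (an extra install, not something Jupyter ships with) could look like this:

```python
import psutil  # third-party: pip install psutil

# Snapshot of system-wide memory usage, same numbers htop shows
mem = psutil.virtual_memory()
print(f"RAM used: {mem.used / 1e9:.1f} GB of {mem.total / 1e9:.1f} GB ({mem.percent}%)")
```

If that percentage climbs toward 100% while your cell runs, you’re most likely running out of RAM.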

How can I fix this problem? How can I read the dataset from somewhere other than RAM?

Don’t load the whole dataset into RAM at once; you can, for example, load one mini-batch at a time with an iterator. A sketch of what that could look like is below.
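Here is a minimal sketch of such an iterator, assuming you have a list of image file paths and use Pillow to open them (the batch size and helper name are just examples, not anything from your notebook):

```python
from PIL import Image

def iter_batches(image_paths, batch_size=32):
    """Yield lists of images batch_size at a time instead of loading everything into RAM."""
    for start in range(0, len(image_paths), batch_size):
        batch_paths = image_paths[start:start + batch_size]
        images = [Image.open(p) for p in batch_paths]
        yield images
        # Once the caller has processed this batch, close the images so the memory can be freed
        for img in images:
            img.close()

# Usage: only one batch of images is held in memory at any time
# for batch in iter_batches(paths):
#     process(batch)
```

This way peak memory is roughly one batch rather than the whole dataset.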


If I load, let’s say, 1,000 images in my Jupyter notebook, does it keep using the same amount of RAM until I shut down the kernel? So should I shut down the kernel once I stop working with that notebook and start doing something else?

It depends: if you close the images when you’re done with them, the memory should be freed. But it’s probably better to shut down the kernel so that all of the RAM is freed. Anything you need to keep, like model weights, you can save to disk and load back later.
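If you happen to be using PyTorch (just an assumption, any framework has an equivalent), saving weights before shutting down and restoring them in a fresh kernel looks roughly like this:

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)  # stand-in for whatever model you actually trained

# Save the weights to disk before shutting down the kernel
torch.save(model.state_dict(), "weights.pth")

# Later, in a fresh kernel: rebuild the same architecture and load the weights back
model = nn.Linear(10, 2)
model.load_state_dict(torch.load("weights.pth"))
```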

Thanks!