Lesson 2 notebook = memory hog?

For those having this problem, we need to figure out how your setup is different from those that work; then it shouldn't be too hard to fix.

To start with, could you try installing a fresh copy of Anaconda from scratch (you can just move your current one out of the way) and see if it still happens?
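
If it helps, a minimal sketch of that (the ~/anaconda3 path and installer filename are assumptions; adjust for your install location and platform):

```
# Move the current install aside rather than deleting it,
# so you can restore it if the fresh copy behaves the same way.
mv ~/anaconda3 ~/anaconda3.bak

# Download a fresh installer from https://www.anaconda.com/download
# and run it (the filename varies by version and platform).
bash ~/Downloads/Anaconda3-*-Linux-x86_64.sh
```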

I was able to run lesson 2 on my rig, and it looked like it took around 8GB to process.

I updated both pip and conda in my environment and am using the latest fast.ai code.

The memory will go down when I terminate the notebook.
Use the following command
ps -eo pmem,pcpu,rss,vsize,args | sort -k 1 -r | less

to hunt for processes that are using a lot of memory.
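
For example, to look just at Jupyter-related processes (the grep pattern is only a guess at how your kernels show up in the process list):

```
# pmem/pcpu are percentages; rss is resident memory in KB, vsize is virtual size.
# Sorting on column 1 puts the biggest memory users on top.
ps -eo pmem,pcpu,rss,vsize,args | sort -k 1 -r | grep -i jupyter
```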

It's normal, but I'd also like Jeremy to respond on whether there is a better way to manage memory.

Ah yes, one thing I noticed: Jupyter doesn't release the memory at the end unless you shut the notebook down manually.
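
If you want to free that memory from the command line, here's a minimal sketch (assuming a reasonably recent Jupyter; matching on ipykernel is just a heuristic for leftover kernel processes):

```
# list notebook servers that are still running
jupyter notebook list

# stop leftover kernel processes that are still holding memory
pkill -f ipykernel
```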

I just did a git pull, followed by a conda env update ahead of tonight’s class.
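
For anyone doing the same, this was the whole sequence (a sketch; it assumes the repo was cloned to ~/fastai):

```
cd ~/fastai        # wherever you cloned the course repo
git pull           # grab the latest code
conda env update   # sync the conda environment with environment.yml
```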

First, my original notebooks didn't work: they generated a "No module named 'fastai.imports'" error. Looking at an older version, could it be that the symlink to the fastai directory is missing from the /dl1 directory?
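
If the symlink is indeed missing, a sketch of recreating it, assuming the standard repo layout where the fastai library sits two levels above courses/dl1:

```
cd ~/fastai/courses/dl1      # adjust to your clone location
ln -s ../../fastai fastai    # recreate the link to the fastai library
```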

Second, and more serious for my system, I get this strange error in the second cell of lesson1.ipynb regarding libmkl_intel_lp64.so.

Try conda update --all
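
If that doesn't fix the libmkl_intel_lp64.so error, one common workaround is to reinstall MKL and make sure the loader can find Anaconda's libraries (a sketch assuming the default ~/anaconda3 location):

```
# reinstall Intel MKL in case the package is broken
conda install mkl

# point the dynamic loader at Anaconda's lib directory
export LD_LIBRARY_PATH="$HOME/anaconda3/lib:$LD_LIBRARY_PATH"
```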
