Classification Interpretation runs out of CPU memory

Hi!

I am running a notebook with fastai v2. The training step goes smoothly, but when I try to get results with ClassificationInterpretation.from_learner(learn), the process is killed because it runs out of CPU RAM.

Any ideas why this is happening?

Thanks!


Without more information, no.
In general, people can't help without knowing the kind of problem you are trying to solve and seeing the code you're running.

Hi @sgugger,

Apologies for the brief message. Here's a bit more context:

I downloaded this dataset from Zenodo. Specifically, I am working with the CRC_* folders.

I ran the following code:

from fastai2.basics import *
from fastai2.vision.all import *
from fastai2.callback.all import *

np.random.seed(2)

batch_tfms = [*aug_transforms(size=224, max_warp=0), Normalize.from_stats(*imagenet_stats)]
item_tfms = RandomResizedCrop(224, min_scale=0.75, ratio=(1., 1.))
bs = 32

path = '~/DeepLearning/Datasets/COAD_MSI/'
data = ImageDataBunch.from_folder(path, batch_tfms=batch_tfms,
                                  item_tfms=item_tfms, bs=bs, seed=2)

opt_func = partial(ranger, mom=0.9, sqr_mom=0.99, eps=1e-6, beta=0.)
learn = cnn_learner(data, xse_resnext50, pretrained=False, wd=1e-2, opt_func=opt_func,
                    metrics=[accuracy, Precision(), Recall(), RocAuc()],
                    cbs=MixUp()).to_fp16()

lr = 1e-2
learn.fit_one_cycle(20, lr, wd=1e-2,
                    cbs=[ShowGraphCallback(),
                         SaveModelCallback(monitor='accuracy', fname='best_mixup_ranger_seRxN50')])

Then, in a new session, after re-running all the necessary imports and variables, I loaded the model with the best accuracy:

learn = learn.load('best_mixup_ranger_seRxN50')

And tried to use ClassificationInterpretation:

interp = ClassificationInterpretation.from_learner(learn)

I am using Psensor to monitor GPU and CPU usage and, as ClassificationInterpretation runs, I can see the free memory drop. When it reaches 0%, the notebook restarts. I also tried it from the Python console and found the same behaviour.

Hope this is clearer now. Again, sorry for the previous post.
Cheers!

I'm guessing your dataset doesn't fit in RAM: the interpretation needs the inputs, targets and predictions on top of the losses for the whole validation set, so it can be a lot. You should try it on a partial subset.
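
If it helps, here is a minimal sketch of what running the interpretation on a subset could look like. It assumes your fastai2 version exposes learn.dls, supports test_dl(..., with_labels=True) and accepts a dl argument in from_learner; the sample size of 1,000 and the use of the validation items are just illustrative choices.

import random

valid_items = list(learn.dls.valid_ds.items)                       # validation image paths
sample = random.sample(valid_items, min(1000, len(valid_items)))   # keep a manageable sample

subset_dl = learn.dls.test_dl(sample, with_labels=True)            # labelled DataLoader over the sample
interp = ClassificationInterpretation.from_learner(learn, dl=subset_dl)
interp.plot_confusion_matrix()

If your version doesn't take dl=, another option is to rebuild the DataBunch from a smaller sample of the image folders and create the learner from that before calling from_learner.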