Memory error running ClassificationInterpretation on Kaggle

I briefly searched the forum but haven't found exactly this issue, even though there are plenty of memory-related threads.

I'm doing image classification in a Kaggle notebook; my dataset is 3 GB of images across 100 classes, and I'm using the GPU.

I can run this code successfully:

from fastai.vision.all import *
dls = ImageDataLoaders.from_df(df=dataFrame, path=pathToTrainImages, item_tfms=Resize((300, 200)), valid_pct=0.2, seed=42, bs=8)
learn = cnn_learner(dls, resnet34, metrics=accuracy)
learn.fit_one_cycle(1)

But when I try to interpret the results with

interp = ClassificationInterpretation.from_learner(learn)

the notebook restarts with this error:

Your notebook tried to allocate more memory than is available. It has restarted.

Any advice on how to solve this is warmly welcome, and apologies if this duplicates an existing thread.
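For context, what I ultimately want from the interpretation is mainly the confusion matrix. I understand `ClassificationInterpretation.from_learner` gathers predictions for the whole validation set at once, which may be what exhausts memory. A batch-by-batch accumulation like the sketch below (plain NumPy, with the fastai prediction calls omitted and the batch shapes assumed) would avoid holding everything in memory, but I'd prefer to use `ClassificationInterpretation` directly if possible:

```python
import numpy as np

def incremental_confusion_matrix(batches, n_classes):
    """Accumulate a confusion matrix one batch at a time,
    so all predictions never have to sit in memory together."""
    cm = np.zeros((n_classes, n_classes), dtype=np.int64)
    for preds, targets in batches:
        # preds: (bs, n_classes) class scores; targets: (bs,) int labels
        pred_labels = preds.argmax(axis=1)
        # Count each (true, predicted) pair, handling repeated indices
        np.add.at(cm, (targets, pred_labels), 1)
    return cm

# Toy usage with fake batches (in practice these would come from
# iterating over learn.dls.valid and running the model per batch):
rng = np.random.default_rng(0)
batches = [(rng.random((8, 3)), rng.integers(0, 3, 8)) for _ in range(4)]
cm = incremental_confusion_matrix(batches, 3)
print(cm.sum())  # 32 samples total across 4 batches of 8
```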
