Cannot plot confusion matrix in Colab Pro - out-of-RAM error and runtime crashes

I have trained a ResNet50 architecture on 80,000+ images, each of resolution 576px (resized to 256px). This caused some problems during data loading, but the overall training process was seamless.

However, I cannot plot a confusion matrix. The Colab runtime crashes every time, showing a message that it crashed after running out of RAM.

I am declaring the Learner object and loading the weights from a .pth file.

And then I run:

interp = ClassificationInterpretation.from_learner(learn)


This cell never runs successfully.

It always gets stuck at about 79%. And yes, I am using a “High-RAM” instance.



Before the error is shown, everything looks normal, like any other run of plotting a CM.

Since Colab did not run into problems during training but does run into problems while plotting the CM, I think this is a problem with fastai rather than with Colab. Any advice on how to work around it?

The problem is likely that `ClassificationInterpretation.from_learner` internally calls `learn.get_preds(..., with_input=True, ...)`, which stores a rather big array (the inputs for the whole validation set) in memory.
One workaround here can be:

  1. Call `learn.get_preds` manually, setting `with_input=False`.
  2. Construct the interpretation object like `ClassificationInterpretation(inputs=[], *get_preds_outputs)` - this WILL BREAK `plot_top_losses`, but you’ll be able to use the confusion matrix and other functionality that doesn’t use inputs.
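To see why dropping the inputs is safe for the confusion matrix: once you have the decoded predictions and the targets from `get_preds`, the matrix itself is cheap to compute, even without `ClassificationInterpretation`. A minimal NumPy sketch (the array names here are illustrative stand-ins for `get_preds` outputs, not fastai API):

```python
import numpy as np

def confusion_matrix(decoded, targs, n_classes):
    """Accumulate a confusion matrix from decoded predictions and targets.

    Needs only two int arrays of length n_samples -- no stored inputs --
    so memory stays tiny regardless of image resolution.
    """
    cm = np.zeros((n_classes, n_classes), dtype=np.int64)
    # rows = actual class, cols = predicted class
    np.add.at(cm, (targs, decoded), 1)
    return cm

# Toy stand-in for get_preds outputs: preds are class probabilities,
# decoded = argmax over classes, targs are the ground-truth labels.
preds = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4], [0.3, 0.7]])
decoded = preds.argmax(axis=1)          # -> [0, 1, 0, 1]
targs = np.array([0, 1, 1, 1])
print(confusion_matrix(decoded, targs, n_classes=2))
# [[1 0]
#  [1 2]]
```

The same two arrays also cover per-class precision/recall, which is most of what the interpretation object is used for besides `plot_top_losses`.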


Alternatively:

  1. Call `learn.get_preds` with a `dl` constructed from a sample of your validation set, and `with_input=True`.
  2. Construct the interpretation object from the `get_preds` outputs.

This one is probably more useful if you use a sample of reasonable size and think about class representation.
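Picking the sample with class representation in mind amounts to stratified sampling over the validation labels. A sketch in plain NumPy (the helper name and `per_class` parameter are my own; the resulting indices would then be used to build the smaller `dl` from the corresponding validation items in whatever way fits your pipeline):

```python
import numpy as np

def stratified_sample(labels, per_class, seed=0):
    """Pick up to `per_class` indices for every class, so that rare
    classes still show up in the sampled confusion matrix."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    picked = []
    for c in np.unique(labels):
        idx = np.flatnonzero(labels == c)          # all items of class c
        k = min(per_class, idx.size)               # rare class: take them all
        picked.append(rng.choice(idx, size=k, replace=False))
    return np.sort(np.concatenate(picked))

# Toy labels: class 0 is heavily over-represented.
labels = [0] * 90 + [1] * 8 + [2] * 2
sample = stratified_sample(labels, per_class=5)
# -> 5 indices of class 0, 5 of class 1, and both items of class 2
```

A plain random subsample of an imbalanced validation set could easily miss a rare class entirely, which would leave empty rows in the confusion matrix.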

A much more elaborate solution would be to customize `GatherPredsCallback` to write the inputs to disk, and to make the `Interpretation` class work with files streamed from disk.
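A rough sketch of that disk-backed idea (illustrative only, not fastai's actual callback machinery): the callback side streams each batch of inputs into a preallocated `np.memmap` instead of keeping them in RAM, and the interpretation side re-opens the file read-only and slices out just the few items it needs, e.g. for `plot_top_losses`:

```python
import numpy as np, os, tempfile

n_items, item_shape = 8, (3, 4, 4)   # tiny stand-in for (n_valid, C, H, W)
path = os.path.join(tempfile.mkdtemp(), "inputs.dat")

# "Callback" side: write batches straight to disk as they are gathered.
store = np.memmap(path, dtype=np.float32, mode="w+",
                  shape=(n_items, *item_shape))
for start in range(0, n_items, 4):                     # batch size 4
    batch = np.random.rand(4, *item_shape).astype(np.float32)
    store[start:start + 4] = batch
store.flush()
del store                                              # drop the in-RAM handle

# "Interpretation" side: re-open read-only and pull single items on demand,
# never materializing the full input tensor in memory.
inputs = np.memmap(path, dtype=np.float32, mode="r",
                   shape=(n_items, *item_shape))
one_item = np.array(inputs[3])                         # copies one item into RAM
```

With 80,000 images at 256px this is the only variant that keeps the full `plot_top_losses` behaviour without holding all decoded inputs in RAM at once.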