Bug with plot_top_losses

I think there might be a bug in the plot_top_losses() function.

Some plots from plot_top_losses() looked weird, so I decided to try the Grad-CAM function implemented by @quan.tran. Here are the results:
From plot_top_losses():
[image: heatmap from plot_top_losses()]

And from @quan.tran:
[image: heatmap from @quan.tran's implementation]

For some reason, plot_top_losses() didn’t overlay any heatmap colors on this specific image. Any ideas why? This happens with other images as well.

And here’s another example where plot_top_losses() does assign colors, but they differ from quan’s:
First, from plot_top_losses():
[image: heatmap from plot_top_losses()]
And from @quan.tran's implementation:
[image: heatmap from @quan.tran's implementation]


For some reason there is a threshold. Try setting heatmap_thresh to 0 and see if that changes anything?
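Something like this, assuming fastai v1 and that you already have a trained Learner (the `learn` variable is a stand-in; parameter names are as I remember them from fastai v1):

```python
from fastai.vision import ClassificationInterpretation

# `learn` is assumed to be an existing trained Learner
interp = ClassificationInterpretation.from_learner(learn)

# heatmap_thresh=0 means the feature-map size check always passes
interp.plot_top_losses(9, heatmap=True, heatmap_thresh=0)
```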

The plot_top_losses function in fastai is a direct implementation of the 2017 Grad-CAM paper. In my implementation, I made a small change in the final step: instead of ReLU((activation*grad).sum(0)), I just take mean(activation*grad) for simplicity. So fastai's Grad-CAM highlights all the spots that contribute to the prediction (sparse but dense heat areas), while my Grad-CAM still shows those spots, but they are spread out a bit more. My heatmap is also brighter than fastai's because I set alpha high (0.8) in ax.imshow (it's 0.4 in fastai's). There is really no bug here, just some differences in implementation.
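To make the difference concrete, here is a minimal sketch of just the two final-step reductions. The hooked activations and gradients are faked with random tensors in the shapes ResNet34's last conv block produces; this is not the exact fastai or notebook code:

```python
import torch
import torch.nn.functional as F

# Stand-ins for the activations of the last conv block and their
# gradients w.r.t. the predicted class score, shape (channels, h, w)
acts = torch.randn(512, 7, 7)
grads = torch.randn(512, 7, 7)

# fastai / Grad-CAM paper: weight each channel by its mean gradient,
# sum over channels, then ReLU to keep only positive evidence
grad_chan = grads.mean(dim=(1, 2))                             # (512,)
cam_fastai = F.relu((acts * grad_chan[:, None, None]).sum(0))  # (7, 7)

# my variant: mean of the elementwise product over channels,
# no ReLU, so negative evidence isn't clipped and spots spread out
cam_quan = (acts * grads).mean(0)                              # (7, 7)
```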

By the way, heatmap_thresh is just there to make sure the inner CNN layer (where the gradients for the heatmap are calculated) is big enough to produce a heatmap. For example, in ResNet34 this layer's activations have shape (512, 7, 7), and since the default thresh = 16 is less than 49 (7*7), the heatmap will be calculated and plotted.
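A rough sketch of that gate, paraphrased from the behaviour described above rather than copied from fastai's source:

```python
# Hypothetical illustration of the heatmap_thresh check
acts_shape = (512, 7, 7)  # ResNet34's last conv activations for one image
heatmap_thresh = 16       # fastai's default

# Only compute/plot the heatmap if the spatial area of the feature map
# reaches the threshold: 7 * 7 = 49 >= 16, so the heatmap is drawn
if acts_shape[-1] * acts_shape[-2] >= heatmap_thresh:
    print("feature map is large enough: heatmap will be calculated")
```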
