It took a very long time to calculate the confusion matrix with 3000 classifications. Converting from Tensor to NumPy arrays reduced the calculation time to about 1/3.
Tensor
```python
import time

t1 = time.time()
interp.confusion_matrix()
t2 = time.time()
elapsed_time = t2 - t1
print(f"Elapsed time (sec): {elapsed_time}")
```

Output:

```
Elapsed time (sec): 982.7583639621735
```
Numpy
```python
import time
import numpy as np

t1 = time.time()
DATA = interp.data
PRED_CLASS = interp.pred_class.cpu().numpy()
Y_TRUE = interp.y_true.cpu().numpy()
# x = torch.arange(0, DATA.c)
x = np.arange(0, DATA.c)
slice_size = 1
# cm = torch.zeros(DATA.c, DATA.c, dtype=x.dtype)
cm = np.zeros((DATA.c, DATA.c), dtype=x.dtype)
for i in range(0, Y_TRUE.shape[0], slice_size):
    cm_slice = ((PRED_CLASS[i:i+slice_size] == x[:, None])
                & (Y_TRUE[i:i+slice_size] == x[:, None, None])).sum(2)
    np.add(cm, cm_slice, out=cm)
# to_np(cm)
t2 = time.time()
elapsed_time = t2 - t1
print(f"Elapsed time (sec): {elapsed_time}")
```

Output:

```
Elapsed time (sec): 341.22110533714294
```
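For anyone who wants to try the broadcasting trick without a trained fastai model, here is a minimal standalone sketch using random labels in place of `interp.pred_class` / `interp.y_true` (the class count and data are made up for illustration), with a plain counting loop as a cross-check:

```python
import numpy as np

# Hypothetical stand-in for interp.pred_class / interp.y_true:
# 3000 random predictions and true labels over 5 classes.
n_classes = 5
rng = np.random.default_rng(0)
y_true = rng.integers(0, n_classes, size=3000)
pred_class = rng.integers(0, n_classes, size=3000)

# Same broadcasting trick as above: for each slice, build a
# (n_classes, n_classes, slice_size) boolean cube and sum it.
x = np.arange(n_classes)
slice_size = 1
cm = np.zeros((n_classes, n_classes), dtype=np.int64)
for i in range(0, y_true.shape[0], slice_size):
    cm_slice = ((pred_class[i:i+slice_size] == x[:, None])
                & (y_true[i:i+slice_size] == x[:, None, None])).sum(2)
    np.add(cm, cm_slice, out=cm)

# Cross-check: rows are true classes, columns are predicted classes.
cm_ref = np.zeros_like(cm)
for t, p in zip(y_true, pred_class):
    cm_ref[t, p] += 1
assert (cm == cm_ref).all()
assert cm.sum() == 3000
```

Note that `slice_size` trades memory for speed: larger slices build a bigger boolean cube per iteration but need fewer Python-level loop iterations.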