Learner metrics: accuracy works but MatthewsCorrCoef not

I am using fastai. This code works fine:

learner = Learner(dls, model, metrics = accuracy, loss_func = CrossEntropyLossFlat())
learner.fine_tune(epochs = 5, freeze_epochs = 5, base_lr = 10e-3)

But replacing accuracy with MatthewsCorrCoef …

learner = Learner(dls_CV, model, metrics = MatthewsCorrCoef, loss_func = CrossEntropyLossFlat())

I get this error:
TypeError: MatthewsCorrCoef() takes from 0 to 1 positional arguments but 2 were given

I do not understand this since I do not pass any arguments. How is this different from using accuracy?

accuracy is a plain function that directly accepts the model output and targets, while MatthewsCorrCoef must be called first to create the metric object. You can tell the difference by fastai's naming convention: lowercase names like accuracy are functions you pass as-is, whereas capitalized names like MatthewsCorrCoef are factories that return a metric instance, so they need parentheses. When you pass the bare factory, the Learner calls it with (predictions, targets), which is where the "0 to 1 positional arguments but 2 were given" error comes from.

This will work:

learner = Learner(dls_CV, model, metrics = MatthewsCorrCoef(), loss_func = CrossEntropyLossFlat())
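To see why the bare name fails, here is a minimal sketch of the pattern without fastai. The names `accuracy_like` and `MatthewsCorrCoefLike` are hypothetical stand-ins: one is a plain metric function, the other mimics a factory with the same `sample_weight=None` signature, so calling it with two positional arguments reproduces the exact TypeError:

```python
def accuracy_like(preds, targs):
    # Plain metric function: the Learner can call it directly as metric(preds, targs).
    return sum(p == t for p, t in zip(preds, targs)) / len(targs)

def MatthewsCorrCoefLike(sample_weight=None):
    # Hypothetical factory: takes 0 to 1 positional arguments and
    # returns the actual metric callable.
    def metric(preds, targs):
        return 0.0  # real computation omitted in this sketch
    return metric

preds, targs = [0, 1, 1], [0, 1, 0]

print(accuracy_like(preds, targs))  # works: it is already a metric

try:
    # Passing the bare factory means the Learner effectively does this:
    MatthewsCorrCoefLike(preds, targs)
except TypeError as e:
    print(e)  # ... takes from 0 to 1 positional arguments but 2 were given

metric = MatthewsCorrCoefLike()     # instantiate first, as in the fix above
print(metric(preds, targs))         # now callable just like accuracy_like
```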

Thanks for your explanation, it works!