What's the correct way to change the loss function?

I’m having trouble changing the default loss function in a cnn_learner.

I’ve seen this code on the internet:

class L1LossFlat(nn.L1Loss):
    "Mean absolute error, computed on flattened tensors."
    def forward(self, input: Tensor, target: Tensor) -> Rank0Tensor:
        return super().forward(input.view(-1), target.view(-1))
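The class above just flattens predictions and targets before computing mean absolute error. A dependency-free sketch of that same computation (flatten and l1_loss_flat here are illustrative helpers, not fastai code):

```python
def flatten(xs):
    # Recursively flatten nested lists, mimicking tensor.view(-1).
    out = []
    for x in xs:
        if isinstance(x, list):
            out.extend(flatten(x))
        else:
            out.append(x)
    return out

def l1_loss_flat(pred, target):
    # Mean absolute error over the flattened values.
    p, t = flatten(pred), flatten(target)
    return sum(abs(a - b) for a, b in zip(p, t)) / len(p)

# A (2, 2) "batch" of predictions vs. targets:
print(l1_loss_flat([[1.0, 2.0], [3.0, 4.0]], [[0.0, 2.0], [5.0, 4.0]]))  # 0.75
```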

To change the default loss function, that post did this:

learn = create_cnn(data, models.resnet34)
learn.loss = L1LossFlat

But I’ve discovered the loss_func attribute, and I’ve seen other code that does:

learn.loss_func = L1LossFlat()

When I use the first method and print learn.loss_func, it shows the default loss and not the one I want. However, no error is thrown.
My question is: are both methods equivalent? If I use the first one, will it train with the changed loss or with the default one?

I’m using the latest version of fastai on Colab.

You should pass loss_func in the call to cnn_learner, i.e.:

learn = cnn_learner(data, model, loss_func=MyLoss())

learn.loss = L1LossFlat

So this approach is wrong?

Yes. I don’t quite recall what loss is, but you should override loss_func.
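To see why the first approach fails silently: assigning to learn.loss just creates a new attribute that the training loop never reads. A minimal pure-Python sketch of the behavior (this Learner is a hypothetical stand-in, not fastai's actual class):

```python
# Hypothetical stand-in for a fastai-style Learner, for illustration only.
class Learner:
    def __init__(self, loss_func):
        self.loss_func = loss_func  # the attribute the training loop actually uses

    def training_step(self, pred, target):
        return self.loss_func(pred, target)

def default_loss(pred, target):
    # MSE-like default
    return sum((p - t) ** 2 for p, t in zip(pred, target))

def l1_loss(pred, target):
    # L1 / mean absolute error
    return sum(abs(p - t) for p, t in zip(pred, target))

learn = Learner(default_loss)

# Assigning to a name the class never reads raises no error...
learn.loss = l1_loss
print(learn.training_step([2.0], [0.0]))  # ...but still uses default_loss: 4.0

# Overriding loss_func actually changes what the training step computes:
learn.loss_func = l1_loss
print(learn.training_step([2.0], [0.0]))  # now 2.0
```

This is ordinary Python attribute behavior: setting an attribute the class doesn't use is perfectly legal, which is why no error was thrown.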