Is there a way to simply set the loss without defining a loss function?

So, some of the Hugging Face models will actually return the loss in addition to the predictions.

In such cases, is there a way in v2 to simply set the loss to that value without having to define a loss function (which isn’t necessary in this use case)?

Why not just make a dummy loss function?

I.e.:

def myLoss(inp, loss): return loss

How do you pass in the loss to that function?

Ahhh. Sorry, I missed the fact that there was a loss too (along with the outputs). It would be:

def HFLoss(inp, out, loss): return loss

*I think. I may do some small tests when I have time, but this should work IIRC.

But how do you pass the loss into that function? By default, fastai passes in the predictions and targets … nothing else.

I tried this, hoping that if I set self.learn.loss I wouldn’t need a loss function … but no glory:

class HF_TextGenModelCallback(HF_BaseModelCallback):  
    def after_pred(self): 
        # When 'labels' are passed in, the HF model returns (loss, logits)
        if ('labels' in self.xb[0]):
            # Setting learn.loss here doesn't stick: fastai overwrites it
            # when it calls loss_func right after this event
            self.learn.loss, self.learn.pred = self.pred[0], self.pred[1]
        else:
            self.learn.pred = self.pred[0]

SOLVED (well, at least a solution)

class HF_TextGenModelCallback(HF_BaseModelCallback):  
    def after_pred(self): 
        # When 'labels' are passed in, the HF model returns (loss, logits):
        # stash the loss on the callback and keep only the logits as pred
        if ('labels' in self.xb[0]):
            self.hf_loss, self.learn.pred = self.pred[0], self.pred[1]
        else:
            self.learn.pred = self.pred[0]

    def after_loss(self): 
        # Overwrite whatever the loss function produced with the
        # loss the HF model already computed
        self.learn.loss = self.hf_loss

And in your Learner, set loss_func=noop.
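For anyone following along: this works because (as I understand fastai v2's event order) the loop runs `after_pred`, then calls `loss_func`, then runs `after_loss`, so the callback can stash the model-computed loss and restore it after `loss_func` has run. Here is a minimal toy sketch of that ordering; `ToyLearner`, `ToyHFCallback`, and `run_batch` are made-up stand-ins for illustration, not fastai's real classes:

```python
# Toy illustration of the event ordering that makes the callback work.
# Only the order of events matters here, not the real fastai machinery.

def noop(x=None, *args, **kwargs):
    # Like fastcore's noop: ignores extra arguments, returns the first
    return x

class ToyHFCallback:
    def after_pred(self, learn):
        # Split the (loss, logits) tuple the HF model returned
        self.hf_loss, learn.pred = learn.pred[0], learn.pred[1]

    def after_loss(self, learn):
        # Restore the model-computed loss after loss_func has run
        learn.loss = self.hf_loss

class ToyLearner:
    def __init__(self, cbs, loss_func):
        self.cbs, self.loss_func = cbs, loss_func

    def run_batch(self, model_output, yb):
        # Forward pass already done; HF model returned (loss, logits)
        self.pred = model_output
        for cb in self.cbs: cb.after_pred(self)   # callback stashes HF loss
        # fastai always calls loss_func; with noop this is harmless
        self.loss = self.loss_func(self.pred, yb)
        for cb in self.cbs: cb.after_loss(self)   # callback overwrites loss
        return self.loss

learner = ToyLearner(cbs=[ToyHFCallback()], loss_func=noop)
final_loss = learner.run_batch(model_output=(0.42, "logits"), yb="targets")
print(final_loss)  # 0.42 -- the HF loss, not whatever noop returned
```

The key point is that setting the loss in `after_pred` alone is too early (the `loss_func` call clobbers it), while `after_loss` runs last and wins.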

I’ll have to ask the BDFL about maybe not requiring a loss_func (or at least not calling it if self.learn.loss is already set) once part 2 gets underway :slight_smile:

Thanks for the replies Zach!
