So, some of the Hugging Face models will actually return the loss in addition to the predictions.
In such cases, is there a way in fastai v2 to simply set the loss to that value without having to define a loss function (which isn’t necessary in this use case)?
class HF_TextGenModelCallback(HF_BaseModelCallback):
    def after_pred(self):
        # The model returns (loss, logits) when 'labels' were passed in,
        # otherwise just (logits,).
        if 'labels' in self.xb[0]:
            self.hf_loss, self.learn.pred = self.pred[0], self.pred[1]
        else:
            self.hf_loss = None
            self.learn.pred = self.pred[0]

    def after_loss(self):
        # Overwrite the (noop) loss with the one the model computed, if any.
        if self.hf_loss is not None:
            self.learn.loss = self.hf_loss
And in your Learner, set loss_func=noop.
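If it helps to see the flow end to end, here is a framework-free sketch of the same pattern. `FakeHFModel`, `LossFromModelCallback`, and `TinyLearner` are made-up stand-ins for the real transformers/fastai classes, just to show how the model-computed loss replaces the result of the noop loss function:

```python
def noop(*args, **kwargs):
    """Stand-in for fastai's noop: does nothing, returns None."""
    return None

class FakeHFModel:
    """Mimics an HF model: returns (loss, logits) when labels are supplied."""
    def __call__(self, batch):
        logits = [0.1, 0.9]
        if 'labels' in batch:
            return (0.25, logits)  # pretend the model computed its own loss
        return (logits,)

class LossFromModelCallback:
    """After the forward pass, stash the model-computed loss (if any)
    and overwrite whatever the loss function produced."""
    def __init__(self):
        self.hf_loss = None

    def after_pred(self, learn):
        if 'labels' in learn.xb[0]:
            self.hf_loss, learn.pred = learn.pred[0], learn.pred[1]
        else:
            self.hf_loss = None
            learn.pred = learn.pred[0]

    def after_loss(self, learn):
        if self.hf_loss is not None:
            learn.loss = self.hf_loss

class TinyLearner:
    """Just enough of a Learner to run one batch through the callback."""
    def __init__(self, model, loss_func, cb):
        self.model, self.loss_func, self.cb = model, loss_func, cb
        self.xb, self.pred, self.loss = None, None, None

    def one_batch(self, batch):
        self.xb = (batch,)
        self.pred = self.model(batch)
        self.cb.after_pred(self)
        self.loss = self.loss_func(self.pred)  # noop: leaves loss as None
        self.cb.after_loss(self)
        return self.loss

learn = TinyLearner(FakeHFModel(), noop, LossFromModelCallback())
print(learn.one_batch({'input_ids': [1, 2], 'labels': [3]}))  # 0.25
```

Without labels in the batch, the callback leaves `loss` as whatever the loss function returned (here `None`), so the pattern degrades gracefully at inference time.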
I’ll have to ask the BDFL about maybe not requiring a loss_func (or at least not calling it when self.learn.loss has already been set) once part 2 gets under way.