Multi-label label smoothing using callbacks

I have a multi-label vision classification task. However, my training data is potentially noisy and I believe some of the training labels are incorrect. I think label smoothing would help.

I tried to implement label smoothing in the multi-label case using a callback at the beginning of the batch:

class MultiLabelSmoother(Callback):
    def __init__(self, epsilon=0.05):
        self.epsilon = epsilon

    def begin_batch(self):
        self.learn.yb = (torch.where(self.y == 1., 1 - self.epsilon, self.epsilon),)

The idea is to alter the labels at the start of every batch, replacing them with their smoothed counterparts. For example, the label [1, 0, 0, 1] becomes [0.95, 0.05, 0.05, 0.95] before the loss is computed (note that the labels need not sum to 1 in the multi-label case). nn.BCEWithLogitsLoss then computes the loss from the predictions and the smoothed labels.
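The transformation itself can be checked in plain PyTorch, outside of any callback (a minimal sketch; the tensor values are just the example from above):

```python
import torch

epsilon = 0.05
y = torch.tensor([1., 0., 0., 1.])

# Replace hard labels with smoothed ones: 1 -> 1 - epsilon, 0 -> epsilon.
y_smooth = torch.where(y == 1., torch.tensor(1 - epsilon), torch.tensor(epsilon))
print(y_smooth)  # tensor([0.9500, 0.0500, 0.0500, 0.9500])

# BCEWithLogitsLoss accepts soft (non-binary) targets directly.
logits = torch.tensor([2.0, -1.5, -0.5, 1.0])
loss = torch.nn.BCEWithLogitsLoss()(logits, y_smooth)
```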

However, this seems to have no effect on training, and no errors are thrown. Is my code correct? Is it actually possible to alter the labels this way?

Thanks in advance for any help!

My guess is that you can alter the labels. However, there is no such event as begin_batch; you need to define before_batch instead (check this). That is why you don't see any effect on training.


I guess my textbook is out of date then! It says to use begin_batch. Now it makes sense why I am not seeing any effect. Thanks a lot for your help!

I tried using this callback, but it doesn't seem to change the batch labels. Is there any other method to handle this?

class MultiLabelSmoother(Callback):
    def __init__(self, epsilon=0.05):
        self.epsilon = epsilon

    def before_batch(self):
        self.learn.yb = (torch.where(self.y == 1., 1 - self.epsilon, self.epsilon),)
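One alternative that sidesteps the callback machinery entirely is to fold the smoothing into the loss function itself, so the batch labels are never modified. This is a sketch in plain PyTorch (the class name SmoothedBCELoss is made up for illustration, and `targets * (1 - 2 * epsilon) + epsilon` is just an algebraic rewrite of the torch.where expression above):

```python
import torch
import torch.nn as nn

class SmoothedBCELoss(nn.Module):
    """BCE-with-logits loss that smooths hard multi-label targets internally."""
    def __init__(self, epsilon=0.05):
        super().__init__()
        self.epsilon = epsilon
        self.bce = nn.BCEWithLogitsLoss()

    def forward(self, logits, targets):
        # Maps 1 -> 1 - epsilon and 0 -> epsilon, same as the callback's torch.where.
        smooth = targets * (1 - 2 * self.epsilon) + self.epsilon
        return self.bce(logits, smooth)
```

You could then pass an instance of this as the loss function when building the Learner, instead of plain nn.BCEWithLogitsLoss.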