Using a target size that is different to the input size

I’m using a loss function available in PyTorch; however, I get the following warning. I’m not sure what I would need to change to ensure the two sizes match, and I can’t find any reference to this error in fastai that proposes a solution.

/home/bob/anaconda3/envs/fastai/lib/python3.7/site-packages/fastai/basic_train.py:30: UserWarning: Using a target size (torch.Size([32])) that is different to the input size (torch.Size([32, 1])). This will likely lead to incorrect results due to broadcasting. Please ensure they have the same size.
loss = loss_func(out, *yb)
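If I understand the warning correctly, the problem is that broadcasting a [32] target against a [32, 1] input silently produces a [32, 32] result, so every prediction gets compared against every target instead of its own. A minimal check in plain PyTorch, just to illustrate (not my training code):

import torch

target = torch.zeros(32)       # shape [32], e.g. the labels from the data loader
output = torch.zeros(32, 1)    # shape [32, 1], e.g. the model head's output

# broadcasting expands the pair to [32, 32] instead of an elementwise comparison
print((output - target).shape)  # torch.Size([32, 32])

My actual setup is: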

from functools import partial

import torch.nn.functional as F

# smooth_l1_loss is PyTorch's Huber loss
huber_loss = partial(F.smooth_l1_loss)

learn = cnn_learner(data, 
                    models.resnet152,
                    loss_func=huber_loss)

I assume your problem is solved by now. For everyone else running into the same warning, here’s my take on debugging:
Create your own loss function that does something with the tensors (like printing them to a file) before returning the loss. My approach looks like this (using MSELoss as an example, but it should be applicable universally):

import torch.nn as nn

lossfile = "lossfile.txt"
# truncate the log file at the start of the run
with open(lossfile, 'w') as f:
    print('', file=f)

def fake_MSE_Loss(size_average=None, reduce=None, reduction: str = 'mean'):
    def fake_loss(tens1, tens2):
        # log the shape and contents of both tensors before the loss is computed
        with open(lossfile, 'a+') as f:
            print("_______________", file=f)
            print("tens1: " + str(tens1.size()) + "; " + str(tens1), file=f)
            print("tens2: " + str(tens2.size()) + "; " + str(tens2) + "\n", file=f)
        # hand the actual computation to the real MSE loss
        return nn.MSELoss(size_average, reduce, reduction)(tens1, tens2)
    return fake_loss

learn = cnn_learner(data, models.resnet34, loss_func = fake_MSE_Loss())
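Once the log shows which tensor carries the extra dimension, the usual follow-up is to make the shapes match before the loss sees them, either by flattening the prediction or by unsqueezing the target. Here is a sketch of such a wrapper for the original Huber-loss setup; the name flat_huber_loss and the assumption that the model output is [batch, 1] while the target is [batch] are mine, so adjust to whatever your printout shows:

import torch.nn.functional as F

def flat_huber_loss(pred, targ):
    # assumed shapes: pred is [batch, 1] from the model head, targ is [batch];
    # dropping the trailing dimension aligns them and silences the warning
    return F.smooth_l1_loss(pred.squeeze(-1), targ)

learn = cnn_learner(data, models.resnet152, loss_func=flat_huber_loss)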