I'm trying to implement a U-Net architecture for binary semantic segmentation, using PyTorch's weighted BCELoss as the loss function:
```python
learn = unet_learner(dls,
                     resnet34,
                     loss_func=BCELoss(weight=weights),
                     opt_func=ranger,
                     self_attention=True,
                     act_cls=Mish,
                     ).to_fp32()
```
but I get this error:

```
ValueError: Using a target size (torch.Size([32, 256, 256])) that is different to the input size (torch.Size([32, 2, 256, 256])) is deprecated. Please ensure they have the same size.
```
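For reference, the mismatch can be reproduced in plain PyTorch with the shapes from the error message (a minimal sketch, independent of fastai):

```python
import torch
import torch.nn as nn

# Shapes taken from the error message: the model outputs one channel
# per class (2 channels), while the target mask has no channel dimension.
preds = torch.rand(32, 2, 256, 256)                   # model output in [0, 1]
target = torch.randint(0, 2, (32, 256, 256)).float()  # binary mask

try:
    nn.BCELoss()(preds, target)
except ValueError as err:
    print(err)  # same "target size ... different to the input size" error
```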
How can I get rid of the extra dimension?
Does the fastai library provide a specific weighted loss for binary segmentation?