I am currently training a UNet for the Severstal Kaggle competition.
For the competition, I’ve chosen the Dice coefficient as the metric. When I create my learner and train it, the Dice coefficient is more often than not above 1.
The code is as follows:
```python
arch = models.resnet34
learn = unet_learner(data, arch, metrics=[dice])
lr_find(learn)
learn.recorder.plot()
lr = 1e-04
learn.fit_one_cycle(5, slice(lr), pct_start=0.9)
```
The output is:
|epoch|train_loss|valid_loss|dice |time |
|-----|----------|----------|--------|-----|
|0 |0.105920|0.100106 |0.556928|02:34|
|1 |0.100063|0.094888 |1.334559|02:30|
|2 |0.090685|0.077091 |1.150504|02:33|
|3 |0.085975|0.071968 |1.518277|02:31|
|4 |0.066519|0.061476 |1.759492|02:31|
For the learner I am using a single-channel mask with classes [0, 1, 2, 3, 4].
Why is this metric above 1? Does the Dice metric need to be refactored to account for multiple classes?
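For what it's worth, I suspect the issue is that a binary-style Dice formula (2·intersection/union computed via elementwise products and sums) is being applied directly to label-valued masks. Here is a minimal NumPy sketch of that formula, just to illustrate the failure mode (this is an assumption about the metric's internals, not a copy of the library implementation):

```python
import numpy as np

# Binary-style Dice: 2*|A∩B| / (|A| + |B|), implemented via
# elementwise product and sum, as is common for 0/1 masks.
def binary_style_dice(pred, targ):
    intersect = (pred * targ).sum()
    union = (pred + targ).sum()
    return 2.0 * intersect / union

# With 0/1 masks the score stays in [0, 1] as expected:
pred_bin = np.array([1, 1, 0, 0])
targ_bin = np.array([1, 0, 0, 0])
print(binary_style_dice(pred_bin, targ_bin))  # 2*1 / 3 ≈ 0.667

# With label-valued masks (e.g. class IDs 0..4) the products blow up:
# a correct pixel of class 4 contributes 4*4 = 16 to the intersection
# but only 4 + 4 = 8 to the union, so the ratio can exceed 1.
pred_lbl = np.array([4, 4, 0])
targ_lbl = np.array([4, 4, 0])
print(binary_style_dice(pred_lbl, targ_lbl))  # 2*32 / 16 = 4.0
```

If this is what is happening, the fix would be to one-hot encode (or binarize per class) before applying the formula, or to use a Dice implementation written for multi-class segmentation.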