F1 Score as metric


#21

Didn’t have time to check that issue yet, so I’m not sure.


(Azarudeen) #22

@wyquek I tried to use Fbeta_binary, but I got the error below:

NameError: name 'clas' is not defined


(魏璎珞) #23

Try this:

from dataclasses import dataclass
from fastai.text import *   # fastai v1; provides Callback, accuracy, text_classifier_learner, ...

@dataclass
class Fbeta_binary(Callback):
    "Computes the fbeta between preds and targets for single-label classification"
    beta2: int = 2      # the beta in F-beta (1 gives F1); squared in on_epoch_end
    eps: float = 1e-9   # guards against division by zero
    clas: int = 1       # which label to treat as the positive class

    def on_epoch_begin(self, **kwargs):
        # reset the running counts at the start of each epoch
        self.TP = 0
        self.total_y_pred = 0
        self.total_y_true = 0

    def on_batch_end(self, last_output, last_target, **kwargs):
        y_pred = last_output.argmax(dim=1)
        y_true = last_target.float()

        # accumulate true positives, predicted positives and actual positives for `clas`
        self.TP += ((y_pred == self.clas) * (y_true == self.clas)).float().sum()
        self.total_y_pred += (y_pred == self.clas).float().sum()
        self.total_y_true += (y_true == self.clas).float().sum()

    def on_epoch_end(self, **kwargs):
        # precision and recall over the whole epoch, combined into F-beta
        beta2 = self.beta2 ** 2
        prec = self.TP / (self.total_y_pred + self.eps)
        rec = self.TP / (self.total_y_true + self.eps)
        res = (prec * rec) / (prec * beta2 + rec + self.eps) * (1 + beta2)
        self.metric = res
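
For reference, on_epoch_end is just computing the standard F-beta score for the chosen class over the whole epoch (the eps terms only guard against division by zero):

$$F_\beta = (1+\beta^2)\,\frac{\mathrm{precision}\cdot\mathrm{recall}}{\beta^2\cdot\mathrm{precision}+\mathrm{recall}}$$

so passing beta2=1 gives the ordinary F1 score.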

If you want F1 for label 1

learn = text_classifier_learner(data_clas, drop_mult=0.5)
learn.load_encoder('fine_tuned_enc')
learn.metrics = [accuracy, Fbeta_binary(beta2=1, clas=1)]

OR if you want F1 for label 0

learn = text_classifier_learner(data_clas, drop_mult=0.5)
learn.load_encoder('fine_tuned_enc')
learn.metrics = [accuracy, Fbeta_binary(beta2=1, clas=0)]

OR if you want F1 for both label 1 and label 0

learn = text_classifier_learner(data_clas, drop_mult=0.5)
learn.load_encoder('fine_tuned_enc')
f1_label1 = Fbeta_binary(beta2=1, clas=1)
f1_label0 = Fbeta_binary(beta2=1, clas=0)
learn.metrics = [accuracy, f1_label1, f1_label0]
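
If you want to sanity-check the numbers, here's a minimal sketch (my own, not from the notebook linked below; it assumes torch and scikit-learn are installed and the Fbeta_binary class above is defined) that feeds the callback a few fake batches by hand and compares the result with sklearn's fbeta_score:

import torch
from sklearn.metrics import fbeta_score

torch.manual_seed(0)
logits  = torch.randn(100, 2)            # fake model outputs for 100 samples, 2 classes
targets = torch.randint(0, 2, (100,))    # fake binary labels

f1 = Fbeta_binary(beta2=1, clas=1)
f1.on_epoch_begin()
for i in range(0, 100, 20):              # pretend the data arrives in batches of 20
    f1.on_batch_end(logits[i:i+20], targets[i:i+20])
f1.on_epoch_end()

preds = logits.argmax(dim=1)
print(float(f1.metric))                                                   # callback's F1
print(fbeta_score(targets.numpy(), preds.numpy(), beta=1, pos_label=1))   # sklearn's F1

The two numbers should agree up to the tiny eps used in the callback.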

Here’s a notebook example.

I think there are lots of metrics like these, mentioned in this PR, that forum members could help fastai build, but they most probably have to be written as callbacks, since a score like F1 isn't a simple per-batch average and the counts have to be accumulated over the whole epoch.


(Azarudeen) #24

It worked. Thank you so much :slight_smile: