Basic metrics: Is F1 score the same as Fbeta in fastai?

Hi! I would really like to clarify the following: sklearn has a metric called f1_score. It seems to me that this metric is the same as fastai's fbeta if the beta parameter is set to 1. This metric can be used both for binary (e.g. cats vs. dogs) and for multi-class classification problems (e.g. cats vs. dogs vs. parrots), since fastai takes care of the one-hot encoding in the background. However, for multi-label problems (e.g. a satellite image which contains both forest and sea), we should instead use MultiLabelFbeta. Is what I am saying correct?
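For intuition, it is easy to check in pure Python (no fastai or sklearn needed) that the general F-beta formula reduces to plain F1 when beta=1; the precision/recall values below are made up just for illustration:

```python
def fbeta(precision, recall, beta):
    """General F-beta: weighted harmonic mean of precision and recall."""
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

def f1(precision, recall):
    """Plain F1: harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

p, r = 0.8, 0.6  # made-up precision/recall for one class
assert abs(fbeta(p, r, beta=1) - f1(p, r)) < 1e-12
```

So any Fbeta implementation with beta=1 should agree with sklearn's f1_score, up to how each library averages over classes.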

I believe what you are saying is correct. You should be able to test it against sklearn's f1_score.


Interestingly, I get an error after simply initializing the class. Is this a fastai bug? Would you mind trying it @KevinB? I am not doing anything special yet.

I will look into it and see if I have an issue. I haven't used MultiLabelFbeta, so I'm not sure.

I have used Fbeta once; it cannot be used for multi-class (single-label) classification in fastai, because in that case the labels are just label-encoded, not one-hot encoded. In multi-label classification, on the other hand, the labels are one-hot encoded, so the Fbeta function can be used there. I am not sure why there is a separate function called MultiLabelFbeta.
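To illustrate the difference between the two encodings (a minimal sketch, not fastai code): single-label targets are stored as class indices, while multi-label targets are multi-hot vectors, so a metric that expects one-hot targets would need the indices expanded first:

```python
def one_hot(label, n_classes):
    """Expand a label-encoded class index into a one-hot vector."""
    v = [0] * n_classes
    v[label] = 1
    return v

# Single-label (multi-class) targets: one class index per sample
labels = [0, 2, 1]
one_hot_labels = [one_hot(l, 3) for l in labels]
print(one_hot_labels)  # [[1, 0, 0], [0, 0, 1], [0, 1, 0]]

# Multi-label targets are already multi-hot: a sample can have several 1s
multilabel_target = [1, 0, 1]  # e.g. both "forest" and "sea" present
```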

@mgloria, could you do something like this:

F1 = partial(MultiLabelFbeta, beta=1, average="macro")

then try feeding that F1 into the metrics?

I’m using this as my starting point on this solution: https://medium.com/@hiromi_suenaga/deep-learning-2-part-1-lesson-4-2048a26d58aa

Let me know if that doesn’t work (actually let me know either way :slight_smile: )


MultiLabelFbeta is a LearnerCallback; you use it by filling in the arguments you want with partial, like this: f1 = partial(MultiLabelFbeta, beta=1, average="macro"), and then pass it to the learner like this: learn = cnn_learner(data, models.resnet34, ..., callback_fns=[f1]) if you use a cnn_learner.
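What callback_fns does with that partial can be sketched outside fastai. The stub class below is hypothetical (it only mimics the signature shape of the real callback); the point is that partial pre-binds the keyword arguments, and the Learner later finishes the instantiation by calling the partial with itself:

```python
from functools import partial

class MultiLabelFbetaStub:
    """Hypothetical stand-in for the MultiLabelFbeta callback, showing how
    partial pre-binds keyword arguments before the Learner instantiates it."""
    def __init__(self, learn, beta=2, average="micro"):
        self.learn, self.beta, self.average = learn, beta, average

f1 = partial(MultiLabelFbetaStub, beta=1, average="macro")
cb = f1(learn=None)  # roughly what callback_fns does: cb_fn(self)
print(cb.beta, cb.average)  # 1 macro
```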


Does it work to pass it into metrics? I don’t have code to test this at the moment, but I assumed it would still work with metrics=[F1]

Hey @KevinB, I had a second to try your proposal and so far no luck. If I understood correctly, you proposed the following:

TypeError                                 Traceback (most recent call last)

in
      1 learn.fit_one_cycle(1, callbacks=[SaveModelCallback(learn, every='improvement', name='test'),
----> 2                                   EarlyStoppingCallback(learn, min_delta=0.01)])

/opt/anaconda3/lib/python3.7/site-packages/fastai/train.py in fit_one_cycle(learn, cyc_len, max_lr, moms, div_factor, pct_start, final_div, wd, callbacks, tot_epochs, start_epoch)
     20     callbacks.append(OneCycleScheduler(learn, max_lr, moms=moms, div_factor=div_factor, pct_start=pct_start,
     21                                        final_div=final_div, tot_epochs=tot_epochs, start_epoch=start_epoch))
---> 22     learn.fit(cyc_len, max_lr, wd=wd, callbacks=callbacks)
     23
     24 def lr_find(learn:Learner, start_lr:Floats=1e-7, end_lr:Floats=10, num_it:int=100, stop_div:bool=True, wd:float=None):

/opt/anaconda3/lib/python3.7/site-packages/fastai/basic_train.py in fit(self, epochs, lr, wd, callbacks)
    198         callbacks = [cb(self) for cb in self.callback_fns + listify(defaults.extra_callback_fns)] + listify(callbacks)
    199         if defaults.extra_callbacks is not None: callbacks += defaults.extra_callbacks
--> 200         fit(epochs, self, metrics=self.metrics, callbacks=self.callbacks+callbacks)
    201
    202     def create_opt(self, lr:Floats, wd:Floats=0.)->None:

/opt/anaconda3/lib/python3.7/site-packages/fastai/basic_train.py in fit(epochs, learn, callbacks, metrics)
    104         if not cb_handler.skip_validate and not learn.data.empty_val:
    105             val_loss = validate(learn.model, learn.data.valid_dl, loss_func=learn.loss_func,
--> 106                                 cb_handler=cb_handler, pbar=pbar)
    107         else: val_loss=None
    108         if cb_handler.on_epoch_end(val_loss): break

/opt/anaconda3/lib/python3.7/site-packages/fastai/basic_train.py in validate(model, dl, loss_func, cb_handler, pbar, average, n_batch)
     61         if not is_listy(yb): yb = [yb]
     62         nums.append(first_el(yb).shape[0])
---> 63         if cb_handler and cb_handler.on_batch_end(val_losses[-1]): break
     64         if n_batch and (len(nums)>=n_batch): break
     65     nums = np.array(nums, dtype=np.float32)

/opt/anaconda3/lib/python3.7/site-packages/fastai/callback.py in on_batch_end(self, loss)
    306         "Handle end of processing one batch with loss."
    307         self.state_dict['last_loss'] = loss
--> 308         self('batch_end', call_mets = not self.state_dict['train'])
    309         if self.state_dict['train']:
    310             self.state_dict['iteration'] += 1

/opt/anaconda3/lib/python3.7/site-packages/fastai/callback.py in __call__(self, cb_name, call_mets, **kwargs)
    248         "Call through to all of the CallbakHandler functions."
    249         if call_mets:
--> 250             for met in self.metrics: self._call_and_update(met, cb_name, **kwargs)
    251         for cb in self.callbacks: self._call_and_update(cb, cb_name, **kwargs)
    252

/opt/anaconda3/lib/python3.7/site-packages/fastai/callback.py in _call_and_update(self, cb, cb_name, **kwargs)
    239     def _call_and_update(self, cb, cb_name, **kwargs)->None:
    240         "Call cb_name on cb and update the inner state."
--> 241         new = ifnone(getattr(cb, f'on_{cb_name}')(**self.state_dict, **kwargs), dict())
    242         for k,v in new.items():
    243             if k not in self.state_dict:

/opt/anaconda3/lib/python3.7/site-packages/fastai/callback.py in on_batch_end(self, last_output, last_target, **kwargs)
    342         if not is_listy(last_target): last_target=[last_target]
    343         self.count += first_el(last_target).size(0)
--> 344         val = self.func(last_output, *last_target)
    345         if self.world:
    346             val = val.clone()

TypeError: __init__() got multiple values for argument 'beta'
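That TypeError can be reproduced outside fastai. The traceback shows the partial was passed to metrics= rather than callback_fns=, so fastai calls it as self.func(last_output, *last_target), i.e. with two positional arguments; the second positional argument lands on beta and collides with the beta already bound by partial. A minimal sketch with a hypothetical stand-in class:

```python
from functools import partial

class MultiLabelFbetaLike:
    """Hypothetical class with the same signature shape as the callback."""
    def __init__(self, learn, beta=2, average="micro"):
        self.learn, self.beta, self.average = learn, beta, average

f1 = partial(MultiLabelFbetaLike, beta=1, average="macro")

f1(learn=None)  # as a callback_fn the Learner calls f1(self): fine

try:
    # as a metric, fastai calls self.func(last_output, *last_target):
    # the second positional argument maps onto beta, colliding with the
    # beta=1 already bound by partial
    f1("last_output", "last_target")
except TypeError as e:
    print(e)  # ... got multiple values for argument 'beta'
```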

Hi @j.laute I tried it out but I get an error too:


You have to put it in callback_fns when creating the learner, ie learn = cnn_learner(<your other parameters.... >, callback_fns=[F1])

I tried it too, @j.laute (apologies for not mentioning it before), but I also get an error:



Then the mistake is probably somewhere else; LearnerCallbacks are supposed to be used with callback_fns. If you can, put together a Google Colab notebook that reproduces the error and I can take a look :slight_smile:

Weird… the model trains fine if F1 etc. is removed; I just wanted to learn how to add these metrics because I think they are quite handy. Dear @j.laute, I just created the Colab; you can find it here with the error reproduced on a subset of the data. Let me know if you can edit it, and thanks again for looking into it!
Just in case the link above does not work: https://colab.research.google.com/drive/1arCpt9TcOwfFufLK9XitYJh98YGWqctX

@j.laute are you at EuroPython by any chance? I just thought with a bit of luck we may even coincide in person. :wink:

@mgloria unfortunately I’m not there :frowning: Enjoy it though!

I had the same issue and this worked for me on fastai v1.60:
F1 = MultiLabelFbeta(beta=2, average="macro")
learn = cnn_learner(data, arch, metrics=F1)

@mgloria and @j.laute, did you figure out how to use MultiLabelFbeta?

One thing I found in the documentation is that MultiLabelFbeta is a class and needs to be instantiated before use.

So, we can’t use partial with it.

Hope you can share details about it.

Is this issue resolved?