How do callbacks return things?

Here is the standard form of a callback handler that loops through its callbacks:

class CallbackHandler():
    def __init__(self, cbs=None):
        self.cbs = cbs if cbs else []

    def begin_fit(self, learn):
        self.learn,self.in_train = learn,True
        learn.stop = False
        res = True
        # every callback runs; a single False makes res False
        for cb in self.cbs: res = res and cb.begin_fit(learn)
        return res

    def after_fit(self):
        res = not self.in_train
        for cb in self.cbs: res = res and cb.after_fit()
        return res

    def begin_epoch(self, epoch):
        self.learn.model.train()   # put the model into training mode
        self.in_train = True
        res = True
        for cb in self.cbs: res = res and cb.begin_epoch(epoch)
        return res

    def begin_validate(self):
        self.learn.model.eval()    # put the model into eval mode
        self.in_train = False
        res = True
        for cb in self.cbs: res = res and cb.begin_validate()
        return res
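
For reference, the fit loop consumes those booleans roughly like this (my own simplified sketch, not the exact notebook code, so the details of the inner loops are only indicative):

def fit(epochs, learn, cb):
    # Each stage runs only if every callback agreed, i.e. the handler returned True
    if not cb.begin_fit(learn): return
    for epoch in range(epochs):
        if not cb.begin_epoch(epoch): continue   # a callback can skip the epoch
        for xb, yb in learn.data.train_dl:
            ...                                  # forward/backward on this batch
        if cb.begin_validate():
            ...                                  # validation loop
    cb.after_fit()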

We see that at each point it is only able to return ‘res’, a boolean. If a callback returned an object instead, that would not work, since the handler expects a boolean for the AND with ‘res’. However, in cases where we do

    xb, yb = cb_handler.on_batch_begin(xb, yb)

and

    out = cb_handler.on_loss_begin(out)

and

    loss, skip_backward = callbacks.on_loss_begin(loss)

we are getting back objects returned by the callbacks, two of them in the last case. How are we able to do this?

It sounds like you are showing code from two different versions, as the on_batch_begin method does not even exist in CallbackHandler.

I’d recommend reading up on this: https://docs.fast.ai/callback. It’s a pretty thorough guide. The short answer is that each callback event fires at a specific point in the training loop, before or after something happens, as a step-wise process: one step feeds into the next, passing values from one moment to the next. That’s how and why we can return things, as at those points the handler EXPECTS values to be returned.
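
To make that concrete, here is a toy sketch (not the actual fastai source; only the method name mimics the v1 API) of a handler method that is written to return objects instead of a boolean:

class SimpleHandler():
    # Toy handler for illustration only
    def __init__(self, cbs=None): self.cbs = cbs if cbs else []

    def on_batch_begin(self, xb, yb):
        # Each callback may transform the batch; whatever it returns
        # replaces (xb, yb) for the next callback and, finally, for the model
        for cb in self.cbs:
            res = cb.on_batch_begin(xb, yb)
            if res is not None: xb, yb = res
        return xb, yb

So such a handler is simply written to collect and pass along whatever its callbacks hand back, rather than and-ing booleans together.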

One example is the mixup callback, seen here: https://github.com/fastai/fastai/blob/master/fastai/callbacks/mixup.py#L6

on_batch_begin expects that, when it is done, you return the x’s and y’s that will then be passed through the model.
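
Very roughly, a mixup-style on_batch_begin does something like this toy version (the real MixUpCallback at the link is more involved and returns its updates in fastai’s own kwargs format; ToyMixup and its details here are made up for illustration):

import torch

class ToyMixup():
    # Toy mixup-style callback, not the fastai MixUpCallback
    def __init__(self, alpha=0.4): self.alpha = alpha

    def on_batch_begin(self, xb, yb):
        # Mix every example with a randomly chosen partner from the same batch
        lam = torch.distributions.Beta(self.alpha, self.alpha).sample()
        shuffle = torch.randperm(xb.size(0))
        new_xb = lam * xb + (1 - lam) * xb[shuffle]
        # Mix the targets the same way (assumes float/one-hot targets for simplicity)
        new_yb = lam * yb + (1 - lam) * yb[shuffle]
        return new_xb, new_yb

The new x’s and y’s it returns are what the handler then feeds to the model and the loss function.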

Does this help?