Context_manager "feature" mentioned in nbs/09_learner

Hi all,

I saw the comment in notebook 09_learner that “contextlib.context_manager has a surprising “feature” which doesn’t let us raise an exception before the yield” and that _CbCtxInner should fix this, so I tried to test it by implementing a simple CB that skips a particular batch, for example:

class SkipCB(Callback):
    order = Callback.order - 1
    def __init__(self, n): fc.store_attr()
    def before_batch(self, learner):
        if learner.iter == self.n: raise CancelBatchException()

From my testing this CB doesn’t work, because the CancelBatchException() is never handled.
From what I understand after some debugging, the CancelBatchException() is raised in the __enter__ method of self.cb_ctx('batch'), but it is first caught only by the __exit__ method of self.cb_ctx('epoch') and later by the __exit__ method of self.cb_ctx('fit'). The problem is that the line return exc_type==chk_exc in these methods always returns False, since CancelBatchException != CancelEpochException != CancelFitException, so the exception is never handled.
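This matches Python’s context-manager semantics, which a few self-contained lines can demonstrate (Ctx, CancelInner and run are made-up names for this sketch, not miniai code): when __enter__ raises, that context manager’s own __exit__ never runs, and only an enclosing context manager (or a plain try/except) ever sees the exception.

```python
class CancelInner(Exception): pass

class Ctx:
    def __init__(self, name, events, raise_on_enter=False):
        self.name, self.events, self.raise_on_enter = name, events, raise_on_enter
    def __enter__(self):
        # if this raises, Python never calls *this* object's __exit__
        if self.raise_on_enter: raise CancelInner()
        return self
    def __exit__(self, exc_type, exc_val, tb):
        self.events.append((self.name, exc_type))
        return False  # don't swallow anything

def run():
    events = []
    try:
        with Ctx('outer', events):
            with Ctx('inner', events, raise_on_enter=True):
                events.append('body')      # never reached
    except CancelInner:
        events.append('top-level catch')
    return events

# run() records only outer's __exit__ (seeing CancelInner) and the
# top-level catch; 'inner' and 'body' never appear.
```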

Does anyone else see the same behavior?


From this link, it seems it’s really not possible to skip the with body directly from __enter__.
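For the record, PEP 377, which would have let __enter__ skip the with body, was rejected, so the only way out is to decide before entering the block. A minimal sketch of that pattern (MaybeSkip and run are illustrative names, not miniai code):

```python
class MaybeSkip:
    def __init__(self, skip): self.skip = skip
    def __enter__(self): return self
    def __exit__(self, exc_type, exc_val, tb): return False

def run(skip):
    cm = MaybeSkip(skip)
    if not cm.skip:          # decide BEFORE entering; __enter__ itself cannot skip the body
        with cm:
            return 'body ran'
    return 'body skipped'
```

This is the same shape as the cm.skip check in the modified cb_ctx.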

So in order to fix that, I implemented some changes to _CbCtxInner:

class _CbCtxInner(AbstractContextManager):
    def __init__(self, outer, nm):
        self.outer, self.nm = outer, nm
        self.skip = False
        self.chk_exc = globals()[f'Cancel{self.nm.title()}Exception']
        try: self.outer.callback(f'before_{self.nm}')
        except self.chk_exc: self.skip = True
        if self.skip: self.cleanup()
    def __exit__(self, exc_type, exc_val, traceback):
        try:
            if not exc_type: self.outer.callback(f'after_{self.nm}')
            return exc_type==self.chk_exc
        except self.chk_exc: pass
        finally: self.cleanup()
    def cleanup(self): self.outer.callback(f'cleanup_{self.nm}')

And to Learner:

    def cb_ctx(self, nm):
        cm = _CbCtxInner(self, nm)
        if not cm.skip:
            with cm:
                getattr(self, f'{nm}_loop')()

    def batch_loop(self):
        ### predict, get_loss, update ###

    def epoch_loop(self):
        ### for batch in dl loop ###

    def fit_loop(self):
        ### for epoch in epochs loop ###

This makes CBs such as SkipCB possible and shouldn’t break compatibility (not tested) with anything else in miniai.
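To sanity-check the whole control flow, here is a self-contained toy version of the scheme (all names here, _CbCtxInnerDemo and SkipDemoLearner, are illustrative stand-ins for _CbCtxInner and Learner, and the callback bodies are mine): a cancel raised by the before_batch callback sets skip, the batch body never runs, and cleanup still fires.

```python
class CancelBatchException(Exception): pass
class CancelEpochException(Exception): pass

class _CbCtxInnerDemo:
    def __init__(self, outer, nm):
        self.outer, self.nm = outer, nm
        self.skip = False
        self.chk_exc = globals()[f'Cancel{nm.title()}Exception']
        try: self.outer.callback(f'before_{nm}')   # may raise a Cancel*Exception
        except self.chk_exc: self.skip = True
        if self.skip: self.cleanup()
    def __enter__(self): return self
    def __exit__(self, exc_type, exc_val, tb):
        try:
            if not exc_type: self.outer.callback(f'after_{self.nm}')
            return exc_type == self.chk_exc
        finally: self.cleanup()
    def cleanup(self): self.outer.callback(f'cleanup_{self.nm}')

class SkipDemoLearner:
    def __init__(self): self.events, self.iter = [], 0
    def callback(self, nm):
        self.events.append(nm)
        # play the role of SkipCB(1): cancel batch number 1
        if nm == 'before_batch' and self.iter == 1: raise CancelBatchException()
    def cb_ctx(self, nm):
        cm = _CbCtxInnerDemo(self, nm)
        if not cm.skip:
            with cm: getattr(self, f'{nm}_loop')()
    def batch_loop(self): self.events.append(f'work_{self.iter}')
    def epoch_loop(self):
        for i in range(3):
            self.iter = i
            self.cb_ctx('batch')

learner = SkipDemoLearner()
learner.cb_ctx('epoch')
# batch 1 never does its work, but its cleanup callback still fires
```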

If anyone would like to properly test it and help me submit a PR to the course repo, feel free to get in touch. :slight_smile:

Thanks for digging into this @pereira.lucas! Perhaps we should revert to using decorators instead of a context manager? Might that help avoid this complexity?

I’m not familiar enough with decorators and context managers to say with certainty that decorators are a better (easier to read and/or more efficient) route, but this is a solution I found with decorators after your suggestion, Jeremy:

def with_cb(nm):
    def cb_decorator(func):
        def cb_wrapper(learner):
            try:
                learner.callback(f'before_{nm}')
                func(learner)
                learner.callback(f'after_{nm}')
            except globals()[f'Cancel{nm.title()}Exception']: pass
            finally: learner.callback(f'cleanup_{nm}')
        return cb_wrapper
    return cb_decorator

We would still need to implement the [...]_loop functions to be decorated, but it does seem at least more concise than the solution with context managers.
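To sanity-check the decorator route, here is a self-contained toy (my reconstruction: the try block and the before/after callback calls are filled in by me, and ToyLearner is an illustrative stand-in for the real Learner) showing that a cancel exception raised in before_batch skips the wrapped work but still runs the cleanup callback:

```python
class CancelBatchException(Exception): pass

def with_cb(nm):
    def cb_decorator(func):
        def cb_wrapper(learner):
            try:
                learner.callback(f'before_{nm}')
                func(learner)
                learner.callback(f'after_{nm}')
            except globals()[f'Cancel{nm.title()}Exception']: pass
            finally: learner.callback(f'cleanup_{nm}')
        return cb_wrapper
    return cb_decorator

class ToyLearner:
    def __init__(self, cancel=False):
        self.cancel, self.events = cancel, []
    def callback(self, nm):
        self.events.append(nm)
        if self.cancel and nm == 'before_batch': raise CancelBatchException()
    @with_cb('batch')
    def batch_loop(self): self.events.append('work')
```

With cancel=True the events are just before_batch then cleanup_batch; with cancel=False the full before/work/after/cleanup sequence runs.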

I don’t remember seeing a CB implementation in course22p2 using decorators; when you say “revert”, are you referring to something in the course I missed, or to a previous implementation in fastai?

There was a previous implementation in the repo, but I suspect I switched to context managers before the first lesson.

The version I had looked much like what you have in your reply.

Cool! After some digging I found the first commit with the decorator implementation!


Jeremy, are you accepting PRs in the course repo, or is it just for the instructors?
I ask because I have never contributed to an open-source project, so it would be a great exercise for me, but as a professor I can totally understand if the repo is just for the instructors.

But if you are accepting PRs, do you think this change back to decorators would be an OK PR, or would you prefer to just check out earlier versions of the Learner implementation?

I am indeed accepting PRs, thank you for the offer! :smiley: Although in this case, I’d rather do some experiments myself to see what I can come up with, rather than merging a PR.
