Fastai v2 daily code walk-thrus

I see, thanks

More generally, fastai v2 tries to let you avoid using inheritance where possible, by allowing you to pass functions instead of overriding them (e.g. as we saw in DataLoader). This is both more friendly for new users (they don’t have to learn OO if they don’t need stateful behavior) and makes some code a bit simpler.
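To illustrate the idea (this is a hypothetical sketch, not fastai's actual `DataLoader` code — all names here are made up): instead of subclassing to override a method, you pass a plain function at construction time.

```python
# Hypothetical sketch of "pass a function instead of overriding a method".
# `SimpleLoader` and `create_batch` are illustrative names, not fastai's API.
class SimpleLoader:
    def __init__(self, items, create_batch=None):
        self.items = items
        # If no function is passed, fall back to the default behavior
        self.create_batch = create_batch or self.default_batch
    def default_batch(self, items): return list(items)
    def __iter__(self): yield self.create_batch(self.items)

# No subclass and no OO knowledge needed -- just pass a function
loader = SimpleLoader([1, 2, 3], create_batch=lambda xs: [x * 2 for x in xs])
print(next(iter(loader)))  # [2, 4, 6]
```

The stateless customization lives in a small function, while subclassing remains available for genuinely stateful behavior.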


Just realized I’m doing a talk on Monday, so there will be no walk-thru.


Today’s walk-thru was absolutely great! I learnt more about Python in the last 4 days than in the last 3 years :-). But I have a few doubts after listening to today’s walk-thru. My understanding was that if individual elements of a tuple need to be selectively transformed, we need @TupleTransform, whereas for a combined image-and-label tuple we can use @Transform itself. So why does @TupleTransform exist, and what is the difference between it and @Transform? Also, what code in @Transform allows a plain function to be applied as encodes even though encodes is not defined in that function? I am talking about the norm function used to normalise the image. We used the @Transform decorator to provide the subclassing for this function, but how is the norm function triggered as an encodes call here?

Great questions @pnvijay!

The answer to both of your questions lies in Pipeline. Pipeline has an as_item param, which will set as_item for all of its transforms. However, TupleTransform always uses as_item=False, even if it’s in a Pipeline with as_item=True. Have a look at the source code for TupleTransform and tell us what you find out… :slight_smile:

This is also how norm works when it’s just a function. When a plain function is in a pipeline, Pipeline converts it into a transform. The first param of the Transform constructor is a function to be used as encodes.
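To make that second point concrete, here is a deliberately minimal sketch of the idea (the class names `SimpleTransform`/`SimplePipeline` are made up for illustration; fastai's real `Pipeline` and `Transform` do considerably more):

```python
# Hypothetical sketch: a pipeline that converts plain functions into
# transforms, so everything in it exposes an `encodes` method.
class SimpleTransform:
    def __init__(self, enc):
        self.encodes = enc  # the constructor's first param becomes `encodes`

class SimplePipeline:
    def __init__(self, fns):
        # Anything without an `encodes` attribute is wrapped automatically
        self.tfms = [f if hasattr(f, 'encodes') else SimpleTransform(f)
                     for f in fns]
    def __call__(self, x):
        for t in self.tfms: x = t.encodes(x)
        return x

def norm(x): return x / 255  # a plain function, used directly in the pipeline

pipe = SimplePipeline([norm, lambda x: x + 1])
print(pipe(255.0))  # 2.0
```

So when `norm` is dropped into a pipeline as a bare function, it gets wrapped and its body serves as the `encodes` step.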


I have been looking at BypassNewMeta and what it does is very simple while at the same time completely mind blowing!

class BypassNewMeta(type):
    "Metaclass: casts `x` to this class, initializing with `_new_meta` if available"
    def __call__(cls, x, *args, **kwargs):
        if hasattr(cls, '_new_meta'): x = cls._new_meta(x, *args, **kwargs)
        if cls!=x.__class__: x.__class__ = cls
        return x

The straightforward behavior is that when we define __call__ on a class, we can call an instance of that class just like we would a function.
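For example (a tiny standalone demo, unrelated to fastai's code):

```python
class Adder:
    def __init__(self, n): self.n = n
    # Defining __call__ lets instances be invoked like functions
    def __call__(self, x): return x + self.n

add3 = Adder(3)
print(add3(10))  # 13 -- the instance is "called" as if it were a function
```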


So far so good. But apparently a class, being an instance of a metaclass, exhibits the same behavior:


One would think that Something() would create a new object of type Something, and that is indeed the default behavior. But you can override __call__ on the class (by redefining it on the metaclass that the class is an instance of), and it can do something else entirely rather than instantiating an object!
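Here is a minimal, self-contained demo of the same trick BypassNewMeta uses (the names `CastingMeta`/`Plain`/`Something` are made up for illustration): because a class is an instance of its metaclass, defining __call__ on the metaclass intercepts `Something(...)` and can skip normal instantiation entirely.

```python
# Metaclass whose __call__ retags an existing object instead of creating one
class CastingMeta(type):
    def __call__(cls, x):
        # Instead of constructing a new instance, just swap the class of `x`
        if cls != x.__class__: x.__class__ = cls
        return x

class Plain: pass

# Subclassing Plain keeps the memory layouts compatible for the
# `__class__` assignment above
class Something(Plain, metaclass=CastingMeta): pass

p = Plain()
s = Something(p)         # no new object is created...
print(s is p)            # True -- it's the very same object
print(type(s).__name__)  # Something -- but its class has been swapped
```

Note that defining the class itself still works normally: the metaclass's __call__ only fires when you "call" the class, i.e. `Something(p)`.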


Thanks for the response! I understand the motivation behind the design now. Very elegant!


Yes the metaclass system in Python is such a nice design! :slight_smile: It’s simple and elegant and powerful.


Hello Jeremy :slight_smile:
What will we be looking at today? It would help me to look at the code before the walk-thru.

I believe Jeremy said he had a prior engagement today and we wouldn’t be having a code walkthrough today but somebody should definitely confirm that.


I seem to recall he was giving a talk :slight_smile:

Yes, there is no code walk-thru today.


Going through the first code walk-thru, I noticed that in 07_vision_core there seem to be some import statement differences between my recent git clone of fastai/fastai_dev (from local.torch_basics import *) and what is in Jeremy’s video. Any reason?


It’s still under heavy development, so expect changes every day.


Will there be a code walk-thru today?

Yes, there should be.


Thanks! The reason I’m asking is not to bother you, but that I need to stay up late if there is one (I’m in Germany).


Today’s live stream:

Walk-thru 5 video added:


A post was merged into an existing topic: Fastai v2 code walk-thru 5