The class itself gets created at the moment the class statement runs, i.e. when its name, bases, and namespace are passed to the metaclass (not when an object of the class is instantiated).
The interesting thing is that by default the __init__ of parent classes never gets called, only the __init__ of the lowest class in the hierarchy!
It works the same for classes without an explicitly defined metaclass (they use type by default).
In some sense, this is probably unsurprising. That is why in PyTorch, when we were inheriting from nn.Module, we had to do the following:

```python
class Model(nn.Module):
    def __init__(self):
        super().__init__()
```
It is quite nice to peek behind the curtain though.
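A tiny demonstration of that default (toy Parent/Child classes, made up for illustration):

```python
class Parent:
    def __init__(self):
        self.ready = True

class Child(Parent):
    def __init__(self):  # note: no super().__init__() call
        pass

c = Child()
# Parent.__init__ never ran, so the attribute was never set
assert not hasattr(c, 'ready')
```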
More interestingly, when we call super(), the current instance is passed on to the __init__ higher up in the hierarchy! This is contrary to the idea I held in my mind before: that with a class hierarchy, first Parent.__new__ gets called, it passes the instance to Parent.__init__, and then we repeat the process for the lower class (which wouldn’t make sense, now that I think about it).
Even though __new__ and __init__ may live on the same class, they are two separate streams of actions! First we follow the hierarchy to create an instance (__new__), and afterwards we follow the hierarchy to initialize the instance we got in the earlier step (__init__).
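You can see the two streams with a small experiment (toy classes with prints added to trace the calls):

```python
class Parent:
    def __new__(cls, *args, **kwargs):
        print("Parent.__new__")
        return super().__new__(cls)

    def __init__(self):
        print("Parent.__init__")

class Child(Parent):
    def __new__(cls, *args, **kwargs):
        print("Child.__new__")
        return super().__new__(cls)

    def __init__(self):
        print("Child.__init__")
        super().__init__()

Child()
# prints, in order:
# Child.__new__
# Parent.__new__
# Child.__init__
# Parent.__init__
```

The whole __new__ chain finishes before the __init__ chain even starts.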
I was trying to create a DataLoader for an image-to-image model. Both the input and the output are separate images, which I would prefer to see side by side. Should I use matplotlib to plot the images, or is there an easier way in the library?
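In case it helps, a minimal plain-matplotlib sketch for showing a pair side by side (no fastai helpers assumed; show_pair is a made-up name, and the random arrays just stand in for real images):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; not needed in a notebook
import matplotlib.pyplot as plt
import numpy as np

def show_pair(inp, out):
    "Show an input/output image pair side by side."
    fig, axes = plt.subplots(1, 2, figsize=(8, 4))
    for ax, img, title in zip(axes, (inp, out), ("input", "output")):
        ax.imshow(img)
        ax.set_title(title)
        ax.axis("off")
    return fig

fig = show_pair(np.random.rand(64, 64, 3), np.random.rand(64, 64, 3))
```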
These walkthroughs are a very good idea to understand the library!
Question about today’s walkthrough (it is the first one I attended, so I am not sure if this was explained previously): what is function dispatch in Python?
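Not a fastai-specific answer, but the standard library has a simple form of it: functools.singledispatch picks which implementation to run based on the type of the first argument (fastai v2 builds its own, more general type dispatch on the same idea):

```python
from functools import singledispatch

@singledispatch
def describe(x):
    return "something else"  # fallback for unregistered types

@describe.register
def _(x: int):
    return "an int"

@describe.register
def _(x: list):
    return "a list"

print(describe(3))       # an int
print(describe([1, 2]))  # a list
print(describe("hi"))    # something else
```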
Someone correct me if I’m wrong: regarding the Transform class, we tag our functions with the @Transform decorator to make them compatible with each other, letting data flow through the encode/decode steps in a pipeline.
Optionally, we can inherit from the Transform class to give specific functions some state (to use when encoding/decoding, such as storing the mean and std for Normalization).
This is the same way I understood it. Of course, no one stops you from using the class even if you don’t have state, but that would probably not be the best way to do it.
More generally, fastai v2 tries to let you avoid using inheritance where possible, by allowing you to pass functions instead of overriding them (e.g. as we saw in DataLoader). This is both more friendly for new users (they don’t have to learn OO if they don’t need stateful behavior) and makes some code a bit simpler.
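A toy illustration of that design choice (hypothetical classes, nothing from the actual fastai API):

```python
# Inheritance style: even a stateless customization forces a subclass
class Loader:
    def process(self, item):
        return item

class UpperLoader(Loader):
    def process(self, item):
        return item.upper()

# Function-passing style: just hand over a callable
class FnLoader:
    def __init__(self, process=None):
        self.process = process if process is not None else (lambda x: x)

assert UpperLoader().process("hi") == "HI"
assert FnLoader(process=str.upper).process("hi") == "HI"
```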
Today’s walkthrough was absolutely great! I learnt more about Python in the last 4 days than in the last 3 years :-). But I have a few doubts after listening to today’s walkthrough. My understanding was that if individual elements of a tuple need to be selectively transformed, we need @TupleTransform, whereas for a combined image-and-label tuple we could use @Transform itself. So why is @TupleTransform there, and what is the difference between it and @Transform? Also, what is the code in @Transform that allows a plain function to be applied as an encodes, even though encodes is not defined in that function? I am talking about the norm function used to normalise the image. We used the @Transform decorator to provide the subclassing for this function, but how is norm triggered as the encodes call here?
The answer to both of your questions lies in Pipeline. Pipeline has an as_item param, which will set as_item for all of its transforms. However, TupleTransform always uses as_item=False, even if it’s in a Pipeline with as_item=True. Have a look at the source code for TupleTransform and tell us what you find out…
This is also how norm works when it’s just a function. When a plain function is in a pipeline, Pipeline converts it into a transform. The first param of the Transform constructor is a function to be used as encodes.
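A stripped-down sketch of that mechanism (toy classes, not fastai’s actual implementation):

```python
class Transform:
    def __init__(self, enc=None):
        # A plain function passed in becomes this instance's encodes
        if enc is not None:
            self.encodes = enc

    def encodes(self, x):
        return x  # identity by default

    def __call__(self, x):
        return self.encodes(x)

class Pipeline:
    def __init__(self, fs):
        # Plain functions get wrapped into Transforms on the way in
        self.tfms = [f if isinstance(f, Transform) else Transform(f) for f in fs]

    def __call__(self, x):
        for t in self.tfms:
            x = t(x)
        return x

pipe = Pipeline([lambda x: x * 2, Transform(lambda x: x + 1)])
assert pipe(3) == 7  # (3 * 2) + 1
```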
I have been looking at BypassNewMeta, and what it does is very simple while at the same time completely mind-blowing!
```python
class BypassNewMeta(type):
    "Metaclass: casts `x` to this class, initializing with `_new_meta` if available"
    def __call__(cls, x, *args, **kwargs):
        if hasattr(cls, '_new_meta'): x = cls._new_meta(x, *args, **kwargs)
        if cls != x.__class__: x.__class__ = cls
        return x
```
The straightforward behavior is that when we define __call__ on a class, we can call an instance of that class as if it were a function.
So far so good. But apparently a class, being an instance of a metaclass, exhibits the same behavior:
One would think that Something() would create a new object of type Something, and this is indeed the default behavior. But you can override __call__ on the class (by redefining it on the metaclass that the class is an instance of), and it can do something else entirely rather than instantiating an object!
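Here is the trick in action (A and B are hypothetical classes; BypassNewMeta is repeated from above so the snippet is self-contained):

```python
class BypassNewMeta(type):
    "Metaclass: casts `x` to this class, initializing with `_new_meta` if available"
    def __call__(cls, x, *args, **kwargs):
        if hasattr(cls, '_new_meta'): x = cls._new_meta(x, *args, **kwargs)
        if cls != x.__class__: x.__class__ = cls
        return x

class A:
    pass

class B(metaclass=BypassNewMeta):
    def hello(self):
        return "hi from B"

a = A()
b = B(a)                         # "calling" B does not build a new object...
assert b is a                    # ...it returns the very same instance,
assert type(a) is B              # just rebranded as a B,
assert a.hello() == "hi from B"  # with B's methods now available
```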