Will there be a code walk-thru today?
Yes, there should be.
Thanks! I'm not asking to bother you, but because I'll need to stay up late if there is one (I'm from Germany).
I figured out how @Transform works for a function defined with this decorator. Let us take the example of the normalisation function that was used in code walkthrough 4.
def _norm(x,m,s): return (x-m)/s
norm_t = Transform(_norm)
I did a %%debug on norm_t = Transform(_norm). I found that the first thing the code was doing was calling __call__ of _TfmMeta, as Transform uses the metaclass _TfmMeta.
def __call__(cls, *args, **kwargs):
    f = args[0] if args else None
    n = getattr(f,'__name__',None)
    if not hasattr(cls,'encodes'): cls.encodes=TypeDispatch()
    if not hasattr(cls,'decodes'): cls.decodes=TypeDispatch()
    if isinstance(f,Callable) and n in ('decodes','encodes','_'):
        getattr(cls,'encodes' if n=='_' else n).add(f)
        return f
    return super().__call__(*args, **kwargs)
Here cls is the Transform class and the function _norm is in args; f becomes _norm, as args[0] is _norm. No encodes or decodes are set here, and n ('_norm') is not one of '_', 'encodes' or 'decodes'. So the last line of the function, super().__call__(*args, **kwargs), gets called.
I was surprised to see that this super().__call__() ends up invoking __init__ of Transform. But from here on the behaviour was predictable. If you follow the debugger snapshot pasted below, the flow of events is as follows:
- enc is set to _norm.
- init_enc is set to enc (which is now _norm).
- The existing encodes and decodes of the Transform class are deleted.
- encodes and decodes now become fresh TypeDispatch() instances.
- enc is added to encodes. Therefore _norm is now part of self.encodes, which allows _norm to be used as an encodes function.

The final output is:
Transform: False {'object': '_norm'} {}
ipdb> n
> /Users/i077725/Documents/GitHub/fastai_dev/dev/local/data/transform.py(140)__init__()
    138     filt,init_enc,as_item_force,as_item,order = None,False,None,True,0
    139     def __init__(self, enc=None, dec=None, filt=None, as_item=False):
--> 140         self.filt,self.as_item = ifnone(filt, self.filt),as_item
    141         self.init_enc = enc or dec
    142         if not self.init_enc: return
ipdb> enc
<function _norm at 0x12ceef830>
ipdb> dec
ipdb> filt
ipdb> as_item
False
ipdb> init_enc
*** NameError: name 'init_enc' is not defined
ipdb> n
> /Users/i077725/Documents/GitHub/fastai_dev/dev/local/data/transform.py(141)__init__()
    139     def __init__(self, enc=None, dec=None, filt=None, as_item=False):
    140         self.filt,self.as_item = ifnone(filt, self.filt),as_item
--> 141         self.init_enc = enc or dec
    142         if not self.init_enc: return
    143
ipdb> n
> /Users/i077725/Documents/GitHub/fastai_dev/dev/local/data/transform.py(142)__init__()
    140         self.filt,self.as_item = ifnone(filt, self.filt),as_item
    141         self.init_enc = enc or dec
--> 142         if not self.init_enc: return
    143
    144     # Passing enc/dec, so need to remove (base) class level enc/dec
ipdb> n
> /Users/i077725/Documents/GitHub/fastai_dev/dev/local/data/transform.py(145)__init__()
    143
    144     # Passing enc/dec, so need to remove (base) class level enc/dec
--> 145         del(self.__class__.encodes,self.__class__.decodes)
    146         self.encodes,self.decodes = (TypeDispatch(),TypeDispatch())
    147         if enc:
ipdb> self.__class__
<class 'local.data.transform.Transform'>
ipdb> self.__class__.encodes
{}
ipdb> n
> /Users/i077725/Documents/GitHub/fastai_dev/dev/local/data/transform.py(146)__init__()
    144     # Passing enc/dec, so need to remove (base) class level enc/dec
    145         del(self.__class__.encodes,self.__class__.decodes)
--> 146         self.encodes,self.decodes = (TypeDispatch(),TypeDispatch())
    147         if enc:
    148             self.encodes.add(enc)
ipdb> n
> /Users/i077725/Documents/GitHub/fastai_dev/dev/local/data/transform.py(147)__init__()
    145         del(self.__class__.encodes,self.__class__.decodes)
    146         self.encodes,self.decodes = (TypeDispatch(),TypeDispatch())
--> 147         if enc:
    148             self.encodes.add(enc)
    149             self.order = getattr(self.encodes,'order',self.order)
ipdb> n
> /Users/i077725/Documents/GitHub/fastai_dev/dev/local/data/transform.py(148)__init__()
    146         self.encodes,self.decodes = (TypeDispatch(),TypeDispatch())
    147         if enc:
--> 148             self.encodes.add(enc)
    149             self.order = getattr(self.encodes,'order',self.order)
    150         if dec: self.decodes.add(dec)
ipdb> n
> /Users/i077725/Documents/GitHub/fastai_dev/dev/local/data/transform.py(149)__init__()
    147         if enc:
    148             self.encodes.add(enc)
--> 149             self.order = getattr(self.encodes,'order',self.order)
    150         if dec: self.decodes.add(dec)
    151
ipdb> n
> /Users/i077725/Documents/GitHub/fastai_dev/dev/local/data/transform.py(150)__init__()
    148             self.encodes.add(enc)
    149             self.order = getattr(self.encodes,'order',self.order)
--> 150         if dec: self.decodes.add(dec)
    151
    152     @property
ipdb> n
--Return--
None
> /Users/i077725/Documents/GitHub/fastai_dev/dev/local/data/transform.py(150)__init__()
    148             self.encodes.add(enc)
    149             self.order = getattr(self.encodes,'order',self.order)
--> 150         if dec: self.decodes.add(dec)
    151
    152     @property
ipdb> n
--Return--
Transform: Fa...': '_norm'} {}
I am surprised at how super().__call__() delegates to __init__() of Transform, and how super() inside a method of _TfmMeta seems to refer to the Transform class. Any thoughts?
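For anyone following along, here is a minimal standalone experiment (plain Python with my own names, not fastai code) that isolates the mechanism: super() inside a metaclass method resolves to type, not to the class being constructed, and type.__call__(cls, ...) is what invokes cls.__new__ and then cls.__init__.

```python
class Meta(type):
    def __call__(cls, *args, **kwargs):
        # super() here is type (the base class of Meta), not Thing;
        # type.__call__(cls, ...) runs cls.__new__ and then cls.__init__
        print(f'Meta.__call__ for {cls.__name__}')
        return super().__call__(*args, **kwargs)

class Thing(metaclass=Meta):
    def __init__(self, x):
        print('Thing.__init__')
        self.x = x

t = Thing(42)   # prints Meta.__call__ for Thing, then Thing.__init__
```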
@jeremy and others, is there some sort of a playlist on YouTube for these walkthroughs that I can watch in sequence? I missed a couple and want to catch up on them. Thanks in advance (and thanks so much for doing these!)
Hi, all the youtube links are right here in this same thread/forum. Please scroll up and you can see them for all the walkthroughs.
Not a playlist, but they're all on YouTube:
@pnvijay @sairam6087 and anyone else wanting a playlist:
https://www.youtube.com/playlist?list=PLZqTTicEKsnG_DvtXKGp8iWHfECmivi7U
Jeremy’s playlist is on his channel
Note that it's not a subclass of _TfmMeta - but it uses _TfmMeta as its metaclass. Not the same thing!
I’ve added a playlist in my YouTube account to keep things simpler - although as mentioned the list is in the top post of this thread.
Read the Python Data Model docs about metaclasses, and see what it says about this. Let us know what you learn!
Thanks Jeremy! When I have time I plan on hitting "autoplay" to catch up.
Thanks Jeremy! Have made the correction in my post.
I have to head to NY tomorrow and get ready this afternoon, so there won’t be any code walk-thrus until Monday. Good chance for everyone to catch up!
I'm still trying to wrap my head around this and how to use it properly. I'm designing an AudioSpectrogram class that subclasses TensorImageBase (is this right, or should I be subclassing TensorImage?). Either way, they both inherit from, and are of type, BypassNewMeta. My questions are:

- Why exactly is the inheritance hierarchy structured this way? What are the benefits of BypassNewMeta?
- How do I override __init__() in my AudioSpectrogram class that extends TensorImageBase, given that it is of type BypassNewMeta?

I've tried button clicking to get things to do what I want. The closest I've come is to add the line _new_meta = __init__, which does call my init method, but ends with the error TypeError: __class__ assignment only supported for heap types or ModuleType subclasses.
With my limited understanding (please correct me if I say something off), I think that is where we want to use the create method.
Say you have something like:
a = AudioSpectrogram(...)
The behavior gets altered by BypassNewMeta.__call__(), so it doesn't go through AudioSpectrogram.__init__() the way it usually would without the metaclass.
Hence if you want to create an instance of AudioSpectrogram, you want to do:
a = AudioSpectrogram.create(...)
I am not really sure what the answers to these questions are. I have only started to learn the library, and it seems you are asking how things fit together at a higher level, i.e. what the architecture is.
It seems it allows us to pass in something, say a tensor, perform some initialization on it, and swap its class without allocating a new object. So if we have some Tensor, we can add additional behavior to it without having to recreate it as a new object. That's what it seems like from looking at the code, but I have not experimented with it, so I might be completely wrong.
I think it's best not to think about classes with the BypassNewMeta type as regular classes, meaning the whole notion of __new__ and __init__ can be foregone. We are not creating a new instance of cls (which is the point of __new__), and we are not configuring a newly created instance using __init__, since there is no newly created instance.
It's probably best to think about classes of these types as taking some object, running some configuration steps on it (whatever is defined in _new_meta), and then 'casting' that object to the new class by swapping out the __class__ reference.
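A rough sketch of that 'casting' idea as I understand it (_BypassMeta, Plain and Loud below are my own stand-in names, simplified from what BypassNewMeta appears to do, not the fastai implementation):

```python
class _BypassMeta(type):
    """Toy version: no fresh object, no __new__/__init__ on a new instance."""
    def __call__(cls, x, *args, **kwargs):
        if hasattr(cls, '_new_meta'):
            x = cls._new_meta(x, *args, **kwargs)  # run the setup hook
        x.__class__ = cls   # 'cast' the existing object, no new allocation
        return x

class Plain:
    def __init__(self, v): self.v = v

class Loud(Plain, metaclass=_BypassMeta):
    def _new_meta(x, *args, **kwargs):
        x.v = str(x.v)      # some configuration step before the cast
        return x
    def shout(self): return self.v + '!'

p = Plain(3)
q = Loud(p)          # intercepted by _BypassMeta.__call__
print(q is p)        # True: same object, now of class Loud
print(q.shout())     # 3!
```

This also hints at the TypeError mentioned above: __class__ assignment only works when both classes are ordinary heap types with compatible layouts, which is why the cast works for Plain/Loud here.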
It is probably important that TensorBase inherits from Tensor. So if your AudioSpectrogram is a Tensor, it probably could inherit from one of the Tensor classes (not sure which one would work best for your specific use case), and you can endow it with additional behavior by defining methods on the AudioSpectrogram class…
Well, sorry, just trying to be helpful, but not sure if this is accurate. Hopefully in a couple of days, possibly weeks, I will be able to say more.