Common decorators in fastai2 source code

Fastai2 source code was very scary to me (it still is) when I started reading it. I kept seeing decorators everywhere and wondered what they were, so I decided to write an explanation of some of the decorators you'll see scattered throughout the fastai2 source code.
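All the examples below assume a few imports; the module paths shown are from recent fastcore (older releases exposed most of these from fastcore.foundation), so treat them as a sketch:

import inspect
from fastcore.foundation import L
from fastcore.basics import patch
from fastcore.meta import use_kwargs_dict, use_kwargs, funcs_kwargs, method, delegates
from fastcore.dispatch import typedispatch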

  • @patch: This adds extra functionality to any given Python type. For example, let's assume you want the L type to be given a new method mult; here's how you would go about it.
@patch
def mult(x:L, a):
    '''We want an L object to have the method `mult`, so we just patch
    it onto L.
    NB: @patch does not work on Python's built-in types.
    '''
    # `a` is the value we pass in when calling the method
    # `x` is the instance of the patched type which we want to operate on
    return x*a

L(1, 2).mult(2)

[OUTPUT] (#4) [1,2,1,2]

NB: When @patch is to be used on multiple types, they are passed in as a tuple, and the new method is patched onto every type in the tuple, as sketched below.
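A minimal sketch of patching two types at once (Widget and Gadget are hypothetical stand-ins for any two patchable classes):

class Widget: pass
class Gadget: pass

@patch
def describe(x:(Widget, Gadget)):
    # the tuple annotation patches `describe` onto both Widget and Gadget
    return f'I am a {type(x).__name__}'

Widget().describe()

[OUTPUT] 'I am a Widget'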

  • @use_kwargs_dict: This is used to set the **kwargs which are to be used in a method call. Think of it as a store for all the **kwargs in a function's signature. An example is shown below.

@use_kwargs_dict(a=10, b=12, c=None)
def rand_func(z=10, **kwargs):
    pass

inspect.signature(rand_func)

[OUTPUT] <Signature (z=10, *, a=10, b=12, c=None)>

NB: @use_kwargs can be used instead of @use_kwargs_dict when we want all the kwargs to default to None. Also note that a list of argument names is passed to @use_kwargs, instead of the key-value pairs passed to @use_kwargs_dict, as sketched below.
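A quick sketch of the @use_kwargs variant, reusing rand_func from above:

@use_kwargs(['a', 'b', 'c'])
def rand_func(z=10, **kwargs):
    pass

inspect.signature(rand_func)

[OUTPUT] <Signature (z=10, *, a=None, b=None, c=None)>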

  • @funcs_kwargs: This is used to replace, at instantiation time, any method of a class whose name is listed in the _methods class attribute. A demo can be seen in the example below.

@funcs_kwargs
class A():
    _methods = ['method1']
    def __init__(self, a=10, **kwargs): pass
    def method1(self): return print('From method 1')
    def method2(self): return print('From method 2')

a_ = A()
a_.method1()

[OUTPUT] From method 1

# We can replace any method of the class that is listed in the
# `_methods` class attribute
a_ = A(method1 = lambda: print('New method 1'))
a_.method1()

[OUTPUT] New method 1

# As we can see, the class method method1 has changed. This change can only
# occur if the method is listed in the `_methods` class attribute

We can optionally use the @method decorator to define a replacement function that takes self, to swap in for a method listed in the _methods class attribute:


@method
def new(self):
    print('New method 1')

a_ = A(method1=new)
a_.method1()

[OUTPUT] New method 1

  • @delegates: This passes the keyword args of a function through to its wrapper, simply by including **kwargs in the wrapper's signature. @delegates also makes autocompletion (Shift+Tab) on the wrapper show the keyword args of the original function; it is most often applied to a class's __init__ (see the sketch after this example). An example is shown below.

def init_func(a=1, b=2):
    return a + b

@delegates(init_func)
def wrap_func(c=10, **kwargs):
    pass

We can now inspect the function signature of this wrapper function:

inspect.signature(wrap_func)

[OUTPUT] <Signature (c=10, a=1, b=2)>

# From the result, we can see that @delegates collates the kwargs from the
# initial function and adds them to the signature of the wrapper function.
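Since fastai mostly applies @delegates to a class's __init__, here is a minimal sketch of that usage (Base and Wrapper are hypothetical names; calling @delegates() with no argument delegates a subclass's __init__ to its parent's):

class Base:
    def __init__(self, a=1, b=2): self.a, self.b = a, b

@delegates()  # no argument: delegate Wrapper.__init__ to Base.__init__
class Wrapper(Base):
    def __init__(self, c=10, **kwargs):
        super().__init__(**kwargs)
        self.c = c

inspect.signature(Wrapper)

[OUTPUT] <Signature (c=10, a=1, b=2)>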

  • @typedispatch: This is the magic that makes all the transforms in fastai work (especially the decodes). It is the heart of fastai2, because every operation in this new library is treated as a transform, and @typedispatch handles them all. You use it to specify the types that a given transform is allowed to operate on. Let's say you have a Pipeline(PILImage.create, ToTensor) which you want to use on either a TfmdLists or a Datasets. Type dispatch makes it possible to know which types in the TfmdLists or Datasets each transform can work on: PILImage.create is only dispatched on PIL.Image types, ToTensor on torch.Tensor types.

# Let's write a generic function that concatenates its string arguments.
# We may want this function to perform a different operation when its
# arguments are no longer strings. We can do that with the @typedispatch decorator.

@typedispatch
def generic_func(a:str, b:str):
    '''function that does concatenation when we have str types'''
    return a + b

@typedispatch
def generic_func(a:int, b:int):
    '''function that does multiplication when we have int types'''
    return a * b

@typedispatch
def generic_func(a:int, b:float):
    '''function that does division when we have int and float types'''
    return a / b

@typedispatch
def generic_func(a:int, b:str):
    '''function that does string multiplication when we have int and str types'''
    return a * b

# str, str types
generic_func('Fastai is', ' awesome!')

[OUTPUT] 'Fastai is awesome!'

# int, int types
generic_func(2, 3)

[OUTPUT] 6

# int, float types
generic_func(2, 3.0)

[OUTPUT] 0.6666666666666666

# int, str types
generic_func(2, 'fastai')

[OUTPUT] 'fastaifastai'

# A single function performing multiple operations
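As a quick sanity check, printing a @typedispatch function displays the (type, type) -> function table it has registered (the exact ordering shown is illustrative):

print(generic_func)

[OUTPUT]
(int,str) -> generic_func
(int,float) -> generic_func
(int,int) -> generic_func
(str,str) -> generic_func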

Thanks to @msivanes, @kshitijpatil09, and everyone in my fastai study group for pushing and helping me to do this.

CORRECTIONS AND CRITICISM ARE HIGHLY WELCOME


Thanks so much for compiling these explanations!

IMO, this list needs to be included prominently in the official fastai2 docs. It’s key to understanding how the fastai source code actually operates - the what, how, and why.

With appreciation, Malcolm


This is placed in the fastcore documentation, as it's from that library, not fastai2. fastcore holds the foundational code.

Type Dispatch: http://fastcore.fast.ai/dispatch

Delegates: http://fastcore.fast.ai/foundation#delegates

Method: http://fastcore.fast.ai/foundation#method

and the rest too: http://fastcore.fast.ai/foundation#Foundational-functions

A wonderful summary though, great job :slight_smile:


Glad it helped

Yes!! Everything there came from the docs. Thanks @muellerzr


Thanks @muellerzr. I did not know that those docs already existed. Yet I still want to risk some gentle dissent, even knowing that questions of documentation usability fall into matters of opinion.

Tendo’s work offers something that the fastai docs do not - the reasons for creating such constructions, how they are used, and examples. Phrases like “Decorator: add f to cls” and “Decorator: replace **kwargs in signature with params from to” are technically correct documentation, but they make my brain freeze up. There’s no explanation of what problem the idiom is solving, how it solves the problem, or when and how to use it. The tests in the notebook also do not serve as examples of its use in context; they only verify that the functions work as intended.

Speaking only for me, the above work lets me grasp the motivation, meaning, and context of these brilliant inventions. It helps me understand the structure and flow of the fastai source code, perhaps even contribute to it with competence. I think that including something like Tendo’s work would make fastai more approachable, usable, and maintainable.

Maybe a separate tutorial would be a way to include context and usage examples. That’s something I would help with.


I do agree with all your sentiment :slight_smile: I think some community help with expanding the fastcore docs may be a good idea, would you agree @sgugger? I presume it’s a ‘to-do’ thing similar to the v2 docs? :slight_smile:

I can definitely help with this. I also agree with @Pomo. It took me a while to grasp what was going on and be able to write this. Jeremy’s code walkthroughs are also a good resource for building an intuition about most of these things if that helps

Expanding docs is always a good idea.


Tendo,

As a start, would you be willing to put your comments and examples into a notebook and publish or PM to me? I’ll be a naive test case, and return edits and questions for clarity. If I can understand, probably most readers will be able to. :wink:


One suggestion I’d offer is to join my remote study group here. Some of us post our work there for review before it gets to the forums.

@Tendo great work compiling the use cases of decorators. I tried out @typedispatch but found weird results; can you or anyone explain these behaviours?

@typedispatch
def generic_func(a:str, b:str):
    '''function that concatenates its arguments as strings'''
    print(f"I received {a}:{type(a)} and {b}:{type(b)}")
    return str(a)+str(b)

print(generic_func("A","B"))

OUT:
I received A:<class 'str'> and B:<class 'str'>
AB
***********************************************
Replaced a:str with just a
***********************************************
@typedispatch
def generic_func(a, b:str):
    '''function that concatenates its arguments as strings'''
    print(f"I received {a}:{type(a)} and {b}:{type(b)}")
    return str(a)+str(b)

print(generic_func("A","B"))

OUT:
I received A:<class 'str'> and B:<class 'str'>
AB

***********************************************
just a as before but passing an int
***********************************************
@typedispatch
def generic_func(a, b:str):
    '''function that concatenates its arguments as strings'''
    print(f"I received {a}:{type(a)} and {b}:{type(b)}")
    return str(a)+str(b)

print(generic_func(1,"B"))

OUT:
1

***********************************************
just a as before but passing a float
***********************************************
@typedispatch
def generic_func(a, b:str):
    '''function that concatenates its arguments as strings'''
    print(f"I received {a}:{type(a)} and {b}:{type(b)}")
    return str(a)+str(b)

print(generic_func(1.5,"B"))

OUT:
1.5

Why is the output just 1 and 1.5 for the last two snippets? It didn’t even execute the function and just returned a. Why this behaviour when an int or float is passed for a, but it works fine for str?
PS: I ran this as a script, and generic_func was modified and re-run each time to get this output, so states are not shared.

I think this has to do with the fact that typedispatch only dispatches to types that are specified in the function definition. If the argument type in the function call doesn’t exist in the function definition, then that argument is bounced back, i.e. it doesn’t pass through the function. Think of typedispatch as a gateway that only allows the types specified in the function definition to pass through.

I have also noticed that if no type is specified in the function definition, then the function allows all types to pass through it, effectively disabling typedispatch. Hope this helps.


When I tried the experiment with a:str but left b with no type, it worked when an int, float, or str was passed for b.

@typedispatch
def generic_func(a:str, b):
    '''function that concatenates its arguments as strings'''
    print(f"I received {a}:{type(a)} and {b}:{type(b)}")
    return str(a)+str(b)

print(generic_func("A",1.5))

OUT:
I received A:<class 'str'> and 1.5:<class 'float'>
A1.5

If b’s type is unspecified it works, but if a’s type is unspecified it just spits out a. So I’m still confused about when the type is handled and when it is not.

More interestingly, if I print the function itself:

@typedispatch
def generic_func(a, b:str):
    '''function that concatenates its arguments as strings'''
    print(f"I received {a}:{type(a)} and {b}:{type(b)}")
    return str(a)+str(b)

print(generic_func(1.5,"B"))
print(generic_func)

OUT:
1.5
(str,object) -> generic_func

I don’t know why it is handled as (str,object)

I am not sure why this behavior occurs. I wonder if @muellerzr could help sort this out when he has the time.


I guess generic dispatch suppresses the more specific one

@typedispatch
def generic_fn(a:str,b:str):
  print("Called 1st")
  return a+b

def generic_fn(a,b:str):
  print("Called 2nd")
  return str(a)+b

print(generic_fn("A","B"))
print(generic_fn(1,"2"))
print(generic_fn(1.5,"2"))
print(generic_fn(True,"2"))

Output:

Called 2nd
AB
Called 2nd
12
Called 2nd
1.52
Called 2nd
True2

@sgugger can you please explain this behaviour? I’m confused about how typedispatch behaves; so far I don’t think anyone here has an exact answer.

In the first example, I don’t understand what you find weird. If you didn’t specify any default behavior yourself, fastcore just uses noop, which returns its first argument. You can change this by implementing

@typedispatch
def generic_func(a, b):

For your second example, typedispatch does not work if you don’t specify anything for a. Annotate it as a: object to get it to work, as in the sketch below.
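Putting both suggestions together in a minimal sketch (this behaviour has shifted between fastcore versions, so treat it as illustrative):

from fastcore.dispatch import typedispatch

@typedispatch
def generic_func(a, b):
    '''no annotations: this becomes the default/fallback registration'''
    return f'fallback: {a}, {b}'

@typedispatch
def generic_func(a:object, b:str):
    '''annotating a as object lets the "any a, str b" case dispatch'''
    return str(a) + b

print(generic_func(1, 'B'))     # dispatches via the (object, str) registration
print(generic_func(1.5, 2.5))   # falls through to the (object, object) fallback

OUT:
1B
fallback: 1.5, 2.5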


Why can’t we separate out default args using use_kwargs_dict?

hm_args = dict(merge=False,figsize=(5,5),alpha=0.6,interpolation='bicubic',cmap='magma')

@use_kwargs_dict(**hm_args)
def show_heatmap(x,cam,sz,**kwargs):
  "Used to show CAM/gradcam"
  print(kwargs)

#output: {}

If I inspect the signature of this function:

<Signature (x, cam, sz, *, merge=False, figsize=(5, 5), alpha=0.6, interpolation='bicubic', cmap='magma')>

So this does show up in the auto-completion, but why can’t I use the default values if I haven’t specified them in the function call?

@sgugger should I consider that this is not supported by fastcore, and that use_kwargs_dict was not meant to be used this way?