Fastai v2 daily code walk-thrus

How Transforms make use of TypeDispatch

Okay, here’s another one! I couldn’t have imagined that I would ever understand this part of v2, but now that I do, it just seems surreal! This is Python at the next level! And when you come to think of it, you can see why it’s built this way.

But let’s discuss the thought process a little later.

First let’s understand encodes and decodes inside Transform!

So, from _TfmDict:

from typing import Callable  # TypeDispatch is defined earlier in the module

class _TfmDict(dict):
    def __setitem__(self,k,v):
        if k=='_': k='encodes'
        # anything that isn't a callable encodes/decodes is stored normally
        if k not in ('encodes','decodes') or not isinstance(v,Callable): return super().__setitem__(k,v)
        # first encodes/decodes seen: create a TypeDispatch to collect them
        if k not in self: super().__setitem__(k,TypeDispatch())
        res = self[k]
        res.add(v)  # register v under the annotation of its first parameter

As long as the key is not encodes or decodes, the class namespace is populated using dict as per normal behavior. Note that __setitem__ is what sets k:v inside a dict, so by overriding it you get custom behavior!

So as long as the key is not encodes or decodes, just use dict to set k:v.
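As a tiny side illustration of that override trick (a made-up example, not from the library — ShoutyDict is an invented name): overriding __setitem__ on a dict subclass is all it takes to intercept every k:v assignment.

```python
# Toy example (not fastai code): a dict subclass that customizes
# __setitem__ to transform values before storing them.
class ShoutyDict(dict):
    def __setitem__(self, k, v):
        # custom behavior: upper-case every string value before storing
        if isinstance(v, str): v = v.upper()
        super().__setitem__(k, v)

d = ShoutyDict()
d['greeting'] = 'hello'
d['n'] = 1
print(d)  # {'greeting': 'HELLO', 'n': 1}
```

_TfmDict does the same kind of interception, just keyed on the *name* being assigned rather than the value.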

BUT, when it IS encodes or decodes, the value stored at k is a TypeDispatch(), and the function gets added to it.

And as we know, TypeDispatch is nothing but a cool dict-like type:function mapping!
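To make that concrete, here is a hedged toy sketch of the idea (this is NOT fastcore's actual implementation — ToyTypeDispatch and everything in it are invented for illustration): a dict of type:function entries, keyed on the annotation of each function's first parameter, with lookup walking the argument type's MRO.

```python
# Toy sketch of the TypeDispatch idea (not the real fastai/fastcore code).
import inspect

class ToyTypeDispatch:
    def __init__(self): self.funcs = {}
    def add(self, f):
        # key on the annotation of f's first parameter
        params = list(inspect.signature(f).parameters.values())
        t = (params[0].annotation
             if params and params[0].annotation is not inspect.Parameter.empty
             else object)
        self.funcs[t] = f
    def __call__(self, x):
        # walk the MRO so a subtype can fall back to its parent's function
        for t in type(x).__mro__:
            if t in self.funcs: return self.funcs[t](x)
        return x  # identity when no type matches

td = ToyTypeDispatch()
def neg(x: int):  return -x
def flip(x: bool): return not x
td.add(neg); td.add(flip)

print(td(3))     # int entry -> -3
print(td(True))  # bool comes before int in the MRO -> False
print(td("hi"))  # no match -> returned unchanged
```

Note how bool dispatches to its own function even though bool is a subclass of int, because bool appears first in its own MRO.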

So, theoretically speaking, the namespace of this special class (whose metaclass is _TfmMeta) will look something like:

{ ...all the usual stuff like __module__: '__main__' etc., AND
 'encodes':
    {
     bool: some_func1,
     int: some_func2,
     numbers.Integral: some_func3
    },
 'decodes':
    {
     bool: some_reverse_func1,
     int: some_reverse_func2,
     numbers.Integral: some_reverse_func3
    },
}

And finally! When you call encodes or decodes on different types, TypeDispatch's __call__ looks up the type of the argument and calls the specific function corresponding to that type!
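Putting the pieces together, here's a hedged end-to-end toy (all names — ToyDispatch, ToyTfmDict, ToyTfmMeta, MyTfm — are invented, and this skips everything fastai's real _TfmMeta does beyond namespace collection): a metaclass whose __prepare__ returns the custom dict, so repeated `def encodes` definitions in a class body accumulate into one type:function table instead of overwriting each other.

```python
# Toy end-to-end sketch (not fastai's actual classes).
import inspect

class ToyDispatch:
    def __init__(self): self.funcs = {}
    def add(self, f):
        p = list(inspect.signature(f).parameters.values())
        # skip `self`; key on the annotation of the first real parameter
        t = (p[1].annotation
             if len(p) > 1 and p[1].annotation is not inspect.Parameter.empty
             else object)
        self.funcs[t] = f
    def __call__(self, obj, x):
        for t in type(x).__mro__:
            if t in self.funcs: return self.funcs[t](obj, x)
        return x

class ToyTfmDict(dict):
    def __setitem__(self, k, v):
        if k != 'encodes' or not callable(v): return super().__setitem__(k, v)
        if k not in self: super().__setitem__(k, ToyDispatch())
        self[k].add(v)   # collect instead of overwrite

class ToyTfmMeta(type):
    @classmethod
    def __prepare__(mcls, name, bases): return ToyTfmDict()
    def __new__(mcls, name, bases, ns):
        return super().__new__(mcls, name, bases, dict(ns))

class MyTfm(metaclass=ToyTfmMeta):
    def encodes(self, x: int): return x + 1
    def encodes(self, x: str): return x.upper()   # does NOT clobber the int one
    def __call__(self, x): return self.encodes(self, x)

t = MyTfm()
print(t(1))     # -> 2 (int version)
print(t("ab"))  # -> 'AB' (str version)
```

The key move is __prepare__: it hands Python the custom dict to use as the class body's namespace, which is exactly why fastai can define encodes several times in one Transform subclass.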

It is all making sense now.

Please correct me if I have understood anything wrong :slight_smile:

The actual behavior, as you noticed, is that it gets the annotation of the first annotated param of f. However, we don’t rely on that behavior in the library, and it’s not guaranteed to always do that in the future, so we don’t document that quirk in the docstring.

@arora_aman these are really great analyses! It would be great if you could also copy them into here, if you have a chance:

There will be a walk-thru today at 2.30pm. Details in the top post.

Thank you, Jeremy!

Done :slight_smile:

A post was merged into an existing topic: Fastai v2 code walk-thru 6

Added the walk-thru 6 video: https://youtu.be/8i9bo5wLSE4

This is the most amazing way to motivate the community to help with v2. Thank you!

It would be awesome to have a quick example of how we could merge different types of data into one DataSource, for example Image, Text, and Tabular. I watched all the videos. Getting https://github.com/EtienneT/fastai-petfinder to do that with v1 was a lot of work; I can’t wait to adapt it to v2.

I think we are at a point where you’ve shown all the building blocks needed to do this, but it would be nice to have a quick high-level code example if you can.

Yes we certainly plan to do this - although we haven’t started on it yet.

There won’t be a walk-thru today - we have a sick nanny so I’ll be with my daughter.

Today’s stream: https://youtu.be/H_7bcfaLrdI

Are we still live?

Not sure if we’re done for today or if Jeremy is getting it back up and running, but that’s his message on YouTube.

Thank you @KevinB!
I missed the message.

Yup we’re done!

Walk-thru video 7 posted ( https://www.youtube.com/watch?v=H_7bcfaLrdI )

There won’t be a walk-thru today, but there will be one tomorrow.

Today’s stream will be at https://youtu.be/yL1un5SH63k

(Also I always put the daily stream link in the top post, in case anyone hadn’t noticed! :slight_smile: )

Perhaps also post the link on the fastai home page and pin it to your Twitter account?

Nah I don’t want to oversell it… :slight_smile:
