SOURCE CODE: Mid-Level API

Ohhhh I just missed it… I thought it was PM… Will you guys upload the recording?


So we’re going to have another session at 7:30 AM IST on April 5 too. This one will be about the Optimizers and the Captum callback.

Today’s session wasn’t very productive actually. We’ll tell you what we did in the next call.


Perfect! See you soon :slight_smile:

Edit: Removed links due to a bug in Zoom that keeps recording anytime someone joins.
Zoom link for today (starts at Sunday, 7:30 AM IST)
Meeting Password: Removed

Edit: Please @ me if you have issues with joining.


Count me in as well. I’ve been obsessed with digging into the internal details of the fastai library. I’ve just started watching your videos; I’m on the first one. When do you have your weekly meetups? I would love to join. @arora_aman


Hi @init_27, when will the link to the recording be posted?

I’m reading the Siamese Tutorial and found the piece of code below:

class SiameseImage(Tuple):
    def show(self, ctx=None, **kwargs):
        img1,img2,same_breed = self
        if not isinstance(img1, Tensor):
            # PIL images: match the sizes, convert to tensors, and move channels first
            if img2.size != img1.size: img2 = img2.resize(img1.size)
            t1,t2 = tensor(img1),tensor(img2)
            t1,t2 = t1.permute(2,0,1),t2.permute(2,0,1)
        else: t1,t2 = img1,img2
        # a 10-pixel black separator between the two images
        line = t1.new_zeros(t1.shape[0], t1.shape[1], 10)
        return show_image(torch.cat([t1,line,t2], dim=2), title=same_breed, ctx=ctx, **kwargs)

I’m curious why we need to return show_image here; I think the picture is shown even without the return. Maybe we need to keep the ax for another purpose like show_batch or something. Thanks

One reason is that your approach would only work in a Jupyter notebook and not in other traditional IDEs, I guess. Also, the title and other fields are ignored if we just return the concatenated tensor…
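Another point: show_image returns the matplotlib axes it drew on, so a caller such as show_batch can hand each sample its own ctx and collect the results on one grid. A rough sketch (siamese_samples is an assumed, already-built list of SiameseImage objects):

import matplotlib.pyplot as plt

# siamese_samples: assumed, pre-constructed list of SiameseImage objects
fig, axs = plt.subplots(1, 3, figsize=(12, 4))
for s, ax in zip(siamese_samples, axs):
    s.show(ctx=ax)   # show() draws on the given axes and returns them
plt.show()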


In DataBlock, why don’t we put the args get_items, splitter, get_y, get_x in __init__?

I mean, when I Shift-Tab to see the signature of DataBlock, I can’t find the args above even though they seem mandatory to me.

I then looked at the source code and found that, with the @funcs_kwargs decorator, all the attributes listed in _methods will be set with values from **kwargs, and _methods = 'get_items splitter get_y get_x'.split().

So now I understand how it works, but why is it made so complex? What is the intent behind it? Thanks

These are added as **kwargs. You might be interested in this blog post from Jeremy
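For example, a sketch using the standard pets-style setup from the docs (get_image_files, RandomSplitter and parent_label are fastai helpers): the getters are passed by name and picked up through **kwargs, even though they don’t appear in the __init__ signature.

from fastai.vision.all import *

# Sketch: get_items/splitter/get_y are consumed from **kwargs by @funcs_kwargs
dblock = DataBlock(blocks=(ImageBlock, CategoryBlock),
                   get_items=get_image_files,
                   splitter=RandomSplitter(seed=42),
                   get_y=parent_label,
                   item_tfms=Resize(224))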


Thanks @nareshr8. But it might be different from what I mean. Sorry if my question is not clear.

The piece of code I’m asking is below:

@funcs_kwargs
class DataBlock():
    "Generic container to quickly build `Datasets` and `DataLoaders`"
    get_x=get_items=splitter=get_y = None
    blocks,dl_type = (TransformBlock,TransformBlock),TfmdDL
    _methods = 'get_items splitter get_y get_x'.split()
    _msg = "If you wanted to compose several transforms in your getter don't forget to wrap them in a `Pipeline`."
    def __init__(self, blocks=None, dl_type=None, getters=None, n_inp=None, item_tfms=None, batch_tfms=None, **kwargs):

It’s decorated by @funcs_kwargs and not @delegates. @funcs_kwargs will set the class attributes defined in _methods ('get_items splitter get_y get_x'.split()) from the values in **kwargs.

This way we can set the attributes get_x, get_items, splitter, get_y.

However, I have to look at the source code or an example to see how to initialize the DataBlock (because autocomplete doesn’t show all the mandatory args).

Why don’t we just put

def __init__(self, blocks=None, dl_type=None, getters=None, n_inp=None, item_tfms=None, batch_tfms=None, get_x=None, get_items=None, splitter=None, get_y=None, **kwargs)

What is the intent behind the @funcs_kwargs decorator?

Thanks


It’s just ease of writing. Otherwise you would have to declare those arguments in __init__ and then do self.x = x for each one; @funcs_kwargs reduces those two steps to one. You just declare _methods with the list of names; they are added to the __init__ call, and each function you pass in is saved on the instance for later use.
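A hand-rolled toy version of the idea (not fastcore’s actual implementation, just a sketch of the mechanism):

def funcs_kwargs_toy(cls):
    "Toy stand-in for @funcs_kwargs: move any kwarg named in _methods onto the instance."
    orig_init = cls.__init__
    def __init__(self, *args, **kwargs):
        for name in cls._methods:
            if name in kwargs: setattr(self, name, kwargs.pop(name))
        orig_init(self, *args, **kwargs)
    cls.__init__ = __init__
    return cls

@funcs_kwargs_toy
class Block:
    _methods = 'get_items get_y'.split()
    def __init__(self, **kwargs): pass

b = Block(get_items=lambda path: [1, 2, 3])
print(b.get_items('.'))  # [1, 2, 3]

The real fastcore decorator handles more details than this, but the core idea is the same.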


Great, that can be a very good reason. Thanks a lot. (However, there is a drawback with autocomplete.)

Are you guys meeting?

Not to my knowledge, no.


Thanks @init_27, enjoy the weekend then. Best regards

Np! You too! :tea:


Hi all – sorry, I ended up being a bit out of the loop with this in the end.

Something I’m trying to work out: say you had a NN written fully in numpy, but still wanted to use the DataLoader or DataBlock API. Is it possible to apply something like np.asarray() as a batch transform? Or is there a better way to do this?
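One possible sketch (dls is an assumed, already-built DataLoaders and numpy_model is a hypothetical numpy-only network): keep the fastai pipeline as-is and convert each batch to arrays at the boundary where it is handed to the numpy code.

import numpy as np

def to_numpy_batch(xb, yb):
    "Convert a batch of tensors to numpy arrays for a numpy-only network."
    return xb.cpu().numpy(), yb.cpu().numpy()

for xb, yb in dls.train:                 # dls: assumed, already-built DataLoaders
    x_np, y_np = to_numpy_batch(xb, yb)
    # preds = numpy_model.forward(x_np)  # hypothetical numpy-only model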

I was going through the DataLoader documentation and found that using batch_sampler is mutually exclusive with batch_size, shuffle, sampler, and drop_last. Can someone explain to me how this works?
PS: mutually exclusive means that two or more events cannot coincide.
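Concretely, a minimal PyTorch sketch: batch_sampler yields complete lists of indices per batch by itself, so the other batching arguments would conflict with it and are left at their defaults.

import torch
from torch.utils.data import DataLoader, TensorDataset, SequentialSampler, BatchSampler

ds = TensorDataset(torch.arange(10).float())

# batch_sampler decides both the grouping and the order of indices itself,
# so batch_size, shuffle, sampler and drop_last are not passed here.
bs = BatchSampler(SequentialSampler(ds), batch_size=4, drop_last=False)
dl = DataLoader(ds, batch_sampler=bs)

for (xb,) in dl:
    print(xb)   # batches of 4, 4 and 2 elements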