Fastai v2 chat

I don’t see why.

Try it with a regular python list and see what happens :slight_smile:

1 Like

I guess I wasn’t clear. When you were demonstrating this new list behavior, it reminded me a little of numpy arrays (e.g. indexing with lists), and multiplying and adding numpy arrays behave differently.

You were quite clear. I still suggest you try it with a regular python list and see what happens!..

Yes I did, and I see it matches python list behavior. It’s just that when you demonstrated it, it reminded me of numpy arrays.

Thanks for the clarification.

1 Like

Note that python list and fastai’s L can hold any kind of data, not just numbers, so element-wise arithmetic doesn’t make sense. + and * are two different ways of extending a list, and they work the same for L as for list.
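For example, here’s a quick comparison (just a sketch; the import path for L is an assumption, since it has moved around during v2 development):

import numpy as np
from fastcore.foundation import L  # import path is an assumption; L may live elsewhere in your version

# list and L treat + as concatenation and * as repetition
[1, 2, 3] + [4, 5, 6]              # [1, 2, 3, 4, 5, 6]
L(1, 2, 3) + L(4, 5, 6)            # (#6) [1,2,3,4,5,6]
[1, 2] * 2                         # [1, 2, 1, 2]

# numpy arrays do element-wise arithmetic instead
np.array([1, 2, 3]) + np.array([4, 5, 6])   # array([5, 7, 9])
np.array([1, 2, 3]) * 2                      # array([2, 4, 6])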

3 Likes

Type dispatch by parameter, and conversion of a function’s output tuple to a specified class – these are amazingly useful semantics.

Is this a built-in capability of Python, or something that the fastai developers have invented and added to Python? I can’t find an explanation in the Python docs.

Thanks for pointing me to where the magic happens.

This is fastai v2 functionality only.

So it’s implemented invisibly using some sort of code introspection? Where, how?

(I must be one of those “middle-level” Python coders.)

You got me curious, so I looked through the code a little further. It is over here:

There is a TypeDispatch class that is later used in the Transform class which is used to achieve the behavior that was demonstrated in the walk-thru.
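From the outside, the effect looks roughly like this (a sketch only; the import path and the pass-through behavior are my assumptions based on reading the v2 code):

from fastcore.transform import Transform  # import path is an assumption

class Doubler(Transform):
    # The metaclass collects both `encodes` methods into a TypeDispatch,
    # and the annotated type of the argument picks which one runs.
    def encodes(self, x: int): return x * 2
    def encodes(self, x: str): return x + x

t = Doubler()
t(3)      # 6      -> the int version was dispatched
t('ab')   # 'abab' -> the str version was dispatched
t(2.5)    # 2.5    -> no matching type, so the input passes through unchanged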

And don’t worry, I am also an intermediate Python coder, and this is a new experience for me too!

Don’t worry, we will get to this in the walk-thrus eventually! :slight_smile: But curious minds are most welcome to dive in before then, of course…

1 Like

Fastai is extensively used in Kaggle competitions. Would it make sense to write functionality similar to untar_data that can use the Kaggle API and make downloading easier?
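Something like this hypothetical helper, perhaps (untar_kaggle is not part of fastai, just a sketch using the official kaggle package, which needs ~/.kaggle/kaggle.json credentials):

from pathlib import Path
import zipfile

def untar_kaggle(competition, dest='.'):
    "Download and extract a Kaggle competition dataset to `dest`"
    import kaggle  # authenticates on import using kaggle.json
    dest = Path(dest)/competition
    dest.mkdir(parents=True, exist_ok=True)
    kaggle.api.competition_download_files(competition, path=dest)
    for z in dest.glob('*.zip'):
        with zipfile.ZipFile(z) as zf: zf.extractall(dest)
    return dest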

3 Likes

I am trying to understand load_image function. Below is the code

def load_image(fn, mode=None, **kwargs):
    "Open and load a `PIL.Image` and convert to `mode`"
    im = Image.open(fn, **kwargs)
    im.load()
    im = im._new(im.im)
    return im.convert(mode) if mode else im

It is in vision.core. What are im.load() and im._new(im.im) doing? I checked the current version of fastai’s open_image function, which does not contain these lines.

Those are Pillow functionalities; docs are here, but essentially .load() allocates storage and loads in the pixel data for our image, and then _new creates a new image based off that loaded data.
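To see what that means with Pillow directly (a small sketch; 'test.png' is just a placeholder path):

from PIL import Image

im = Image.open('test.png')   # lazy: only the file header is read at this point
im.load()                     # forces the pixel data to be read and decoded
im2 = im._new(im.im)          # wraps that loaded pixel data in a fresh Image object
im2.size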

I went through the docs; what I did not understand is why it is required. Removing those two lines did not make any difference.

Jeremy, I’ve noticed that when an object of class L holds data of various types, an attempt to sort it fails:

l = L([1,2,3,'4'])
l.sorted()

TypeError: '<' not supported between instances of 'str' and 'int'

I guess this is a valid behavior, and type checking or type transformation is undesirable/unnecessary. Right?

It’s matching the behavior of list

I’m not sure of a better way to handle it, although I’m open to suggestions.
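Of course, one workaround on the caller’s side is to pass a key, the same as you would with Python’s sorted (assuming L.sorted forwards a key callable, as it appears to):

l = L([1, 2, 3, '4'])
l.sorted(key=str)   # compares everything as strings: (#4) [1,2,3,'4']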

If we don’t do this, then we can’t subclass Image. I don’t recall the details - but I got an exception from PIL otherwise.

1 Like

Thanks.

How do you open notebooks from the repo cloned to Colab, I wonder?

I navigate to the notebook via “Open from GitHub” and, so long as I import the library, it works well. I’m working on an example notebook but am waiting until a fully trainable PETs notebook is working (debugging one last thing right now).

1 Like