Fastai v2 chat

Yes. I guess you were lagging before because you did not have the editable install.

I had the editable install for a long time now. I will monitor that and I will let you know if there is something.

Thank you again

In the dev.fast.ai install section (http://dev.fast.ai/#Installing) it is not clear what "fastcore" is and how to link the fastai2 repository to fastcore. Shouldn't fastai2's master branch already have the latest fastcore master?

These are two separate packages. When you are using a released version of fastai2, we can control the minimal version of fastcore you have through the requirements, but for an editable install we don't have a way to do this automatically, so you need to make sure to pull them both.

Especially when we are working on changes that impact both packages at the same time like right now.

2 Likes

Can I make my network output a specific type of tensor? e.g. TensorImage

I'm doing style transfer and I have an Image-->Image scenario. Of course I tried just wrapping the output of the model with TensorImage(pred), but this seems to break gradient flow.

What I want to do is take the model output and apply some TensorImage-specific transforms to it (mainly Normalize). I'll then take this result and feed it to the feature model (standard in the style transfer technique). I need the gradients to flow all the way back to the model, but wrapping with TensorImage seems to break the chain.

1 Like

Yes, our custom types do not support gradient computation. You'll have to add that transform manually.
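
For example, something along these lines, where `generator`, `feature_model` and the ImageNet stats are placeholders for whatever your setup uses: keep the prediction as a plain tensor and apply the normalization yourself, so the autograd history stays intact.

```python
import torch

# Placeholder ImageNet stats – substitute whatever stats your feature network expects.
mean = torch.tensor([0.485, 0.456, 0.406]).view(1, 3, 1, 1)
std  = torch.tensor([0.229, 0.224, 0.225]).view(1, 3, 1, 1)

def normalize_pred(pred):
    "Normalize the raw model output as a plain tensor, keeping the autograd history."
    return (pred - mean.to(pred.device)) / std.to(pred.device)

# pred  = generator(x)                         # stays a plain torch.Tensor
# feats = feature_model(normalize_pred(pred))  # gradients flow back to the generator
```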

2 Likes

Is there still functionality in show_batch to display the image axes? It looks like hide_axis=False doesn't work anymore. It's difficult to tell whether there's a new argument among the kwargs.

Also, I saw a Twitter post about automatic normalization based on whether you are using a pretrained model or not. If I set pretrained to False, does this mean it will automatically calculate normalization stats from my dataset?

1 Like

Is there a reason for this, or is it just in a long "todo" list?

The subclasses need to create new tensors to work, so there is no way to support gradient computation, at least until PyTorch implements OO tensors properly.
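
A rough illustration of the point, using a bare torch.Tensor subclass as a stand-in for our types: to build the new (subclassed) object you have to go through the data, and the resulting tensor has no grad_fn, so backward() can no longer reach the original leaf.

```python
import torch

class MyImage(torch.Tensor): pass   # stand-in for a fastai2-style tensor subclass

x = torch.randn(2, 3, requires_grad=True)
y = x * 2
print(y.grad_fn)                    # <MulBackward0 ...>: y remembers how it was computed

z = MyImage(y.detach())             # a brand-new tensor object built from y's data
print(type(z).__name__, z.grad_fn)  # MyImage None: the history is gone
```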

2 Likes

No, there is not: hide_axis is hard-coded to False, so you need to create a new image type with its own show method, or monkey-patch the show method of the type you use.
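
For the monkey-patching route, a hedged sketch (this assumes the show method returns the matplotlib Axes it drew on, as show_image does; adapt it for the type you actually display):

```python
from fastai2.vision.all import *

_orig_show = TensorImage.show

def _show_with_axes(self, ctx=None, **kwargs):
    ax = _orig_show(self, ctx=ctx, **kwargs)  # draw exactly as before
    ax.set_axis_on()                          # turn the axes back on
    return ax

TensorImage.show = _show_with_axes
```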

If you pass the Normalize transform in your batch transforms, it will compute the stats on the first batch.
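
Roughly like this, with a placeholder dataset; since no stats are passed, Normalize fills them in from the first batch during setup:

```python
from fastai2.vision.all import *

path = untar_data(URLs.MNIST_SAMPLE)          # placeholder dataset organised by class folders
dblock = DataBlock(blocks=(ImageBlock, CategoryBlock),
                   get_items=get_image_files, get_y=parent_label,
                   splitter=GrandparentSplitter(valid_name='valid'),
                   batch_tfms=[Normalize()])  # no stats given: computed from the first batch
dls = dblock.dataloaders(path)
```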

1 Like

I'm no expert in the internals of PyTorch, so pardon my ignorance.

The gradients of a tensor are stored as an attribute, right? When copying the tensor to the subclass, can we just create a reference to the original gradient?

No, the gradients are only stored as an attribute once they are computed. Creating a new object of a subclass removes the history of its computation, and that's what makes the gradient computation fail when creating new objects.
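
A quick illustration of both points in plain PyTorch:

```python
import torch

x = torch.randn(3, requires_grad=True)
loss = (x * 2).sum()
print(x.grad)         # None: .grad is just an attribute, only filled in by backward()
loss.backward()
print(x.grad)         # tensor([2., 2., 2.])

# The history (grad_fn) lives on the tensor objects in the graph, not in the data, so a
# freshly constructed tensor holding the same values has nothing to backpropagate through.
fresh = torch.tensor(x.tolist())
print(fresh.grad_fn)  # None
```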

In general, the whole tensor OO machinery is intended for preprocessing.

1 Like

Now that we are getting into this subject, just out of curiosity, can you briefly explain why we need to create a copy when casting to the new type?

Looking at TensorBase and cast, it seems the only thing we need to do is set res.__class__ = typ. Obviously this is not the only thing that is happening, because the tensor really is being copied, as you said, but I'm failing to see where.

You were the one saying it broke the gradient computation :wink: I know no more than you do; I'm just saying what we tested and had working for us.

1 Like

Yeah lol, what I understood from the previous answer was that fastai was the one creating the copies and breaking the gradients. But it's actually PyTorch's fault that this happens.

I wonder if a composition-instead-of-inheritance approach to the custom types would get around this issue.

A minor thing, but I have not seen it mentioned before: importing the all.py modules seems to conflict with the built-in all() function:
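
e.g. a minimal repro of the kind of thing I mean, where the module itself gets bound to the name all:

```python
from fastai2.vision import all   # binds the *module* to the name `all`
print(callable(all), type(all))  # False <class 'module'>: the built-in is shadowed

del all                          # drop the shadowing name; the built-in is visible again
print(all([True, False]))        # False, as expected

# The usual `from fastai2.vision.all import *` never binds the name `all`,
# so it does not clash with the built-in.
```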

1 Like

I may be blanking here, but where's the equivalent of c2i in our DataLoaders?

Edit: lives in vocab
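
For reference, assuming dls is a classification DataLoaders (e.g. built as in the Normalize sketch above):

```python
dls.vocab        # e.g. ['3', '7']      – index -> class, the old `classes`
dls.vocab.o2i    # e.g. {'3': 0, '7': 1} – class -> index, the v1 c2i equivalent
```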

If you were a fan of ClassConfusion in fastai1, I've ported it over to v2:

Currently it just supports Colab, but I'm working on bringing it to native Jupyter. It supports image and tabular classification. If you don't know what it is, see here:

https://docs.fast.ai/widgets.class_confusion.html
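
Usage is along the same lines as the fastai1 widget in those docs (the v2 port's import path and exact arguments may differ slightly), e.g.:

```python
# interp is a ClassificationInterpretation built from your trained Learner
interp = ClassificationInterpretation.from_learner(learn)

# Drill into just the classes you care about from the confusion matrix
ClassConfusion(interp, ['cat', 'dog'])
```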

7 Likes

Nice contribution! Really useful tool, pity I work mainly on multi-label problems.

1 Like

I made it modular by design so I could eventually support more. How would you expect multi-label behavior to look? :slight_smile: I'd assume it's similar to how images work, except you can have a number of class combinations present?