Fastai v2 chat

On the ensemble subject, does anyone know how to ensemble multiclass models? I understand @muellerzr’s approach of averaging the predictions in a binary classification problem, but can this still be applied in a multiclass problem? From my understanding, if an average is taken in a multiclass problem, it’s possible we will end up with float values instead of integers. How can this be avoided? Or am I totally incorrect?

What about averaging the output probabilities instead of the final predictions?

2 Likes

That’s the method I was trying to describe above as well (and what the code does) :slight_smile:

1 Like

I think in ensembles you are always ‘mixing’ (I should say… ensembling =P ) the predicted probabilities, optionally with different weightings (e.g. in a two-model ensemble, if I think model A is more accurate I can give its output a 60% weighting vs. 40% for model B). It makes no sense to ‘average’ the class prediction integers: if 1=cat, 2=dog, 3=cow, two models that gave their highest probabilities to cat and cow respectively obviously do not mean that the real answer should be dog!
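A minimal sketch of what that weighted averaging looks like (the tensors, class order, and 60/40 split are all made up for illustration):

import torch

# Each row is one sample's softmax output over (cat, dog, cow),
# e.g. what learn.get_preds() would return for each model
preds_a = torch.tensor([[0.70, 0.20, 0.10],   # model A says cat
                        [0.10, 0.30, 0.60]])  # model A says cow
preds_b = torch.tensor([[0.20, 0.20, 0.60],   # model B says cow
                        [0.15, 0.25, 0.60]])  # model B says cow

# Average the probabilities (60/40), then argmax recovers integer classes
ensemble_probs = 0.6 * preds_a + 0.4 * preds_b
ensemble_class = ensemble_probs.argmax(dim=1)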

I guess we can caveat this by saying: unless there is some form of ordinal order in your classes where, say, class 2 really sits in between classes 1 and 3. But in that case, you probably want to do something different anyway, e.g. regression.

Yijin

I am looking for ways to handle an imbalanced dataset for a text classification problem. So far many have pointed out to me that one should use WeightedRandomSampler or weighted cross entropy as the loss function.
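For reference, the weighted-loss attempt looked roughly like this (a sketch: the class weights are made up, dls is assumed to be an existing TextDataLoaders, and CrossEntropyLossFlat passes extra keyword args through to nn.CrossEntropyLoss):

import torch
from fastai2.text.all import *

# Per-class weights: e.g. put 5x weight on the rare class
# (the tensor may need .cuda() to match the model's device)
class_weights = torch.tensor([1.0, 5.0])
learn = text_classifier_learner(dls, AWD_LSTM,
                                loss_func=CrossEntropyLossFlat(weight=class_weights))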

Changing the loss function did not yield any benefits. So I am looking for ways to use WeightedRandomSampler as part of the training data loader, based on the approach described here.

Could you point to any examples in fastai v2? The closest one I can find is the Kaggle kernel by @ilovescience showing OverSamplingCallback, but this is not yet part of v2.

2 Likes

ilovescience did an example here: Oversampling in fastai2 (not sure if you saw this yet :slight_smile: )

3 Likes

@muellerzr - Well, my digging in the forum did not surface this gem :slight_smile: . Thanks for the pointer.
@ilovescience - Appreciate the crisp code example.

1 Like

Not sure if you have seen this example notebook - https://github.com/fastai/fastai2/blob/master/nbs/14a_callback.data.ipynb

1 Like

I’m not sure weighted cross entropy is working properly? I passed weights of 100:1 and 1:100 for a binary classification problem and it made no difference whatsoever to the precision/recall of the final model. I had expected it to skew towards either precision or recall compared to 1:1 weights…
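A quick standalone sanity check, outside of fastai (the logits and labels below are made up): if the weights are actually being applied, the three loss values should differ noticeably.

import torch
import torch.nn as nn

# Dummy logits for 4 samples over 2 classes, plus their true labels
logits = torch.tensor([[2.0, 0.5], [0.3, 1.8], [1.2, 1.1], [0.2, 2.2]])
targets = torch.tensor([0, 1, 0, 1])

for w in ([1., 1.], [100., 1.], [1., 100.]):
    loss = nn.CrossEntropyLoss(weight=torch.tensor(w))(logits, targets)
    print(w, loss.item())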

1 Like

Hi @fmobrj75,

How did you end up resolving this issue? I am facing the same error.

The weights I created are only for the training data, and I am still having trouble getting weighted_dataloaders to work with them.
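For reference, the basic pattern from the 14a_callback.data notebook looks like this on a toy dataset with no split (a sketch; lining the weights up with a train/valid split is exactly where I’m stuck):

import torch
from fastai2.data.all import *
from fastai2.callback.data import *

# Toy Datasets: 160 float items, where item i is sampled with
# weight proportional to i, so larger values appear more often
n = 160
dsets = Datasets(torch.arange(n).float())
dls = dsets.weighted_dataloaders(wgts=range(n), bs=16)
batch = dls.train.one_batch()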

Hello! I’ve got this dataset of pictures that I’m using the vision package with. Some of the pictures have repeating patterns of the image in the corners:

It’s really prominent in the bottom middle one. Does this hurt model performance?

My transforms look like this:

batch_tfms = [*aug_transforms(size=224, max_warp=0, max_rotate=360.0),
              Normalize.from_stats(*imagenet_stats)]
item_tfms = RandomResizedCrop(460, min_scale=0.75, ratio=(1., 1.))
bs = 64

The docs have been moved to a new fastai-docs repo.

The docs dir in fastai2 is now a symlink to ../fastai-docs, so you should clone the docs repo into the same dir that you have fastai2 in.

3 Likes

Heads up for Colab users:

To use PyTorch 1.6 in Colab, you need to do the following when installing fastai:

!pip install torch==1.6.0+cu101 torchvision==0.7.0+cu101 -f https://download.pytorch.org/whl/torch_stable.html

(then of course !pip install fastai2, etc.)

If you’re running CUDA 10.2 then you just need to do !pip install torch torchvision --upgrade
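To confirm which build you ended up with after installing (a quick check):

import torch

print(torch.__version__)              # e.g. 1.6.0+cu101
print(torch.version.cuda)             # CUDA version the wheel was built against
print(torch.cuda.get_device_name(0))  # e.g. Tesla T4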

9 Likes

Are the Colab-preinstalled torch 1.5.1+cu101 and torchvision 0.6.1+cu101 incompatible with fastai2?

Yes. We use torch 1.6 now.

1 Like

Hi everyone!

Just did a fresh pull and install of the fastai2 repo.

I am going through the 02_production notebook.
Ran into issues in the cell that introduces aug_transforms.

Seems the decode fails. The Flip class only stored its p attribute, but I had to store (basically) all the attrs for it to stop complaining that Flip has no attribute "x".

Here is what I had to store:
self.store_attrs = 'p,mode,size,pad_mode,align_corners'

I remember something about this came up in the live coding session (or am I misremembering?). Is anyone facing the same?

Thanks!

1 Like

This is currently being worked on (by me) so please keep that in mind :slight_smile: (Jeremy did not get all the way through it). Thank you for pointing it out though :slight_smile:

1 Like

Gotcha, thank you!

Hey, just followed your instructions. I got this warning in Colab:

/usr/local/lib/python3.6/dist-packages/torch/cuda/__init__.py:125: UserWarning: 
Tesla T4 with CUDA capability sm_75 is not compatible with the current PyTorch installation.
The current PyTorch install supports CUDA capabilities sm_37 sm_50 sm_60 sm_70.
If you want to use the Tesla T4 GPU with PyTorch, please check the instructions at https://pytorch.org/get-started/locally/

Any idea what this is?
Also, would I be able to use fastai2 locally on a Mac?

@benihime91
I think he means that part is only needed if you want to use PyTorch 1.6:

To use PyTorch 1.6 in Colab, you need to do the following when installing fastai:

I did test fastai2 with just !pip install fastai2 and it works fine.

1 Like