Fastai v2 chat

As a result of the amazing work @boris has been doing for a while, you need to make sure you have the latest fastcore to run fastai2 (as is always good practice). So if you see an error saying log_args is not defined while trying to use fastai2, make sure your editable install of fastcore is up to date with master.

@log_args(but='dls,model')

What is the functionality of log_args?

Thanks

It adds the args you used to self.init_args.
The idea is to make debugging easier when you have a problem, by letting you see all the args that have been used.
It can also be used by logging callbacks (e.g. WandbCallback) to log those args automatically as config parameters.
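
For intuition, here is a rough sketch of the mechanism (a hypothetical, simplified version that decorates __init__ directly; the real fastai2 decorator wraps the whole class):

import functools, inspect

def log_args(but=''):
    # Record the call's arguments in self.init_args (simplified sketch).
    skip = set(but.split(','))
    def deco(init):
        @functools.wraps(init)
        def wrapper(self, *args, **kwargs):
            bound = inspect.signature(init).bind(self, *args, **kwargs)
            bound.apply_defaults()
            self.init_args = {k: v for k, v in bound.arguments.items()
                              if k != 'self' and k not in skip}
            return init(self, *args, **kwargs)
        return wrapper
    return deco

class Toy:
    @log_args(but='b')
    def __init__(self, a, b=2): self.a, self.b = a, b

Toy(1).init_args  # -> {'a': 1}; a logging callback can pick this dict up as config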

It is going to be really sweet once finished, I’m pretty excited about it :slight_smile:

I see this error too and I am installing like this:

pip install git+https://github.com/fastai/fastai2.git
pip install git+https://github.com/fastai/fastcore.git

Solved by using the following order:

pip install git+https://github.com/fastai/fastcore.git@master
pip install git+https://github.com/fastai/fastai2.git@master

@sgugger maybe it would be a good idea to update the FAQ, because it lists the order that caused this error for me: Fastai-v2 FAQ and links (read this before posting please!)

You might be interested in another approach for Siamese networks: sample pairs of the same class from the dataset, pack them into batches, and mine the negatives within each batch. No Siamese-specific class is needed.

I have implemented it for patch descriptor learning:
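
The batch-mining idea itself can be sketched roughly like this (a hypothetical illustration in plain PyTorch, not the linked implementation):

import torch
import torch.nn.functional as F

def in_batch_triplet_loss(anchors, positives, margin=1.0):
    # anchors, positives: (B, D) embeddings; row i of both comes from the same class
    dist = torch.cdist(anchors, positives)   # (B, B) pairwise distances
    pos = dist.diag()                        # distances of the matched pairs
    # Hardest negative per anchor: the closest embedding from a *different* pair;
    # the diagonal (the true positives) is masked out with a large constant.
    masked = dist + torch.eye(len(dist), device=dist.device) * 1e9
    neg = masked.min(dim=1).values
    return F.relu(pos - neg + margin).mean()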

Btw, is there a way to skip computing validation loss?

You can raise a CancelValidation error (I can't remember the exact name, but you should find it easily) that will skip the validation phase.
If the question is about skipping the loss while still computing the other metrics, however, the answer is no.
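
For reference, the exception is most likely CancelValidException; a minimal sketch of a callback raising it (assuming the usual fastai2 star import exposes Callback and the exception) could look like this:

from fastai2.vision.all import *

class SkipValidation(Callback):
    "Cancel the validation phase as soon as it starts."
    def before_validate(self):  # may be named `begin_validate` in older fastai2 versions
        raise CancelValidException()

learn.add_cb(SkipValidation())  # attach to an existing (hypothetical) `learn`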

I always use

pip install git+https://github.com/fastai/fastcore.git --upgrade
pip install git+https://github.com/fastai/fastai2.git --upgrade

Which is the better approach?

I don’t know, I just added the branch at the end.

Skipping the loss, but still doing validation. OK, I'll write my own callback then.
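
One possible shape for such a callback (a hypothetical, untested sketch: it stubs out the loss function during validation, so metrics computed from the predictions still run, while the recorded validation loss becomes meaningless):

import torch
from fastai2.vision.all import *

class SkipValLoss(Callback):
    "Replace the loss function with a cheap dummy for the validation phase only."
    def before_validate(self):
        self.old_loss = self.learn.loss_func
        self.learn.loss_func = lambda pred, *yb: torch.zeros(1, device=pred.device)
    def after_validate(self):
        self.learn.loss_func = self.old_loss

learn.add_cb(SkipValLoss())  # attach to an existing (hypothetical) `learn`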

So you’ve written imshow_torch to work with that, that’s pretty cool!

I wonder what would happen if you call _pre_show_batch on your dataloaders :thinking:

I was running lr_find and saw this printed:

@log_args had an issue on LabelSmoothingCrossEntropy.__init__ -> missing a required argument: 'self'

It’s not throwing any error though; is there anything to be concerned about?

I think you need to install it again; they updated something and it requires the latest fastcore.

You should try installing it again:

pip install git+https://github.com/fastai/fastcore.git@master
pip install git+https://github.com/fastai/fastai2.git@master

I did, this happened after I updated it. Before installing the latest one I was getting an error.

It’s a warning you can ignore; we are working on fixing those.

Can you share the notebook?
Even if it does not happen anymore, I’d like to check that it works as intended; you may have hit an edge case.

Here is a minimal example: https://colab.research.google.com/drive/1S1HyJs7a0ehwkltsBjGvRLDPSeOcXm0u

Quick question: is there a Resize (presize) method for images in v2, to resize all images before running the training?