Developer chat

Hi guys,

I am new to open source development, but would like to start with something small. While reading through the code for lr_find, I found a variable named a being passed to learn.fit as its first argument; in the method definition that parameter is named epochs. So I thought it would be good to rename it to epochs in lr_find as well.

I thought this would be a good chance to learn the process of submitting a PR. Let me know if it's okay to submit a PR for this.

Thanks

You can certainly do that.
Note that, following the fastai style guide, we usually give very short names to variables that are used for a very short span (here just one line), which is why it's named a.

Is it good practice to split models for lr in create_cnn

learn.split(ifnone(split_on, meta['split']))

even if pretrained is set to False ?

It won’t hurt you: if you pass a single learning rate, it is used for all groups.
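A minimal sketch of that broadcasting behavior (illustrative only, not fastai's actual implementation): a single learning rate is simply repeated once per layer group, while a list must match the number of groups.

```python
# Hypothetical helper showing how one lr can serve all layer groups.
def lr_per_group(lr, n_groups):
    """Return one learning rate per layer group."""
    if isinstance(lr, (list, tuple)):
        assert len(lr) == n_groups, "need one lr per group"
        return list(lr)
    return [lr] * n_groups  # a single lr is reused for every group

print(lr_per_group(1e-3, 3))  # → [0.001, 0.001, 0.001]
```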


I am trying to figure out a way to add AdaptiveLogSoftmaxWithLoss as the last layer on a language model. I thought I had a way to get it to work with the callbacks but I feel like I am stuck, again.

I think the problem comes from having the output and the loss tied up in the same function. The loss_batch() in basic_train.py seems to need them to be separated. I thought I could use on_loss_begin to extract the (out, loss) tuple from the final layer and then apply that loss. It also looked like I could use on_backward_begin(loss) to process the yb from that batch. But I still can't quite pull it together.

Has anyone else looked at trying to add this layer? If so, any ideas/pointers?
I also realized, as I write this, that we could have our own AdaptiveLogSoftmax layer and a Loss that are not interconnected. It would mean we loop twice rather than once, but it might be cleaner to fit into the framework.

Any ideas/suggestions are appreciated.
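To make the callback idea above concrete, here is a hypothetical sketch (the class and hook names mirror fastai's callback vocabulary but this is not tested library code): the fused layer returns an (output, loss) tuple, on_loss_begin unpacks it and stashes the loss, and the loss function just hands the stored value back.

```python
# Hypothetical callback separating a fused (output, loss) final layer.
class TupleLossCallback:
    def __init__(self):
        self.stored_loss = None

    def on_loss_begin(self, last_output):
        out, loss = last_output   # unpack the fused layer's tuple
        self.stored_loss = loss   # keep the precomputed loss for later
        return out                # pass plain output along the pipeline

    def loss_func(self, out, target):
        return self.stored_loss   # the loss was already computed upstream

cb = TupleLossCallback()
out = cb.on_loss_begin(([0.1, 0.9], 1.234))
loss = cb.loss_func(out, None)    # target unused: loss came from the layer
```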


Hi.
(I’m using fastai 1.0.45 on Windows 10.) When I run the lesson7-wgan.ipynb notebook, I get the following error (see the other post with the same problem):

NameError: name 'split_bn_bias' is not defined

Fixed in master.


Thanks Sylvain. I updated the fastai/callback.py file in my local lib, but now I get another error:

AttributeError: 'list' object has no attribute 'parameters'

Facing the same issue in colab:
NameError: name 'split_bn_bias' is not defined

I have no idea where this one comes from, I’ll need a reproducible example to fix it.

(I’m using fastai 1.0.45 on Windows 10) The error appears when I run the lesson7-wgan.ipynb notebook.

I am using Google Colab and installed the library using https://course.fast.ai/setup/colab
In the notebook lesson7-superres-gan.ipynb, the following error is thrown while calling fit() on GANLearner:

I think learn.get_preds() isn’t CLI friendly. It sends \r, which impacts the user’s console output. For example, this test we have:

def test_get_preds():
    learn = fake_learner()
    a = learn.get_preds()
    assert learn.data.batch_size == len(a[1])

was resulting in the test name disappearing (notice that the first PASSED is missing its name):

collected 3 items                                                                                                                                                          

PASSED                                                                                                                                       
tests/test_basic_train.py::test_save_load PASSED
tests/test_basic_train.py::test_save_load_mem_leak PASSED

I had to capture its stdout to fix that:

def test_get_preds():
    learn = fake_learner()
    with CaptureStdout() as cs:
        a = learn.get_preds()
    assert learn.data.batch_size == len(a[1])

now we get:

collected 3 items

tests/test_basic_train.py::test_get_preds PASSED
tests/test_basic_train.py::test_save_load PASSED
tests/test_basic_train.py::test_save_load_mem_leak PASSED

but perhaps it’s ok and we just need to document this side effect and how to overcome it?
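For anyone without fastai's test utilities at hand, a minimal stand-in for CaptureStdout can be built from the standard library; it swallows everything printed inside the block, including the \r progress updates that would otherwise clobber pytest's output. (This is an illustrative equivalent, not fastai's actual implementation.)

```python
import io
from contextlib import redirect_stdout

class CaptureStdout:
    """Minimal stdout-capturing context manager (stdlib only)."""
    def __enter__(self):
        self._buf = io.StringIO()
        self._ctx = redirect_stdout(self._buf)
        self._ctx.__enter__()
        return self

    def __exit__(self, *exc):
        self._ctx.__exit__(*exc)
        self.out = self._buf.getvalue()  # captured text, \r included
        return False

with CaptureStdout() as cs:
    print("epoch 1/1", end="\r")  # would otherwise overwrite console text
assert "\r" in cs.out
```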

Ok, I was stupid with my first fix, now it’s really fixed on master.
@nandakumar212 fixed on master means you won’t have the fix in colab until the next release, unless you do a dev install.


Sylvain, many thanks. The lesson7-wgan.ipynb notebook works well now in master with your change in fastai/callback.py.

I was running into issues while experimenting with object detection models. None of the methods Learner.predict, Learner.pred_batch, and Learner.get_preds work. The one method that works is Learner.show_results. I saw a TODO by Jeremy back in December about refactoring the code there to work with pred_batch(reconstruct=True). The use of attaching a RecordOnCPU callback to capture the input and target is also rather unintuitive to follow. Is there any new thinking/progress on this? (I have to admit, the library’s intricate design is powerful but also quite formidable for a newcomer to grok. Awesome work and kudos nevertheless!)

Yes, making object detection work end to end is on our TODO and will be done before the second part of the course begins, but it probably won’t fully work right now. I’d stick with calling the models explicitly for now, until we have sorted this out.


I have written a MultiTask API for one of my projects. It extends the DataBlock API. I don’t know whether it’s worth adding to the library.


That looks very nice! It may be a bit specific to be included in fastai, but we can definitely link to it with other examples of custom ItemList or custom model definitions.