Lesson 12 (2019) discussion and wiki

(Rachel Thomas) #1

Lesson resources

Software requirements

  • Some notebooks in this lesson also currently require pytorch-nightly; if yours got uninstalled, see this.
  • Notebook 10c (and subsequent notebooks) requires the NVIDIA apex Python library. In the environment you created for fastai, go to the fastai directory and run pip install git+https://github.com/NVIDIA/apex.

Papers

AMA about Swift at the end of class

Post your questions in this separate thread:

Notes and other resources

11 Likes

2019 Part 2 Lessons, Links and Updates
(Jeremy Howard (Admin)) pinned #2
0 Likes

(Pierre Ouannes) #15

Kind of a tangential question but since Jeremy mentioned it: do you have any tips on debugging Deep Learning models?

9 Likes

(Paul M) #20

Has mixup been successfully used in NLP yet?

0 Likes

#21

Not that I know of, but you should definitely try :wink:

2 Likes

(Vishal Pandey) #22

The audio module won’t be covered today?

1 Like

(Rachel Thomas) #24

Last week we shared an updated schedule. The audio module will be covered in an extra session that will be livestreamed once the course ends. We had more material than will fit in the 7 weeks of the course.

Edited to add: the dates of the extra sessions have not been set yet.

8 Likes

(Brad) #25

Has anyone tried mixup and normal augmentation like rotation/zoom?

Seems like they’d be different augmentations that could be used together.

1 Like

(Christine) #26

I tried playing around with mixup on NLP embeddings this past week, and from early experiments it seems to work well (maybe someone else has already spent more time on it!)
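To make the idea concrete, here is a minimal sketch of what mixing at the embedding level could look like. This is my own illustration, not the actual experiment; all variable names, shapes, and the alpha value are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two batches of token embeddings: (batch, seq_len, emb_dim).
emb_a = rng.normal(size=(4, 10, 300))
emb_b = rng.normal(size=(4, 10, 300))

# One-hot labels for an assumed 5-class problem.
y_a = np.eye(5)[rng.integers(0, 5, size=4)]
y_b = np.eye(5)[rng.integers(0, 5, size=4)]

# Draw the mixing coefficient from Beta(alpha, alpha), as in the mixup paper.
lam = rng.beta(0.4, 0.4)

# Interpolate embeddings and labels with the same lambda.
emb_mixed = lam * emb_a + (1 - lam) * emb_b
y_mixed = lam * y_a + (1 - lam) * y_b
```

Since the tokens themselves are discrete, interpolating the embedding vectors is the natural place to apply mixup in NLP.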

13 Likes

#27

Yes, we tried, and in our experiments the results were the same as mixup without normal augmentation.

2 Likes

(Brad) #28

Do you have an intuition for why that might be?

0 Likes

(Edward Easling) #29

How broadly can we apply mixup? Could you use it on an image regression problem?

0 Likes

#30

Mixup is a much more powerful form of data augmentation, so it more or less erases the benefit of everything else.

4 Likes

(benjmann) #31

On a Mac you can just type ctrl-cmd-space and then type the name of the letter (e.g. ‘gamma’).

1 Like

#32

As long as there is a way to mix up your labels, you can try. It has been widely experimented with in single-label classification, not so much in other areas.
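A sketch of what this means in practice: the standard implementation mixes a batch with a shuffled copy of itself, and the same interpolation works whether the targets are one-hot labels or continuous regression targets. This is my own generic NumPy illustration, not fastai's implementation; the function name and alpha value are assumptions.

```python
import numpy as np

def mixup_batch(x, y, alpha=0.4, rng=None):
    """Mix a batch with a shuffled copy of itself.

    y can be one-hot labels or continuous regression targets --
    anything that can be linearly interpolated.
    """
    rng = rng if rng is not None else np.random.default_rng()
    lam = rng.beta(alpha, alpha)          # mixing coefficient
    perm = rng.permutation(len(x))        # pairing within the batch
    return lam * x + (1 - lam) * x[perm], lam * y + (1 - lam) * y[perm]

# Works the same for an image-regression target as for classification:
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 3, 8, 8))   # a batch of small "images"
t = rng.normal(size=(8, 1))         # continuous targets
x_mix, t_mix = mixup_batch(x, t, rng=rng)
```

So for the image-regression question above, interpolating the continuous targets with the same lambda as the inputs is a reasonable thing to try.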

0 Likes

(Alena Harley) #33

It seems mixup forces the model to behave linearly between training classes; why is this desirable?

0 Likes

#34

I’d reverse it into: why is this not desirable?
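To make the "linear behaviour" point concrete, here is a tiny check (my own illustration, not from the lecture): the mixup target asks the model to satisfy f(lam*x1 + (1-lam)*x2) = lam*y1 + (1-lam)*y2, and a purely linear map satisfies this exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

# A linear "model" f(x) = W @ x satisfies the mixup target exactly:
# f(lam*x1 + (1-lam)*x2) == lam*f(x1) + (1-lam)*f(x2).
W = rng.normal(size=(5, 10))
x1, x2 = rng.normal(size=10), rng.normal(size=10)
lam = 0.3

lhs = W @ (lam * x1 + (1 - lam) * x2)
rhs = lam * (W @ x1) + (1 - lam) * (W @ x2)
assert np.allclose(lhs, rhs)

# A nonlinear network can't satisfy this everywhere, so training on mixed
# examples acts as a regularizer, pushing the model toward smooth, roughly
# linear behaviour between training points.
```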

3 Likes

(Alena Harley) #35

Robustness against adversarial attacks, I guess.

0 Likes

(Sathya Iyer) #37

What about backprop with new loss?

0 Likes

#38

Some researchers actually found it helps against adversarial attacks (for a generalized version of Mixup): here.

4 Likes