Lesson 2 - Non-beginner discussion

Hi everybody,

Has the get_transforms() function been totally removed? That was handy to add a set of default transformations.
Is there any other way to add default transforms?

It's aug_transforms() in fastai2.
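For anyone else making the switch, here's a minimal sketch of how aug_transforms() typically slots in (the dataset and labeling function are just the standard pets example, not anything specific to this thread):

```python
from fastai.vision.all import *

# Example dataset, purely for illustration
path = untar_data(URLs.PETS)/"images"

# aug_transforms() returns a list of batch-level augmentations
# (flips, rotation, zoom, lighting, warp), roughly filling the role
# that get_transforms() played in fastai v1.
dls = ImageDataLoaders.from_name_func(
    path, get_image_files(path),
    label_func=lambda f: f.name[0].isupper(),  # pets: uppercase filename = cat
    valid_pct=0.2,
    item_tfms=Resize(224),        # per-item resize on the CPU
    batch_tfms=aug_transforms(),  # default augmentations applied to GPU batches
)
```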


Oh okay. Thank you! 🙂

About catastrophic forgetting.

I was recently (well, still currently) learning about optimization of neural networks in federated learning.

In federated learning, training is done on (usually) mobile devices that hold their own private training data, and the results are then combined to form a global model. The issue I was investigating was how the non-IID nature of the data (for example, different phones holding very different kinds of images) affects the learning process.

So the problem is: when learning on one device that has a lot of pictures of birds and on another device that has a lot of images of buildings, how can the models be combined so that the global model actually learns to recognize both birds and buildings?
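To make the "combine the results" step concrete, here's a rough sketch of FedAvg-style weight averaging (the function name and structure are my own, not from the EWC paper linked below):

```python
import copy
import torch

def fed_avg(client_states, client_sizes):
    """Combine per-client model state_dicts into a global model by
    averaging the weights, weighted by each client's number of training
    samples (FedAvg-style). Assumes all clients share one architecture."""
    total = sum(client_sizes)
    global_state = copy.deepcopy(client_states[0])
    for key in global_state:
        global_state[key] = sum(
            state[key].float() * (n / total)
            for state, n in zip(client_states, client_sizes)
        )
    return global_state

# Usage sketch: global_model.load_state_dict(fed_avg(states, sizes))
```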

Here’s the paper I’m referring to:

So they use something called Elastic Weight Consolidation (EWC). The basic problem EWC tries to solve is how to learn tasks A and B sequentially with a single model. The idea is to identify the weights that are important for task A and then, when learning task B, add a penalty to the loss function for modifying those weights.
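As I understand it, the penalty is a quadratic term weighted by an estimate of how important each weight was for task A (the diagonal Fisher information). A rough PyTorch sketch of that loss term (the names and the lambda value are mine, not from the paper):

```python
import torch

def ewc_penalty(model, fisher, star_params, lam=1000.0):
    """EWC regularizer: lam/2 * sum_i F_i * (theta_i - theta*_i)^2,
    where theta* are the weights after training task A and fisher[name]
    estimates each weight's importance for task A (diagonal Fisher)."""
    penalty = torch.tensor(0.0)
    for name, p in model.named_parameters():
        if name in fisher:
            penalty = penalty + (fisher[name] * (p - star_params[name]) ** 2).sum()
    return 0.5 * lam * penalty

# During task B training:
# loss = task_b_loss + ewc_penalty(model, fisher, star_params)
```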

For some reason I did not see the connection to transfer learning at all until Jeremy explicitly forced it into my head during Lesson 2.

So that got me wondering: would it be possible to apply EWC to transfer learning?

In what kind of scenarios does the “catastrophic forgetting” happen in transfer learning?

I’m having trouble coming up with a scenario that would not require replacing the output layer (and thus throwing away the ability to classify the ImageNet categories after learning cats vs. dogs)…
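To illustrate the idea I'm asking about: keep the pretrained body, replace only the head, and use EWC to protect the body's weights while fine-tuning. This is purely speculative and reuses the hypothetical ewc_penalty(), model, fisher, and star_params from the sketch above (with the new head's parameters simply left out of the fisher dict, since they have no task-A importance):

```python
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

for xb, yb in train_loader:  # standard (inputs, labels) batches
    optimizer.zero_grad()
    # Task-B loss plus the EWC term pulling body weights toward theta*
    loss = criterion(model(xb), yb) + ewc_penalty(model, fisher, star_params)
    loss.backward()
    optimizer.step()
```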