Precompute vs. Freezing

Hi! Could you help me understand why data augmentation does not work with precompute=True? For example, an image A has been passed through the network to cache all the activation values. Now with augmented images A1 and A2, shouldn't the network automatically recognize that they are different from A and recalculate the activations?


In https://github.com/fastai/fastai/blob/master/courses/dl1/lesson1-vgg.ipynb

In [7]: learn.precompute=False
In [8]: learn.fit(1e-2, 1, cycle_len=1)

I think In [7] is not necessary here; it should come after In [8], e.g.:

In [7]: # learn.precompute=False
In [8]: learn.fit(1e-2, 1, cycle_len=1)
In [9]: learn.precompute=False

UPDATE: After reading lesson 1 again: In [7] learn.precompute=False is necessary to train the last layer using the augmented images (with learn.precompute=True, the augmented images are ignored).
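To see why precompute=True ignores augmentation, here is a minimal sketch (not fastai's actual implementation; all names below are made up for illustration): the precomputed activations are cached once per dataset index from the original image, so augmented variants never pass through the convolutional layers at all.

```python
def conv_features(image):
    # Stand-in for the frozen convolutional body of the network.
    return [pixel * 0.5 for pixel in image]

def augment(image, seed):
    # Stand-in random augmentation: each seed yields a different variant.
    return [pixel + seed for pixel in image]

class FeatureDataset:
    """Toy dataset illustrating cached vs. freshly computed activations."""

    def __init__(self, images, precompute=True):
        self.images = images
        self.precompute = precompute
        # With precompute=True, features for each index are computed once,
        # from the *original* image, and reused every epoch.
        if precompute:
            self.cache = {i: conv_features(img) for i, img in enumerate(images)}

    def get(self, i, seed=0):
        if self.precompute:
            return self.cache[i]  # augmentation never applied
        # Without precompute, each request runs a fresh forward pass
        # on a (possibly augmented) image.
        return conv_features(augment(self.images[i], seed))

images = [[1.0, 2.0], [3.0, 4.0]]

ds = FeatureDataset(images, precompute=True)
# Identical cached features no matter which augmented variant was requested:
assert ds.get(0, seed=1) == ds.get(0, seed=2)

ds2 = FeatureDataset(images, precompute=False)
# Without precompute, different augmentations give different activations:
assert ds2.get(0, seed=1) != ds2.get(0, seed=2)
```

This is also why turning precompute off before learn.fit matters: only then does each epoch see freshly augmented images.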
