Hello,
I am following the imdb notebook to learn about transfer learning for later use in classification.
At the end of the notebook, we gradually unfreeze more and more layers.
I was expecting the number of iterations per second to decrease (more parameters to optimize at each step), but I observe roughly the same training speed regardless of how many layers are frozen.
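To make my assumption concrete, here is a minimal PyTorch sketch (a toy model with made-up sizes, not the notebook's actual language model) of how I understand freezing: only parameters with `requires_grad=True` get gradients and optimizer updates, so unfreezing a layer adds parameters to optimize, which is why I expected a slowdown:

```python
import torch.nn as nn

# Toy stand-in for the pretrained model (hypothetical layer sizes).
model = nn.Sequential(
    nn.Embedding(1000, 64),  # "body" layer 0
    nn.Linear(64, 64),       # "body" layer 1
    nn.Linear(64, 2),        # classification head
)

def trainable_params(m):
    # Count only unfrozen parameters, i.e. those that will be updated.
    return sum(p.numel() for p in m.parameters() if p.requires_grad)

# Freeze everything except the head, mimicking the starting point.
for p in model.parameters():
    p.requires_grad = False
for p in model[-1].parameters():
    p.requires_grad = True
print(trainable_params(model))  # head only

# Gradually unfreeze one more layer, as the notebook does.
for p in model[1].parameters():
    p.requires_grad = True
print(trainable_params(model))  # head + one body layer
```

With this toy model the second count is noticeably larger than the first, so I assumed each optimizer step would also take longer.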
I’d be happy to understand what is going on here.