Part 2 Lesson 10 IMDB / number of frozen layers doesn't affect the training speed


I am following the IMDB notebook to learn about transfer learning, with the goal of using it later for classification.
At the end of the notebook, we gradually unfreeze more and more layers.

I was expecting the number of iterations/s to drop as layers are unfrozen (more parameters to optimize at each step), but I observe roughly the same training speed regardless of how many layers are frozen.

I’d be happy to understand what is going on here.
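In case it helps, here is a minimal PyTorch sketch of the experiment I have in mind: time a fixed number of training steps while freezing different numbers of layers via `requires_grad`. This is a toy stack of linear layers standing in for the notebook's layer groups, not the actual learner from the lesson:

```python
import time
import torch
import torch.nn as nn

# Toy model: a stack of linear layers standing in for the LM's layer
# groups (hypothetical, not the AWD-LSTM from the notebook).
model = nn.Sequential(*[nn.Linear(256, 256) for _ in range(6)])
x = torch.randn(64, 256)

def time_steps(n_frozen, n_iters=50):
    # Freeze the first n_frozen layers, roughly what freeze_to does
    for i, layer in enumerate(model):
        for p in layer.parameters():
            p.requires_grad = i >= n_frozen
    trainable = [p for p in model.parameters() if p.requires_grad]
    opt = torch.optim.SGD(trainable, lr=0.01)
    start = time.perf_counter()
    for _ in range(n_iters):
        opt.zero_grad()
        model(x).sum().backward()
        opt.step()
    return time.perf_counter() - start

for n in (5, 3, 0):
    print(f"{n} frozen layers: {time_steps(n):.3f}s")
```

In this toy version, too, the timings come out close to each other, which matches what I see in the notebook.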