How do I find the correct training time for one epoch?

I have been training a DNN model in two different configurations. In the first configuration, I train the whole DNN model. In the second configuration, I train only the fully connected (FC) layers of the model. In theory, training only the FC layers should take less time than training the whole DNN model for any given number of epochs. However, I am getting almost equal times when I train the model using `Learner` from fastai v1, even though the other parameters are frozen. I have verified this through `learn.summary()`.

Screenshot for training of complete DNN model:

Screenshot of training only FC layers:

Hey Awais,

It’s probably better to use fastai v2 if you have the option, as it’s a complete re-write from scratch and designed to be superior to v1.

In general, it’s pretty common for the CPU (data loading and preprocessing) or memory copies between the CPU and GPU to be the bottleneck for very short epochs (such as the ones we see in your images), so you won’t see a big difference in training times between the two configurations when the epochs are so short. Also keep in mind that freezing only saves work in the backward pass: the forward pass still runs through every frozen layer. If you increase the size of your dataset, for example, you will probably see a larger discrepancy in epoch times.
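To illustrate why freezing saves less time than you might expect, here is a minimal PyTorch sketch (the `backbone`/`head` split and all sizes are hypothetical, standing in for a pretrained body and FC head). It shows that the forward pass still runs through the frozen layers, while backward only produces gradients for the unfrozen head:

```python
import torch
import torch.nn as nn

# Hypothetical toy model: a "backbone" standing in for the pretrained body
# and a "head" standing in for the FC layers that stay trainable.
backbone = nn.Sequential(nn.Linear(128, 256), nn.ReLU(),
                         nn.Linear(256, 256), nn.ReLU())
head = nn.Linear(256, 10)
model = nn.Sequential(backbone, head)

# Freeze the backbone, similar in spirit to what fastai's freeze() does.
for p in backbone.parameters():
    p.requires_grad = False

x = torch.randn(64, 128)
y = torch.randint(0, 10, (64,))

out = model(x)                       # forward still runs through ALL layers
loss = nn.CrossEntropyLoss()(out, y)
loss.backward()                      # backward only reaches unfrozen params

# Frozen backbone parameters receive no gradients; the head does.
print(all(p.grad is None for p in backbone.parameters()))
print(all(p.grad is not None for p in head.parameters()))
```

So freezing reduces the backward-pass and optimizer work, but the full forward pass (and all the data loading feeding it) is unchanged, which is why very short epochs can end up taking nearly the same time in both configurations.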
