How do you estimate how long training / testing is going to take?


Hello All, I am still a substantial noob with deep learning and the related coding. That said, my main problem at this stage is that I cannot predict how long training / testing is going to take. I am aware of data sizes and network complexity, number of parameters and operations, but the deeper I go, the less control I have overall. With hyperparameter optimisers it gets even worse: my local rig (GTX 1070 8GB, Win 10, 16GB RAM) starts computing and I can't say whether it is going to take 20 sec, 2 min, 20 min, 2 hours or even more. Is there a practical way (a progressive tutorial, your direct experience, whatever) to help me form a clue and make an informed guess before I hit the start button? Thanks in advance and happy coding! Gius
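(Not a full answer, but the usual practical trick is to time a handful of batches after a short warm-up and extrapolate, since the first batches carry one-off costs like graph building, cuDNN autotuning and memory allocation. A minimal framework-agnostic sketch, where `train_step` stands in for whatever callable runs one batch on your setup, e.g. a closure around Keras' `model.train_on_batch`:)

```python
import time

def estimate_training_time(train_step, n_batches_per_epoch, n_epochs,
                           warmup=3, sample=10):
    """Time `sample` batches after `warmup` throwaway batches,
    then extrapolate to the full run.

    The warm-up batches are discarded because they include one-off
    costs (graph compilation, autotuning, allocator warm-up) that
    would inflate the per-batch estimate.
    """
    for _ in range(warmup):
        train_step()
    start = time.perf_counter()
    for _ in range(sample):
        train_step()
    per_batch = (time.perf_counter() - start) / sample
    return per_batch * n_batches_per_epoch * n_epochs

# hypothetical stand-in for a real training step, just to show usage
def fake_step():
    time.sleep(0.001)

eta_seconds = estimate_training_time(fake_step,
                                     n_batches_per_epoch=500,
                                     n_epochs=10)
print(f"estimated total: {eta_seconds:.1f} s")
```

(For a hyperparameter search you would multiply the estimate by the number of trials; the extrapolation is only as good as the assumption that batch time stays constant, which breaks if your image size or batch size varies between trials.)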


Ok, I got a bit of a feel for this problem from single-epoch experiments with different datasets and different image sizes. Now I am trying to minimise the stress on GPU and CPU in order to keep my rig's temperature under control and avoid thermal throttling (as if reapplying thermal paste were easy…) EDIT: this turned out to be easy too: my laptop's gaming settings cope perfectly as long as I use stable wrappers like Keras on top of state-of-the-art networks and conservative image sizes like 64x64.
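(If anyone wants to watch for throttling numerically rather than by fan noise, `nvidia-smi` can report temperature, utilisation and SM clock in CSV form. A small sketch; the parsing is demonstrated on a canned sample line so it runs even without a GPU present, and `read_gpu_stats` is the part that needs an actual NVIDIA driver:)

```python
import subprocess

# query string for nvidia-smi's CSV output mode
QUERY = ["nvidia-smi",
         "--query-gpu=temperature.gpu,utilization.gpu,clocks.sm",
         "--format=csv,noheader,nounits"]

def parse_gpu_stats(csv_line):
    """Parse one CSV line like '71, 98, 1683' into a dict."""
    temp, util, clock = (int(x.strip()) for x in csv_line.split(","))
    return {"temp_c": temp, "util_pct": util, "sm_mhz": clock}

def read_gpu_stats():
    # real call: requires an NVIDIA driver and nvidia-smi on PATH
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
    return parse_gpu_stats(out.stdout.splitlines()[0])

# canned sample output so the parsing can be checked without a GPU
sample = "71, 98, 1683"
print(parse_gpu_stats(sample))
```

(A dropping `clocks.sm` value while `temperature.gpu` sits at its ceiling is the usual signature of thermal throttling.)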