Algorithm to estimate training time

Before starting a new machine learning side project, it would be very useful to estimate how long training will take for 1, 10, 100, or 1k epochs.

Given the variables below, can you recommend any heuristics that could provide an estimate?

  1. Problem type (e.g. Image Segmentation)
  2. Model type (e.g. Fastai Unet)
  3. Dataset (e.g. 10k images, 512x512)
  4. Compute (e.g. AWS p2.xlarge)
  5. Library (e.g. FastAi)
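One simple heuristic built from those variables is pure arithmetic: batches per epoch times seconds per batch times epoch count. The sketch below assumes hypothetical numbers for the example inputs (10k images, batch size 8, ~0.5 s/batch on a p2.xlarge) — these are illustrative assumptions, not benchmarks:

```python
import math

def estimate_training_time(num_samples: int, batch_size: int,
                           sec_per_batch: float, epochs: int) -> float:
    """Rough wall-clock estimate in seconds: (batches per epoch)
    * (seconds per batch) * (number of epochs)."""
    batches_per_epoch = math.ceil(num_samples / batch_size)
    return batches_per_epoch * sec_per_batch * epochs

# Assumed inputs: 10k 512x512 images, batch size 8, 0.5 s/batch (a guess
# for a K80 on a p2.xlarge) -- tune sec_per_batch from a quick measurement.
for epochs in (1, 10, 100, 1000):
    hours = estimate_training_time(10_000, 8, 0.5, epochs) / 3600
    print(f"{epochs:>4} epochs ~ {hours:.1f} h")
```

The hard part is `sec_per_batch`, which depends on all five variables at once; that is why measuring it directly (see below) tends to beat guessing it from specs.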

Is an empirical method (e.g. train on smaller subsets of the data and scale accordingly) a better approach to solving this problem?
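The empirical idea can be sketched as: time a few epochs on a small subset, then scale linearly to the full dataset size. `train_one_epoch` here is a hypothetical callable you would supply (e.g. wrapping a fastai fit call over a subset `DataLoader`); the warmup pass is there because the first epoch often pays one-off costs (CUDA kernel compilation, data caching):

```python
import time

def extrapolate_epoch_time(train_one_epoch, subset, full_size,
                           warmup=1, reps=3):
    """Measure seconds per epoch on `subset`, then scale linearly to a
    dataset of `full_size` items. Returns estimated seconds per full epoch."""
    for _ in range(warmup):              # discard one-off startup costs
        train_one_epoch(subset)
    start = time.perf_counter()
    for _ in range(reps):                # average a few runs for stability
        train_one_epoch(subset)
    per_subset_epoch = (time.perf_counter() - start) / reps
    return per_subset_epoch * (full_size / len(subset))
```

Linear scaling is itself an assumption — it holds when the per-batch cost is constant, but data loading bottlenecks or learning-rate-dependent behaviour can bend it, so measuring at two subset sizes is a cheap sanity check.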


I posted the same question on Stack Exchange in case anyone is interested.