Yes, the number of epochs is a hyperparameter you have to tune manually, along with the learning rate. But while the choice of LR can be tricky and requires some experience (to correctly read the `lr_find` graph), there is less room for mistakes in the choice of the number of epochs.
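
To make the `lr_find` part concrete, here is a minimal sketch, assuming fastai v2 (the dataset and architecture are just placeholders for illustration):

```python
from fastai.vision.all import *

# Any small dataset works for illustration; MNIST_SAMPLE is built in
path = untar_data(URLs.MNIST_SAMPLE)
dls = ImageDataLoaders.from_folder(path)
learn = vision_learner(dls, resnet18, metrics=accuracy)

# Plots loss vs. learning rate; you read the graph and pick a value
# on the downward slope, before the loss blows up
learn.lr_find()
```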
Indeed, it is difficult to make your model overfit just by training for too many epochs (see Lesson 2, where Jeremy had to remove all regularization and data augmentation to get close to something that looked like overfitting). So the worst that can happen by choosing too many epochs is that you lose time (and money, if you use cloud services), but it won’t hurt your models much.
Also, fastai supports early stopping, so you could pick a high number of epochs (say 100) for your different models and have training stop if the validation loss hasn’t improved for a while, just in case (see the sketch below).
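
Something like this should do it, again assuming fastai v2 and reusing the `dls`/`learn` setup from above (the `min_delta` and `patience` values here are arbitrary, tune them for your problem):

```python
from fastai.callback.tracker import EarlyStoppingCallback

# Train for up to 100 epochs, but stop early if valid_loss hasn't
# improved by at least min_delta for `patience` epochs in a row
learn.fit_one_cycle(
    100,
    cbs=EarlyStoppingCallback(monitor='valid_loss', min_delta=0.01, patience=5),
)
```

You can also pass the callback to the `Learner` constructor instead of to `fit_one_cycle` if you want it active for every fit.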