What is the relationship between epoch and batch size? How do I set the batch size correctly?
At 1:19 the teacher talks about epochs and batch size: at each epoch we take batches of 64 images…
I noticed I was unable to plot the learning rate with `learn.sched.plot()` until I set the batch size to 6. I have 200 images (100 of each type), with a 75% training / 15% validation split. When I inspected the current batch size with `learn.data.bs`, it was already set to 64 before I changed it for my dataset.
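To show my current understanding, here is a minimal sketch (plain Python, not fastai) of how I think epoch and batch size relate. The `n_train = 150` value is my assumption of 75% of the 200 images going to training:

```python
import math

# One epoch = one full pass over the training set.
# Batch size = number of images processed per gradient step.
# So: batches per epoch = ceil(n_train / batch_size).

n_train = 150  # assumption: 75% of my 200 images used for training

for bs in (64, 6):
    batches = math.ceil(n_train / bs)
    print(f"bs={bs}: {batches} batches per epoch")
# bs=64 gives only 3 batches per epoch, while bs=6 gives 25
```

Is the plot failing simply because bs=64 leaves too few batches (iterations) in one epoch for the LR finder to produce a useful curve?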
Thanks for any clarification!