What's the recommended batch_size?

Hi All,

I read in the fine-tuning code in lesson 1 that the batch size is recommended not to be larger than 64. I don't understand why 128 would be worse than 64. Is there any intuition behind this recommendation?

Thank you,
Omar

As far as I understand, you want the largest batch size that fits into memory.
Therefore, it depends on your computer / EC2 instance.


Working on the State Farm data, I tried increasing the batch size from 64 to 128. When training was almost complete, I got an out-of-memory error on the p2.xlarge AWS instance. Since my understanding is that powers of 2 are preferred for batch size, 64 looks like the sweet spot for the standard AWS setup.
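For reference, here is a rough sketch (my own, not from the course code, and written in PyTorch even though the lesson uses Keras) of how you can probe for the largest batch size that fits in GPU memory: try powers of 2 from large to small and keep the first one that survives a full forward/backward pass.

```python
import torch
import torch.nn as nn

def find_max_batch_size(model, input_shape, candidates=(256, 128, 64, 32, 16)):
    """Return the largest candidate batch size that fits in GPU memory."""
    model = model.cuda()
    criterion = nn.CrossEntropyLoss()
    for bs in candidates:
        try:
            x = torch.randn(bs, *input_shape, device="cuda")
            y = torch.zeros(bs, dtype=torch.long, device="cuda")
            loss = criterion(model(x), y)
            loss.backward()           # backward pass allocates the most memory
            model.zero_grad()
            return bs                 # this batch size fits
        except RuntimeError:          # CUDA out-of-memory raises RuntimeError
            torch.cuda.empty_cache()  # free cached blocks before the next try
    return None
```

Note that an OOM late in training, like the one above, can still slip past a one-pass check like this, so it's safer to leave some headroom below the probed maximum.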


In lesson 11 @Jeremy cited this paper, Systematic evaluation of CNN…
In it the authors recommend using 128 or 256 as the batch_size. I often run into problems with large batch sizes, however, because my GPU's VRAM is too small. The same paper also cites, and experimentally verifies, an interesting relationship between the learning rate and the batch_size. In short:

new_lr = old_lr * batch_size/256

I found that quite remarkable.
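As a quick illustration (my own snippet, not from the paper), the rule simply scales the learning rate linearly with the batch size, using 256 as the reference:

```python
def scaled_lr(old_lr, batch_size, ref_batch_size=256):
    # Linear scaling rule: lr grows in proportion to batch size.
    return old_lr * batch_size / ref_batch_size

print(scaled_lr(0.01, 64))   # 0.0025 -> quarter the batch, quarter the lr
print(scaled_lr(0.01, 512))  # 0.02   -> double the batch, double the lr
```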


Thanks so much for sharing