I noticed that the batch size is always a power of two (and not only in fastai notebooks).
Does this give any real advantage, or is it just a matter of habit?
Thanks.
It’s probably the computer scientist’s obsession with powers of 2. As long as the batch fits in GPU memory, you can use any batch size you want.
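To make that concrete: nothing about mini-batching requires a power of two. A minimal plain-Python sketch (the dataset size of 1000 and batch size of 42 are arbitrary, illustrative numbers):

```python
def make_batches(n_samples, batch_size):
    """Split sample indices into consecutive batches; the last may be smaller."""
    indices = list(range(n_samples))
    return [indices[i:i + batch_size] for i in range(0, n_samples, batch_size)]

# A non-power-of-two batch size works just as well:
batches = make_batches(1000, 42)
print(len(batches))       # ceil(1000 / 42) = 24 batches
print(len(batches[-1]))   # last batch holds the remaining 1000 - 23*42 = 34 samples
```

The same holds in any framework's data loader: the batch size is just a slicing parameter, not something the hardware constrains to powers of two.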
Good to know, thanks!