Batch size and powers of 2

I noticed that the batch size is always a power of two (and not only in fastai notebooks).

Does this give any real advantage, or is it just a matter of habit?

Thanks.

It’s probably the computer scientist’s obsession with powers of 2. :smiley: As long as it fits on the GPU, you can use any batch size you want.
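For example, here's a minimal sketch in plain PyTorch (not fastai-specific) showing that an arbitrary, non-power-of-two batch size is perfectly valid:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Dummy dataset: 1000 samples with 10 features each.
data = TensorDataset(torch.randn(1000, 10), torch.randint(0, 2, (1000,)))

# Nothing requires a power of two -- any batch size that fits
# in GPU memory works.
loader = DataLoader(data, batch_size=100, shuffle=True)

for xb, yb in loader:
    print(xb.shape)  # torch.Size([100, 10])
    break
```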


Good to know, thanks!