Batch size param not working

Hi, I see that when I pass the batch size param as

 data = (src.transform((trn_tfms, _), size=512, bs=16, num_workers=1)
         .databunch().normalize(protein_stats))

the number of batches stays the same as with 64, and because of this my GPU memory is getting overrun…

With bs=64 there are 417 batches; when I change it to 16 or 24, the number of batches should increase in the progress bar, but it stays the same.
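As a quick sanity check of that expectation (plain Python, no fastai needed), the number of batches per epoch is just ceil(dataset size / batch size), so shrinking bs must increase the batch count. The dataset size below is a hypothetical value chosen to reproduce the 417 batches reported at bs=64:

 import math

 n_samples = 417 * 64  # hypothetical dataset size implied by 417 batches at bs=64
 for bs in (64, 16):
     print(bs, math.ceil(n_samples / bs))  # bs=16 should give 4x as many batches

If the progress bar still shows 417 batches after changing bs, the new batch size never reached the DataLoader.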

My bad… I was not passing the bs param to databunch, which is what actually controls the batch size.

I had the same problem. Looks like you figured it out, but here is where you can declare the batch size explicitly (from the planets notebook in v3 lesson 3):

 bs = 16

 data = (src.transform(tfms, size=256)
           .databunch().normalize())
 data.batch_size = bs