Batch size setting in the new fast.ai Framework?


(Stephen Lizcano) #1

I’m used to setting batch_size in Keras and the like, but in the new framework here I can’t seem to find where it is controlled. Is it dynamic? Or is it all just mini-batch = 1?

Thanks!


(Sanjeev Bhalla) #2

When you set up the dataset you can specify a parameter called bs (batch size). Look in dataset.py in the dl/fastai dir.


(yinterian) #3

Use the bs argument. See the example below, which sets a batch size of 32:

data = ImageClassifierData.from_paths(PATH, tfms=tfms_from_model(resnet34, sz), bs=32)
learn = ConvLearner.pretrained(resnet34, data, precompute=True)
learn.fit(0.01, 1)
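For intuition about what bs controls, here is a minimal plain-Python sketch (the helper name is hypothetical, not fastai internals): the data loader cuts the dataset into consecutive mini-batches of size bs, and each training step processes one batch.

```python
# Illustrative only: a sketch of what the bs argument controls.
# fastai's loader (via PyTorch) yields mini-batches of size bs;
# make_batches is a hypothetical stand-in, not fastai code.

def make_batches(items, bs):
    """Split a sequence into consecutive mini-batches of size bs
    (the last batch may be smaller)."""
    return [items[i:i + bs] for i in range(0, len(items), bs)]

samples = list(range(10))      # stand-in for 10 training examples
batches = make_batches(samples, bs=4)
print(batches)                 # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

A smaller bs means less GPU memory per step but more steps per epoch, which is why lowering it is the usual fix for out-of-memory errors.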

(Stephen Lizcano) #4

Oh duh, and he mentioned it in the lesson too. Just remembered. Thanks!


(Matthew Arthur) #5

Is it possible to set the batch size on tabular data? I have a very large dataset I would like to train on, and I’m running out of memory.


(Prajwal Prashanth) #6

Pass it as an argument in the databunch call, e.g. .databunch(bs=64).