bs and size parameter values in ImageDataBunch

Can anyone please explain why the size and bs values are changed for resnet50 in lesson 1? resnet34 had different values for these parameters. Are these hyperparameters?

data = ImageDataBunch.from_name_re(path_img, fnames, pat, ds_tfms=get_transforms(), size=320, num_workers=0, bs=bs//2)

Thanks

A bigger size lets the model see larger images with more detail. Resnet50 is a larger architecture, so it may not fit in GPU RAM; that's why the batch size is halved. Check the github repo.
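
To make "larger architecture" concrete, here is a quick sketch (not from the lesson) comparing the two torchvision models by parameter count. The parameter gap is modest, but resnet50's bottleneck blocks produce much larger intermediate activations, and those are what actually eat GPU RAM during training:

import torchvision.models as models

# Compare the two architectures by parameter count.
for arch in (models.resnet34, models.resnet50):
    n_params = sum(p.numel() for p in arch().parameters())
    print(f"{arch.__name__}: {n_params / 1e6:.1f}M parameters")
# resnet34: ~21.8M parameters, resnet50: ~25.6M parameters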

yes.


Larger image size (more information) -> more memory per image -> fewer images per batch -> proper GPU utilisation
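
To put rough numbers on that chain, here's a back-of-the-envelope calculation for just the input batch tensor (the activations inside the network are many times larger, but they scale the same way with bs and size):

def batch_mb(bs, size, channels=3, bytes_per_float=4):
    # float32 tensor of shape (bs, channels, size, size), in MiB
    return bs * channels * size * size * bytes_per_float / 2**20

print(batch_mb(64, 224))  # ~36.8 MiB
print(batch_mb(64, 320))  # ~75.0 MiB -> same bs needs about 2x the memory
print(batch_mb(32, 320))  # ~37.5 MiB -> halving bs restores the budget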

Image size kind of is, but I'm not sure about bs, as that's primarily used to manage GPU utilisation. Isn't it, @PoonamV?

No. You use a smaller batch size because too many images per batch may cause a GPU out-of-memory error.

ImageDataBunch has the following with resnet34. So this means bs is a parameter that gets affected by network depth?

data = ImageDataBunch.from_name_re(path_img, fnames, pat, ds_tfms=get_transforms(), size=224, num_workers=0, bs=bs)

Yes, bs is batch_size: how many images to pass through memory at a time. And yes, the largest bs you can use is affected by network depth as well as image size, i.e. the size parameter here…

Thanks… this explains why the lesson 1 notebook has a higher bs with resnet34 and a lower bs with resnet50 when he used transfer learning for classification. One question, and this may be a bit off topic: can't we use a grid search here to find the optimal bs?

There aren't many values to search; it's trial and error.
You generally decrease bs to 32 or 16, or maybe 8 if you still get an OOM error. The default is 64.
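
A minimal sketch of that trial-and-error loop, assuming a hypothetical make_learner(bs) helper that builds the DataBunch and Learner for a given bs (this is not from the notebook):

import torch

bs = 64  # the fastai default
while bs >= 8:
    try:
        learn = make_learner(bs)  # hypothetical helper: builds the DataBunch and Learner
        learn.fit_one_cycle(1)    # one quick cycle to check that it fits
        break                     # it fits; keep this bs
    except RuntimeError as e:
        if 'out of memory' not in str(e):  # CUDA OOM surfaces as a RuntimeError
            raise
        torch.cuda.empty_cache()  # release cached allocations from the failed run
        bs //= 2                  # retry with half the batch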

I need some help to see how a bigger size makes the model see bigger images with more detail. I tried ImageDataBunch.from_folder with different sizes and I keep getting the same values for train_ds.x[3].data.size(). Any pointers would be greatly appreciated!
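
If I remember fastai v1 correctly (an assumption worth verifying), ds_tfms, including the resize, are applied as items flow through the dataset and dataloader rather than being baked into the raw ItemList, so train_ds.x[3] still shows the original image. Inspecting an actual batch should reflect the size you passed:

x, y = data.one_batch()  # grabs one transformed training batch
print(x.shape)           # expected: torch.Size([bs, 3, size, size])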