I am new to fastai and I was trying to build a simple image classifier. When I create a DataBunch to load the data and then call show_batch to inspect the loaded images, I can see that the top and bottom portions of each image are cropped, even though I explicitly asked for no transformations on both the train and validation datasets.
When I create the ImageDataBunch from a folder (it has an ImageNet-style directory structure) I explicitly set the ds_tfms argument to a list of two Nones, yet I still get cropped images. A rough sketch of what I was doing is below.
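This is roughly what my code looked like (the path and the size value are placeholders, not my actual ones):

```python
from fastai.vision import *

# Folder with ImageNet-style layout: path/train/<class>/... and path/valid/<class>/...
path = 'data/my_images'  # placeholder path

data = ImageDataBunch.from_folder(
    path,
    ds_tfms=[None, None],  # no transforms for train or valid
    size=224,              # an int size -- this is what ends up cropping the images
)
data.show_batch(rows=3)
```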
Can anyone please explain why this is happening, and whether the images are actually cropped when they are loaded or only cropped when they are displayed?
Thanks for pointing me to the right cause of the problem. I looked here and then realized I need to pass a tuple of (size, size) in order to resize without cropping.
I was then able to get the DataBunch to load the data as I wanted.
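For reference, this is a sketch of the working version (again, the path and size values are placeholders). If I understand it correctly, passing size as a tuple makes fastai squish-resize instead of center-cropping:

```python
from fastai.vision import *

path = 'data/my_images'  # placeholder path

data = ImageDataBunch.from_folder(
    path,
    ds_tfms=[None, None],   # still no transforms
    size=(224, 224),        # tuple size -> resize by squishing, so nothing gets cut off
)
data.show_batch(rows=3)
```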