I am new to fastai and was trying to build a simple image classifier. When I create a DataBunch to load the data and then use show_batch to view the loaded images, I can see that the top and bottom portions of each image are cropped, even though I explicitly asked for no transformations on both the train and validation datasets.
Please refer to this screenshot.
As seen, when creating an ImageDataBunch from a folder (it has an ImageNet-style directory structure), I explicitly set the ds_tfms argument to a list of two Nones, yet I still get a cropped image.
Can anyone please explain why this is happening, and whether the images are really cropped when loaded, or only cropped when displayed?
The image is zoomed in because fast.ai applies data augmentation by default.
Instead of passing tfms=[None, None], try passing:
Also, to learn more about image augmentations and how they work, check out this Kaggle kernel: https://www.kaggle.com/init27/introduction-to-image-augmentation-using-fastai
Thanks for the reply. I tried doing as you suggested, but I am still getting the same result. Here’s a snapshot for clarification:
I think there’s some other issue, because I am not explicitly asking for the images to be augmented…
Let me know your thoughts.
You actually are, because you’re passing in a size (IM_SIZE). The default resizing method is doing the cropping to get the image to that size.
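For intuition, here is a minimal stdlib sketch of what a square center crop does geometrically (not fastai’s actual implementation): when you ask for a single int size, the image is first cropped to a square with side min(width, height), so a portrait image loses rows from the top and bottom.

```python
def center_crop_box(width, height):
    """Return the (left, top, right, bottom) box of a center crop
    to a square whose side is min(width, height)."""
    side = min(width, height)
    left = (width - side) // 2
    top = (height - side) // 2
    return (left, top, left + side, top + side)

# A 480x640 portrait image: the square crop keeps every column,
# but drops 80 rows from the top and 80 rows from the bottom.
print(center_crop_box(480, 640))  # -> (0, 80, 480, 560)
```

This is why the cropping shows up even with no augmentation transforms: it comes from the resize step itself, not from ds_tfms.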
Thanks for pointing me to the right cause of the problem. I looked here and then realized I need to pass a tuple of (size, size) in order to resize without cropping.
I was then able to get the DataBunch to load the data as I wished.
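To make the difference concrete, here is a small stdlib sketch of the two behaviours described in this thread (an illustration, not fastai’s code): an int size implies a square center crop, so pixels are discarded, while a (size, size) tuple scales both axes independently, so nothing is discarded but the aspect ratio can change.

```python
def resized_shape(width, height, size):
    """Sketch: int size -> square center crop (pixels discarded);
    (h, w) tuple -> squish resize (nothing discarded).
    Returns (output (w, h), number of pixels discarded)."""
    if isinstance(size, int):
        side = min(width, height)
        discarded = width * height - side * side
        return (size, size), discarded
    h, w = size
    return (w, h), 0

print(resized_shape(480, 640, 224))         # int size: crop, pixels lost
print(resized_shape(480, 640, (224, 224)))  # tuple size: squish, nothing lost
```

Which trade-off you want depends on the data: a crop keeps shapes undistorted but loses content at the edges, a squish keeps all content but can stretch the image.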
Grateful to you for helping:)
Great debugging and resourcefulness! Glad you got it solved.