The Difference between Presizing and RandomResizedCrop + Data Augmentations

Hi! I’m a bit confused about the difference between simply applying RandomResizedCrop together with aug_transforms on a DataLoaders, and presizing.
Also, I’m not certain when to use presizing and when I should avoid it. Is there any kind of general rule?

(I didn’t find any topic or paper discussing this, which is why I posted.)
Cheers!

Hey there,

So the distinction is a bit hard to grasp; make sure you read the fastbook section on it for an in-depth explanation.

But basically the idea is to have two steps:

  1. We resize the image to a defined, larger size (e.g. the full height)
  2. Then we do our data augmentations on it without losing information (no black borders)

Or as it is described in the book: "To implement this process in fastai you use Resize as an item transform with a large size, and RandomResizedCrop as a batch transform with a smaller size."
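
In code that looks roughly like this (a minimal sketch, assuming a hypothetical folder of images at data/images labelled by their parent folder):

```python
from fastai.vision.all import *

path = Path('data/images')  # hypothetical dataset location

dblock = DataBlock(
    blocks=(ImageBlock, CategoryBlock),
    get_items=get_image_files,
    get_y=parent_label,
    splitter=RandomSplitter(valid_pct=0.2, seed=42),
    # Step 1 (presizing): resize every item to a generous fixed size on the CPU
    item_tfms=Resize(460),
    # Step 2: random-crop down to the training size and augment on the GPU;
    # passing min_scale to aug_transforms adds the RandomResizedCrop behaviour
    batch_tfms=aug_transforms(size=224, min_scale=0.75),
)
dls = dblock.dataloaders(path, bs=64)
```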

As to when to use it: I would say always; losing information through augmentation is never a good idea.
Cheers!

Thank you, it makes much more sense now! :)

Another point is that if you want to do incremental size training, it is a good idea to presize the data offline (e.g. call resize_images on the full dataset) to increase training speed and maximize GPU utilization; resizing is expensive.
I generally build a set of folders (images_64, images_128, etc.) to iterate fast and do the incremental training.
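
Something along these lines (a rough sketch; the source path and the exact sizes are just placeholders, and it assumes labels come from the parent folders, which resize_images preserves when recurse=True):

```python
from fastai.vision.all import *

src = Path('data/images')  # hypothetical: original images, one subfolder per class

# Build one presized copy of the dataset per training size.
# The folder structure (and therefore the labels) is kept in each destination.
for sz in (64, 128, 224):
    resize_images(src, dest=src.parent/f'images_{sz}', max_size=sz, recurse=True)
```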


Presizing saves a lot of time by removing I/O overhead and avoiding resizing every image on every epoch, but in terms of model accuracy, online resizing (e.g. RandomResizedCrop) gives you a slight edge, since it acts as a weak form of augmentation by exposing different parts of a picture to the model during each epoch. But again, presizing is usually much faster, so if I have a lot of data, I tend to go with it rather than online resizing.

Data augmentation should be applied after resizing (whether done on the fly or not) and typically gives better generalization.
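
To make the trade-off concrete, the two setups might look like this (a sketch only; the folder names follow the hypothetical images_* layout from the earlier post):

```python
from fastai.vision.all import *

path = Path('data')  # hypothetical root containing images/ and images_224/

# Offline presizing: images were already shrunk on disk, so the per-batch
# resize is cheap and I/O is low; augmentation still runs afterwards.
dls_presized = ImageDataLoaders.from_folder(
    path/'images_224', valid_pct=0.2, seed=42,
    item_tfms=Resize(224),          # cheap: images are already about this size
    batch_tfms=aug_transforms())

# Online resizing: crop a different random patch of the full-size image each
# epoch, which acts as extra (weak) augmentation but costs more per batch.
dls_online = ImageDataLoaders.from_folder(
    path/'images', valid_pct=0.2, seed=42,
    item_tfms=RandomResizedCrop(224, min_scale=0.5),
    batch_tfms=aug_transforms())
```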
