They won’t be scaled down without data loss. As @marcmuc said, you’ll lose potentially useful info when scaling down so much.
The overload error is probably because those images are way too big. Instead of downsizing to (100,177), here's an approach customised to your problem:
```python
height = 1400
width = 1900
aspect_ratio = width / height  # 1900 / 1400 ≈ 1.36

def to_img_size(base_size):
    # aspect-preserving (height, width) tuple
    return (base_size, int(base_size * aspect_ratio))

to_img_size(1400)  # returns (1400, 1900)
to_img_size(400)   # returns (400, 542)
to_img_size(100)   # returns (100, 135)
```
Maintaining the `aspect_ratio` of your data ensures that it isn't squished or morphed in unexpected ways. As for the size of the images, you are really only limited by your GPU memory. Play around with different sizes to see what works (if the images are too big, you'll get an out-of-memory error).
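To get a feel for why size matters so much, here's a rough back-of-the-envelope sketch of how much memory just the input tensors take per batch. The batch size of 64, 3 channels, and float32 are assumptions for illustration; real usage is much higher once you add model weights and activations:

```python
# Rough memory estimate for one batch of input images only
# (assumes float32 RGB and batch_size=64; activations add a lot more)
def batch_mb(height, width, batch_size=64, channels=3, bytes_per_val=4):
    return height * width * channels * batch_size * bytes_per_val / 2**20

aspect_ratio = 1900 / 1400

for base in (100, 400, 1400):
    h, w = base, int(base * aspect_ratio)
    print(f"({h}, {w}): ~{batch_mb(h, w):.0f} MB per batch")
```

This prints roughly ~10 MB for (100, 135), ~159 MB for (400, 542), and ~1948 MB for the full (1400, 1900), which is why full-resolution batches blow past most GPUs while a mid-size like (400, 542) is a reasonable starting point.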