Out of memory for larger image size

Hello everyone,
I am working on the Chest X-ray pneumonia dataset. I used a pre-trained resnet34 model and got 78% accuracy. To improve on that I moved to the next model, resnet50, and got 83% accuracy. For both models I kept the image size at sz=224. Since this dataset consists of X-ray images (each image has variable dimensions), I thought increasing the size to sz=500 would be a better decision, as the model would get more information from a 500x500 image than from a 224x224 one. Am I right? After changing the image size to sz=500 I got an out of memory error. How do I resolve this error? Is a pre-trained model useful for this kind of dataset, or should I follow a completely different approach?

When you increase the size of the image, the memory it occupies increases. The GPU has a fixed amount of memory, so as you keep increasing the size you run into an out of memory error (assuming the other parameters are unchanged). To avoid the out of memory error, reduce your batch size accordingly, i.e. roughly:
image memory (sz * sz * channels) * batch size <= GPU memory
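A rough back-of-the-envelope check of that rule (this only counts the input tensors; activations and gradients take much more, so treat it as a lower bound):

```python
def input_batch_mb(sz, batch_size, channels=3, bytes_per_value=4):
    """Approximate memory (MB) used by one batch of float32 images of size sz x sz."""
    return sz * sz * channels * bytes_per_value * batch_size / 1024**2

print(input_batch_mb(224, 64))  # ~37 MB of raw input at sz=224
print(input_batch_mb(500, 64))  # ~183 MB at sz=500 -- roughly 5x larger
```

So going from 224 to 500 multiplies the per-batch memory by about (500/224)^2 ≈ 5, which is why the same batch size no longer fits.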

For your transfer learning question: transferring ImageNet weights here is similar to the satellite image competition discussed in lesson 2, so please refer to lesson 2 for a detailed explanation. One tip I recollect is to train on a smaller image size first and then gradually increase the size (a sketch of that idea is below).
P.S.: Do check whether any pre-trained weights for X-ray datasets are available.
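A minimal sketch of the "train small, then grow" idea in plain PyTorch/torchvision (the course notebooks do the equivalent with the fastai library, if I remember right). The dataset path is a placeholder and the training loops are elided; the point is that image size and batch size are the knobs you swap between phases:

```python
import torch
from torchvision import datasets, transforms, models

def make_loader(sz, bs):
    # Resize every image to sz x sz and batch them bs at a time.
    tfms = transforms.Compose([
        transforms.Resize((sz, sz)),
        transforms.ToTensor(),
    ])
    ds = datasets.ImageFolder('data/chest_xray/train', transform=tfms)  # placeholder path
    return torch.utils.data.DataLoader(ds, batch_size=bs, shuffle=True)

model = models.resnet34(pretrained=True)

# Phase 1: small images, larger batch still fits in GPU memory.
loader = make_loader(sz=224, bs=64)
# ... train a few epochs ...

# Phase 2: same model, bigger images, smaller batch to stay within memory.
loader = make_loader(sz=500, bs=16)
# ... continue training ...
```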

Thanks @gokkulnath. Yes, I resolved the error by decreasing the batch size to 28, and achieved an accuracy of 93.75% with the resnet34 model.

I will also try training on a smaller image size first and then on larger sizes. Thanks.