Image normalization consumes too much RAM

I am working on an image recognition task and converted the input images to a list of pixel arrays. When I try to normalize it, I get a MemoryError.

I tried normalizing like this:

X_M = np.array(X_m, dtype=np.float32)
X_M /= 255

The normalization itself works, but I always get a memory error when I try to split the data into training and test sets.
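One way to avoid the extra copy during the split is to shuffle the array in place and then take basic slices, which are views into the same memory rather than copies. A minimal sketch (the array name and shape here are assumptions, not from the original post):

```python
import numpy as np

# Hypothetical stand-in for the normalized image array.
X_M = np.random.rand(1000, 32, 32).astype(np.float32)

rng = np.random.default_rng(0)
rng.shuffle(X_M)             # shuffles along axis 0 in place, no extra array

split = int(0.8 * len(X_M))
X_train = X_M[:split]        # basic slices are views, not copies
X_test = X_M[split:]
```

Note that if you also have a label array, you need to apply the same permutation to it (e.g. via an index array from `rng.permutation`), which does allocate the index array but still avoids duplicating the image data.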

Converting the NumPy array back to a Python list won't help; a list of Python float objects uses far more memory than a contiguous NumPy array.

When you divide a uint8 image array by 255, NumPy promotes it to float64, which takes 8x as much memory (even float32 takes 4x). That’s why it is more memory-efficient to store images in raw uint8 format and normalize them one sample (or batch) at a time, e.g. inside a Dataset’s item loader.
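The idea above can be sketched with a plain NumPy generator (a stand-in for a Dataset; the array names and shapes are assumptions): the full data stays uint8, and only the current batch is cast to float32.

```python
import numpy as np

# Hypothetical dataset kept in its raw uint8 form (1 byte per pixel).
images = np.random.randint(0, 256, size=(1000, 64, 64, 3), dtype=np.uint8)

def batches(data, batch_size=32):
    """Yield normalized float32 batches one at a time, so the 4x
    float blow-up only ever applies to a single batch in memory."""
    for start in range(0, len(data), batch_size):
        batch = data[start:start + batch_size]
        yield batch.astype(np.float32) / 255.0

first = next(batches(images))
```

In a PyTorch `Dataset`, the same pattern would go in `__getitem__`: index into the uint8 array and cast just that one sample to float before returning it.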