Why does memory usage increase during a training epoch?

Hi Jeremy,
I am using fastai code to solve a problem with a lot of big photos. Unfortunately, memory usage keeps increasing during an epoch until RAM fills up completely and the training process freezes. This happens independently of batch size.
As an example, at the beginning of an epoch the in-use memory is about 4 GB, and after some iterations it reaches 32 GB. What is wrong? I think memory usage should stay at an approximately fixed level during training, since we use mini-batches.
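To show how I measure this, here is a minimal logging sketch (assuming `psutil` is installed; `log_rss` is just my own helper name) that prints the process's resident memory, called every few batches from the training loop:

```python
import os
import psutil  # assumption: psutil is installed (pip install psutil)

def log_rss(tag: str) -> None:
    # Print this process's resident set size (actual RAM in use) in GiB.
    rss_gib = psutil.Process(os.getpid()).memory_info().rss / 2**30
    print(f"{tag}: {rss_gib:.2f} GiB resident")

# Hypothetical usage inside a training loop:
# for i, batch in enumerate(train_dl):
#     ...train step...
#     if i % 100 == 0:
#         log_rss(f"iter {i}")
```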
Many thanks in advance.