Two optimizers on parameters

Hello everyone,
I am using two optimizers on different parameter sets in my code. How can I reduce the complications?

optimizer_left = torch.optim.Adam(self.parameters1, lr=self.learning_rate, betas=(0.5, 0.999))
optimizer_right = torch.optim.Adam(self.parameters2, lr=self.learning_rate, betas=(0.5, 0.999))
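If both sets of parameters should use the same hyperparameters, one way to simplify this is a single Adam optimizer with two parameter groups, so there is only one `step()`/`zero_grad()` call per iteration. A minimal sketch, where `model_left` and `model_right` are placeholder modules standing in for whatever is behind `self.parameters1` and `self.parameters2`:

```python
import torch

# Placeholder modules for the two parameter sets from the post.
model_left = torch.nn.Linear(10, 10)
model_right = torch.nn.Linear(10, 10)
learning_rate = 1e-3  # stand-in for self.learning_rate

# One optimizer, two parameter groups; shared lr/betas, but each group
# could also override them (e.g. {"params": ..., "lr": 1e-4}).
optimizer = torch.optim.Adam(
    [
        {"params": model_left.parameters()},
        {"params": model_right.parameters()},
    ],
    lr=learning_rate,
    betas=(0.5, 0.999),
)

# A single step now updates both parameter sets.
loss = (model_left(torch.randn(4, 10)).sum()
        + model_right(torch.randn(4, 10)).sum())
loss.backward()
optimizer.step()
optimizer.zero_grad()
```

Keeping two separate optimizers is still the right choice when the two sets must be stepped independently (e.g. alternating GAN updates); the parameter-group form only helps when they always step together.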

Because I frequently get this error:
RuntimeError: CUDA out of memory. Tried to allocate 2.00 MiB (GPU 0; 15.90 GiB total capacity; 15.05 GiB already allocated; 3.81 MiB free; 154.34 MiB cached)
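For what it's worth, the failed allocation here is only 2 MiB while 15.05 GiB are already allocated, so the model/activations are probably the main cost rather than the optimizers themselves, though each Adam instance does keep two state tensors (`exp_avg`, `exp_avg_sq`) per parameter, roughly tripling the memory its parameters occupy. A small sketch for inspecting GPU memory while debugging (guarded so it also runs without CUDA):

```python
import torch

# Inspect GPU memory usage while hunting an OOM.
if torch.cuda.is_available():
    torch.cuda.empty_cache()  # release cached blocks back to the driver
    print(f"allocated: {torch.cuda.memory_allocated() / 2**20:.1f} MiB")
    print(f"reserved:  {torch.cuda.memory_reserved() / 2**20:.1f} MiB")
else:
    print("CUDA not available; running on CPU")
```

Comparing `memory_allocated()` before and after creating the optimizers would show how much of the 15 GiB they actually account for.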

Thanks for the help.