As the name indicates, 1cycle does only one cycle. tot_epoch is just an argument that stops training before the end (I'm guessing it's for when you resume training at a certain epoch).
As far as I can see, you want to mix two incompatible learning rate schedules: reduce on plateau reacts to the validation metric at runtime, while 1cycle follows a schedule that is fixed in advance. Once you have decided on a given LR schedule, the GeneralScheduler API can implement it, but you can't have reduce on plateau and 1cycle together.
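To illustrate why the two can't be combined, here is a minimal, self-contained sketch of a 1cycle-style schedule (hypothetical function and parameter names, not fastai's actual OneCycleScheduler): the learning rate at every step is a pure function of the step index, computable before training even starts, so there is no room for a plateau-driven adjustment.

```python
import math

def one_cycle_lr(step, total_steps, lr_max, div=25.0, pct_start=0.3):
    """Map a training step to a learning rate: cosine warm-up to lr_max,
    then cosine annealing back down -- one cycle, fixed in advance."""
    warm = int(total_steps * pct_start)
    if step < warm:  # warm-up phase: lr_max/div -> lr_max
        pct = step / max(1, warm)
        start, end = lr_max / div, lr_max
    else:            # annealing phase: lr_max -> a much smaller final lr
        pct = (step - warm) / max(1, total_steps - warm)
        start, end = lr_max, lr_max / (div * 1e4)
    # cosine interpolation from start to end
    return end + (start - end) / 2 * (math.cos(math.pi * pct) + 1)

# The whole schedule is known before training starts:
schedule = [one_cycle_lr(s, 100, 1e-2) for s in range(100)]
```

Reduce on plateau, by contrast, can't be written this way: its next learning rate depends on metrics observed during training, which is exactly why the two schedules are incompatible.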