Bias in LightGBM regressor output that grows with the number of trees

hey!

i am training an lgbm regressor on targets that have mean zero, so i would expect the predictions to also have mean zero on average. however, i observe that the absolute value of the mean prediction grows with the number of trees. what could be causing this behavior, and how can i get rid of the bias? maybe the outputs of the individual trees could be adjusted at the tree level to remove it? however, i can't just subtract the mean of the ensemble output from the output, since the output itself is the mean of the ensemble, right? i've put a minimal sketch of how i'm measuring this below. thanks for any advice!
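
here's a minimal sketch of how i'm measuring the drift (synthetic mean-zero targets standing in for my real data, so the exact numbers and hyperparameters are just illustrative):

```python
import numpy as np
import lightgbm as lgb

rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 10))
y = 0.5 * X[:, 0] + rng.normal(size=10_000)
y -= y.mean()  # center the targets so they have mean zero by construction

model = lgb.LGBMRegressor(n_estimators=500, learning_rate=0.05)
model.fit(X, y)

# mean prediction over the training set using only the first k trees
for k in (1, 10, 50, 100, 250, 500):
    pred = model.predict(X, num_iteration=k)
    print(f"{k:>4} trees: mean prediction = {pred.mean():+.6f}")
```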