Does anybody know why my accuracy drops while the training loss keeps decreasing when the learning rate is decayed by fit_one_cycle?

I used fit_one_cycle to train my model. It doesn't seem to be overfitting, since the training loss is also decreasing. What could be the reason that reducing the learning rate hurts performance?
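For context, here is a minimal sketch of the learning-rate schedule fit_one_cycle applies: a cosine warm-up from a low starting rate to lr_max, then a cosine anneal down to a much smaller final rate. The function and parameter names below are illustrative, not fastai's actual internals (the defaults are assumptions based on fastai v2's behavior).

```python
import math

def one_cycle_lr(step, total_steps, lr_max=1e-2, pct_start=0.25,
                 div=25.0, div_final=1e4):
    """Cosine-annealed one-cycle schedule (illustrative sketch)."""
    warmup = int(total_steps * pct_start)
    if step < warmup:
        # warm-up phase: anneal up from lr_max/div to lr_max
        t = step / max(1, warmup)
        lo, hi = lr_max / div, lr_max
    else:
        # decay phase: anneal down from lr_max to lr_max/div_final
        t = (step - warmup) / max(1, total_steps - warmup)
        lo, hi = lr_max, lr_max / div_final
    # cosine interpolation between lo and hi
    return lo + (hi - lo) * (1 - math.cos(math.pi * t)) / 2
```

So partway through training the learning rate peaks and then shrinks rapidly; the question is why the accuracy falls during that decay phase even as the training loss continues to improve.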

[plots: learning rate schedule and training loss curves]