Tabular: How to bend Validation Loss

After many experiments manipulating embedding dropout vs. the learning rate, I have arrived at a pattern where the validation loss meets the training loss at a sharp angle at one point during training. What I am curious about is whether there is a way (i.e., which parameters) to further bend the validation curve toward the training curve (dropout in the model is actually 0.25):
[image: training vs. validation loss plot]

[image: training vs. validation loss plot]
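
For context, here is a minimal sketch of the kind of setup being tuned, using the fastai v1 tabular API. The column names, layer sizes, and cycle length are placeholder assumptions; only `emb_drop` and the 0.25 dropout values come from the discussion above:

```python
from fastai.tabular import *
import pandas as pd

df = pd.read_csv('data.csv')  # hypothetical dataset, not from the post

# Standard tabular preprocessing: categorify categorical columns, normalize continuous ones
procs = [Categorify, Normalize]
data = (TabularList.from_df(df, cat_names=['cat_col'], cont_names=['cont_col'], procs=procs)
        .split_by_rand_pct(0.2)
        .label_from_df(cols='target')
        .databunch())

# emb_drop sets dropout on the embedding layers; ps sets dropout per linear layer
# (0.25 matches the model dropout mentioned above)
learn = tabular_learner(data, layers=[200, 100], emb_drop=0.25, ps=[0.25, 0.25])
learn.fit_one_cycle(5, max_lr=1e-3)
```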

Never mind, I just had to make adjustments between major learning rate increments. In this case, I had to lower the learning rate to 3e-6.
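
In code, that adjustment would look something like the sketch below: rather than stepping the learning rate a full order of magnitude at a time, insert an intermediate cycle at a lower rate. The cycle lengths and the 1e-5 starting point are assumptions; only the 3e-6 value is from the post:

```python
# Continue training from the learner above, stepping the learning rate down
# gradually instead of jumping straight to the next major increment
learn.fit_one_cycle(3, max_lr=1e-5)
learn.fit_one_cycle(3, max_lr=3e-6)  # intermediate step that bends the validation curve
```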