Adam optimizer low accuracy

I have some doubts about the Adam optimizer. When I do fine-tuning as described in this tutorial:
http://pytorch.org/tutorials/beginner/transfer_learning_tutorial.html

and change the optimizer from SGD to Adam, the accuracy stays around 0.1876, while with SGD the accuracy is around 0.60 to 0.87.

Can you suggest how to set up the Adam optimizer?

It would help to see the values you are using for the parameters of both SGD and Adam. By default, Adam uses beta1 = 0.9, beta2 = 0.999, epsilon = 1e-08, and a learning rate of 0.001.
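
For reference, a minimal sketch of constructing Adam with those defaults spelled out explicitly (the model here is a placeholder; in the tutorial it would be the fine-tuned ResNet, often called `model_ft`):

```python
import torch.nn as nn
import torch.optim as optim

# Placeholder model standing in for the tutorial's fine-tuned network.
model_ft = nn.Linear(10, 2)

optimizer_ft = optim.Adam(
    model_ft.parameters(),
    lr=0.001,            # default learning rate
    betas=(0.9, 0.999),  # (beta1, beta2) defaults
    eps=1e-08,           # default epsilon
)
```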

Maybe use an LR finder to find a good learning-rate candidate?
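
A rough sketch of an LR range test, assuming the third-party `torch-lr-finder` package; the model, data, and sweep range below are dummy stand-ins for the tutorial's setup:

```python
import torch
import torch.nn as nn
import torch.optim as optim
from torch.utils.data import DataLoader, TensorDataset
from torch_lr_finder import LRFinder

# Dummy model and data standing in for the tutorial's model and loader.
model = nn.Linear(10, 2)
criterion = nn.CrossEntropyLoss()
optimizer = optim.Adam(model.parameters(), lr=1e-7)  # start the sweep from a tiny lr
train_loader = DataLoader(
    TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,))),
    batch_size=8,
)

lr_finder = LRFinder(model, optimizer, criterion)
lr_finder.range_test(train_loader, end_lr=1, num_iter=50)  # exponentially increase lr
lr_finder.plot()   # pick an lr just before the loss curve bottoms out
lr_finder.reset()  # restore model and optimizer to their initial state
```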