I can calculate gradient descent for any number of dimensions, but how does this apply to machine learning?

I have looked at gradient descent, and people say it is the most important part of machine learning, but I am a little confused about why. I understand back propagation and I have looked at linear regression. I think this is something other people are wondering about too.

Not sure if this is what you are looking for but…

In machine learning, gradient descent is an optimization strategy/algorithm.

  • The strategy attempts to minimize the error/loss calculated by the error/loss function at the end of the forward pass.
  • The back-propagation step adjusts the weights of the network to reduce the loss.
  • Adjustments are made in the direction that decreases the loss, determined by the derivative of the loss function with respect to each weight.
  • Small adjustments are made repeatedly until the error/loss reaches an acceptable level.
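The loop described above can be sketched in a few lines. This is a minimal illustration (not a real training framework): gradient descent on a mean-squared-error loss for a simple linear model `y = w * x + b`, where the data, learning rate, and iteration count are all made up for the example.

```python
# Toy data generated by y = 2x + 1; gradient descent should recover w≈2, b≈1.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

w, b = 0.0, 0.0   # initial weights
lr = 0.05         # learning rate: the size of each small adjustment

for _ in range(2000):
    # Forward pass: predictions, then mean-squared-error loss.
    preds = [w * x + b for x in xs]
    loss = sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(xs)

    # Derivative of the loss with respect to each weight.
    dw = 2 * sum((p - y) * x for p, y, x in zip(preds, ys, xs)) / len(xs)
    db = 2 * sum(p - y for p, y in zip(preds, ys)) / len(xs)

    # Step in the direction that decreases the loss (opposite the gradient).
    w -= lr * dw
    b -= lr * db

print(w, b)  # both end up close to the true values 2 and 1
```

In a neural network the same idea applies, except the derivatives `dw` and `db` for every layer's weights are computed by back propagation instead of by hand.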

This post has some interesting links related to gradient descent.
