What is the learning rate?

Hi everyone,

I'll just get to the point: I cannot understand what the learning rate is. Could you please share any articles, links, or anything that explains the concept in more detail?
I am on the second lesson and now I want to understand exactly what the learning rate and the loss are.

I mean, OK… I understood that the LR is how fast the system is “learning”. But what are the units of measurement for the LR (time?) and the loss? And why does the loss increase when the LR increases?
I hope this isn't too stupid a question…

Thanks in advance!

Hi David, I found reviewing the Lesson 2 notes very useful. Start from the bottom, at the section titled "What is the need for learning rate [1:41:32]".

Loss is basically how far away our prediction is from the actual value. Take the example in Lesson 2 where Jeremy is trying to fit a line to a bunch of dots: if you pass in x=2 and your model predicts that y should be 3 when the actual y value (where the dot is) is 4, then your loss can be calculated from those two values with the MSE equation. In the above post, there is a section titled "Loss function [1:28:35]" for further reference.
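To make that concrete, here's a minimal sketch of the MSE calculation for that single-point example (prediction 3 vs. actual 4). The tensors and values are just illustrative, not anything from the fastai library itself:

```python
import torch

prediction = torch.tensor([3.0])  # what the model predicts for x = 2
actual = torch.tensor([4.0])      # where the dot actually is

# MSE: mean of the squared differences between predictions and actuals
mse = ((prediction - actual) ** 2).mean()
print(mse)  # tensor(1.) -> (3 - 4)^2 = 1
```

With more dots, you'd just pass in longer tensors and the `.mean()` averages the squared errors over all of them.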

Loss also varies while you're training because the LR varies. Fastai uses something called the 'Cyclical Learning Rate'. This means that your learning rate starts off small, increases to a maximum LR (which you pass to fastai), and then decreases again. Plotted over time, it looks tent shaped: /\

This cyclical LR affects the losses: they start off large and then decrease. The intuition is that you want to explore the function space better in the beginning, which is why the LR increases at first. Then you decrease the LR so that you can home in on the flatter areas.
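If it helps to visualize the tent shape, here's a toy sketch of a triangular schedule. The function name and the values are hypothetical; fastai builds its schedule internally from the max LR you pass, so this is only to show the shape, not fastai's actual implementation:

```python
def triangular_lr(iteration, n_iters=100, max_lr=0.01, min_lr=0.001):
    """Hypothetical triangular LR: ramp up to max_lr, then back down (the /\ shape)."""
    half = n_iters / 2
    if iteration <= half:
        # first half: linear ramp from min_lr up to max_lr
        return min_lr + (max_lr - min_lr) * iteration / half
    # second half: linear ramp from max_lr back down to min_lr
    return max_lr - (max_lr - min_lr) * (iteration - half) / half

lrs = [triangular_lr(i) for i in range(101)]
print(max(lrs))  # peaks at max_lr (0.01) halfway through training
```

So the LR has no unit of time; it's just a small multiplier on the gradient at each update step, and the schedule decides how big that multiplier is as training progresses.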


Thank you very much @newvick! I'm taking a look right now at the notes and the video you mentioned, and I'm going through the notebook this evening.

In any case, I think I've got the point thanks to your explanations. I sometimes need to "visualize" concepts to fully understand them.

Thanks again!