Lesson 6: Need help understanding the SGD code

I was going through the lesson 6 code but I am unable to understand the SGD code for classification. I am confused about how y_hat is calculated.
Can anyone explain what the following three lines of code do?

    p = (-lin(a,b,x)).exp()
    y_hat = 1/(1+p)
    loss = nll(y_hat,y)

The complete code is shown in the picture.

We’re doing logistic regression here: that is, we’re calculating

    y_hat = sigmoid(lin(a, b, x))

You can write the sigmoid function as

    def sigmoid(x):
        return 1/(1 + (-x).exp())

so that’s what’s happening in y_hat = 1/(1+p).
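
To see the equivalence concretely, here is the same algebra in plain Python using `math.exp` instead of the tensor `.exp()` method (just a sketch of the identity, not the lesson's tensor code):

```python
import math

def sigmoid(x):
    # same algebra as the lesson code: p = e^(-x), y_hat = 1/(1+p)
    p = math.exp(-x)
    return 1 / (1 + p)

# sigmoid squashes any real number into (0, 1), so y_hat can be read
# as the predicted probability of the positive class
assert 0 < sigmoid(-5) < 0.01
assert abs(sigmoid(0) - 0.5) < 1e-9
assert 0.99 < sigmoid(5) < 1
```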

The loss is then the “negative log likelihood”, aka the cross entropy.
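
For intuition, a minimal sketch of what a binary negative log likelihood function could look like (this `nll` is a hypothetical stand-in for the lesson's version, which operates on tensors):

```python
import math

def nll(y_hat, y):
    # hypothetical sketch: for each example, take the probability the
    # model assigned to the true class, then average the -log of those
    probs = [p if t == 1 else 1 - p for p, t in zip(y_hat, y)]
    return -sum(math.log(p) for p in probs) / len(probs)

# a confident, correct prediction gives a small loss;
# a confident, wrong prediction gives a large loss
print(nll([0.9], [1]))  # ≈ 0.105
print(nll([0.9], [0]))  # ≈ 2.303
```

The `-log` means the loss blows up as the model assigns vanishing probability to the correct class, which is exactly the behaviour you want when training with SGD.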


Thank you :slight_smile: