`apply_step` function in 04_mnist_basics notebook

Hello. In the End-to-End SGD Example in notebook 04, the following function is defined:

def apply_step(params, prn=True):
    preds = f(time, params)
    loss = mse(preds, speed)
    loss.backward()
    params.data -= lr * params.grad.data
    params.grad = None
    if prn: print(loss.item())
    return preds

This function takes the initial parameters as input, does one step of optimization, prints the new loss value, and returns new parameters. There is also a global variable params defined earlier:

params = torch.randn(3).requires_grad_()
params
Out: tensor([-0.7409,  0.3618,  1.9199], requires_grad=True)
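
(For context, apply_step also uses f, mse, lr, time, and speed from earlier cells. From memory, the notebook defines them roughly like this, so the exact constants may differ:)

import torch

time = torch.arange(0, 20).float()
speed = torch.randn(20)*3 + 0.75*(time - 9.5)**2 + 1  # synthetic data

def f(t, params):
    a, b, c = params  # quadratic model: a*t^2 + b*t + c
    return a*(t**2) + (b*t) + c

def mse(preds, targets):
    return ((preds - targets)**2).mean()

lr = 1e-5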

When I call this function 10 times like this:

for i in range(10):
    apply_step(params)

and then check the value of the global variable params, I see that it is now different from before!

In: params
Out: tensor([0.1022, 0.4145, 1.9254], requires_grad=True)

I am a little confused: why does this function modify the value of the global variable params?

Also, when I try to simulate this situation with something less complex:

x = 5

def f(x):
    x += 1
    print(x)

for _ in range(100):
    f(x)

I get “6” printed one hundred times in my console. This means that x is not updated on each iteration of the loop… That seems entirely different from the situation described above.

First, this function doesn’t return new parameters; it returns the model’s predictions. Second, in Python, when you pass a mutable object to a function, such as a list, a dictionary, or (in your case) a PyTorch tensor, the function receives a reference to that same object and can change its contents in place. That is exactly what the line params.data -= lr * params.grad.data does: it updates the tensor’s data in place, and since the global name params refers to the same tensor, you see the change outside the function too.
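
Here is a minimal sketch of the same effect, stripped of the notebook’s details (the names are mine, not from the notebook):

import torch

def step(t):
    t.data -= 1.0   # in-place update: mutates the tensor object itself

params = torch.ones(3)
step(params)
print(params)       # tensor([0., 0., 0.]) -- the caller's tensor changed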

In your simpler example, you pass an integer to the function. In Python, objects like booleans, integers, floats, strings, and tuples are immutable, so a function can’t change them. The statement x += 1 doesn’t modify the integer 5; it rebinds the local name x to a new integer object, and the global x still refers to 5, which is why you see “6” printed one hundred times.
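
You can watch the rebinding happen with id() (just an illustration):

x = 5

def f(x):
    print(id(x))    # same object the global x points to
    x += 1          # rebinds the local name x to a new int object
    print(id(x))    # a different object; the global x is untouched

f(x)
print(x)            # still 5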