Confusion about params in lesson 4

Hi, friends.
I am studying lesson 4 and I'm confused by how params gets updated in the train_epoch function.
As in the picture, we defined params = weights, bias,
but when training epochs in a for loop, we never redefine params. It confuses me why params can update itself.
If I create other variables like in the second picture, they don't update automatically.
Please help explain, thank you!


def train_epoch(model, lr, params):
    for xb,yb in dl:
        calc_grad(xb, yb, model)
        for p in params:
            p.data -= p.grad*lr
            p.grad.zero_()

train_epoch function recalculates params using learning rate and gradient

This is due to the way Python passes arguments to functions. In the first case, params is updated because it holds mutable objects: params is just a tuple of references to the very same tensors as weights and bias, and the in-place operations p.data -= p.grad*lr and p.grad.zero_() modify those tensor objects themselves. When you pass a mutable object to a Python function and mutate it inside the function, the changes persist after the function returns.
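
Here is a minimal, self-contained sketch of that behaviour in plain Python (no fastai or PyTorch needed; the function and variable names here are just for illustration). A list is mutable, so an in-place change made inside the function is visible to the caller afterwards:

def scale_in_place(params, factor):
    # Mutate the list object itself; nothing needs to be returned.
    for i in range(len(params)):
        params[i] = params[i] * factor

weights = [1.0, 2.0, 3.0]
scale_in_place(weights, 10)
print(weights)   # [10.0, 20.0, 30.0] -- the caller sees the update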

However, when you create a tuple like zzz = z, zz, the tuple stores references to the same objects that z and zz point to, not copies of them. Because z and zz refer to integers, which are immutable, those objects can never be changed in place; the only thing you can do is rebind the name z to a brand-new object, and zzz[0] keeps pointing at the old one, which is why zzz never appears to update. You can verify this with the id function, which returns an object's identity (its memory address in CPython), as shown in the snippet below: right after creating the tuple, id(z) == id(zzz[0]) and id(zz) == id(zzz[1]), and only after rebinding z do the ids diverge.
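
A short sketch of that (the concrete values are assumed, since I can't see your second picture):

z = 5
zz = 7
zzz = z, zz

# The tuple holds references to the very same int objects:
print(id(z) == id(zzz[0]), id(zz) == id(zzz[1]))   # True True

# Rebinding z points the *name* z at a new object;
# the tuple still references the old object, so zzz doesn't change.
z = 100
print(z, zzz)                  # 100 (5, 7)
print(id(z) == id(zzz[0]))     # False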

However, the situation changes when you're dealing with mutable objects. I encourage you to try the following in Jupyter or the Python REPL:

a = 2
b = 3
d = { "animal" : "dog" }
id(a), id(b), id(d)

t = (a, b, d)

# Notice the address of `t[2]`. Compare it to the address of `d`
print(id( t[0] ), id( t[1] ), id( t[2] ))

d['age'] = 10
print(t)

d = { "colour" : "red" }
# Is `d` pointing to the same location or a new one now?
print(id(d))
# What about `t[2]`?
print(t, id(t[2]))
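
To tie this back to train_epoch: p.data -= p.grad*lr and p.grad.zero_() are in-place modifications of mutable tensors, and params simply holds references to the same tensor objects as weights and bias, so those changes are visible everywhere. Here's a rough sketch of the same idea with bare PyTorch (assuming torch is installed; the shapes, loss, and variable names are made up for illustration and aren't the lesson's exact code):

import torch

weights = torch.randn(3, requires_grad=True)
bias = torch.zeros(1, requires_grad=True)
params = weights, bias            # the tuple references the same tensor objects

loss = (weights.sum() + bias.sum()) ** 2
loss.backward()                   # fills weights.grad and bias.grad

lr = 0.1
for p in params:
    p.data -= p.grad * lr         # in-place update of the tensor's storage
    p.grad.zero_()                # in-place reset of the gradient

# weights itself shows the update, because params[0] *is* weights
print(weights)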

There's lots more detail online if you search for "Python pass by object reference".

Thank you very much for the kind explanation. It is very helpful!