In lesson 4, we create a `BasicOptim` that takes in the model's params and assigns them to `self.params`.
In the `step` and `zero_grad` functions, the optimizer updates `self.params`.
But the model's params are never set back from `opt.params`, so how does the model know that the optimizer has updated them? Am I missing something here?
They're both references to the same underlying tensors: the parameters are referenced in multiple places, not copied.
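A minimal sketch of that sharing, using a plain Python list as a stand-in for PyTorch parameter tensors (the class names here are illustrative, not the lesson's exact code):

```python
class TinyModel:
    def __init__(self):
        self.params = [1.0, 2.0]  # stand-in for weight tensors

class BasicOptim:
    def __init__(self, params, lr):
        self.params = params  # no copy: this is the SAME list the model holds
        self.lr = lr

    def step(self):
        # Update each parameter in the shared list (pretend gradient of 1.0).
        # The model sees the change because model.params IS opt.params.
        for i, p in enumerate(self.params):
            self.params[i] = p - self.lr * 1.0

m = TinyModel()
opt = BasicOptim(m.params, lr=0.5)
opt.step()
print(m.params is opt.params)  # True: one list, two names
print(m.params)                # [0.5, 1.5] — the model's params changed too
```

In real PyTorch code the list elements are tensor objects and `step` mutates them in place (e.g. `p.data -= lr * p.grad`), but the principle is the same: nothing needs to be "set back" because there is only one underlying object.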
Thank you, that's what I thought was happening, but when I tried to replicate it with int variables and a similarly structured class, the values seemed to be copied rather than shared. After some digging I realised that assignment in Python never copies: both names refer to the same object. The difference is that ints are immutable, so something like `b = b + 1` rebinds the name to a brand-new int object, while mutable objects such as lists and tensors can be updated in place, and that change is visible through every name that references them. I had a hard time internalising this, but I think I get it now.