Weights[0] in-place operations

Hello everyone, I’m stuck at lesson 4 with an error on the following line:

weights[0] *= 1.0001

I get the following error:

RuntimeError: a view of a leaf Variable that requires grad is being used in an in-place operation.

From the PyTorch forum:

PyTorch doesn’t allow in-place operations on leaf variables that have requires_grad=True (such as parameters of your model) because the developers could not decide how such an operation should behave.
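For reference, here is a minimal sketch of what I’m running (the tensor shape is just a toy example, not the book’s exact code):

    import torch

    # a leaf tensor that requires grad, like a model parameter
    weights = torch.randn(28*28, 1, requires_grad=True)

    # weights[0] is a view of that leaf tensor, so this in-place
    # multiply raises the RuntimeError above
    weights[0] *= 1.0001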

So, how are you people doing that?
Thank you

Hi Midst

Which page in the book? weights is a matrix; try weights.shape, because weights[0] seems wrong.

Regards Conwyn


Hi, this should work:

     with torch.no_grad():
         weights[0] *= 1.0001
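
A slightly fuller sketch for context (the tensor here is just a stand-in for the lesson’s weights):

    import torch

    weights = torch.randn(28*28, 1, requires_grad=True)  # leaf tensor, like a parameter

    with torch.no_grad():         # autograd tracking is paused inside this block
        weights[0] *= 1.0001      # so the in-place update on the view is allowed

    print(weights.requires_grad)  # still True; later operations are tracked as usual

torch.no_grad() only suspends gradient tracking for the statements inside the block; the tensor itself keeps requires_grad=True afterwards.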

It is in the “The MNIST Loss Function” section. Thank you, Conwyn.

Hi Mugnaio, I tried it and it works, thanks. Any idea why this no_grad isn’t mentioned in the lesson?


Hi Midst

You can report it here

Regards Conwyn

Hi,
I tried:

weights[0].data *= 1.0001

and it also works.
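
As far as I understand the PyTorch docs, .data sidesteps autograd entirely (the result shares storage with weights but has requires_grad=False), so gradients can come out silently wrong if the original value is needed in a backward pass; torch.no_grad() is generally the safer choice. A small sketch of the .data variant (toy tensor, not the lesson’s code):

    import torch

    weights = torch.randn(28*28, 1, requires_grad=True)

    # .data returns a tensor sharing the same storage, but with requires_grad=False,
    # so autograd does not track (or complain about) this in-place update
    weights[0].data *= 1.0001
    print(weights[0].data.requires_grad)  # False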
BR,
Raviv


I met the same problem and solved it thanks to this post.
But can anybody help to explain why we cannot do “weights[0] *= 1.0001” directly?
Thank you