MNIST with SGD lesson Jupyter Notebook error

Hi, I’m trying to learn SGD by typing the code into Jupyter with Anaconda. I’m at the SGD lesson where we fit the 20-point speed plot with gradient descent. When I use a loop to repeat the gradient descent step with this code,

def apply_step(params, prn=True):
    preds = f(time, params)
    loss = mse(preds, speed)
    loss.backward()
    params.data -= lr * params.grad.data
    params.grad = None
    if prn: print(loss.item())
    return preds

for i in range(10):
    apply_step(params)

I’m getting this error:
RuntimeError Traceback (most recent call last)
1 for i in range(10):
----> 2 apply_step(params)

<ipython-input-223-02369e3dad61> in apply_step(params, prn)
      2     preds = f(time, params)
      3     loss = mse(preds, speed)
----> 4     loss.backward()
      5     params.data -= lr * params.grad.data
      6     params.grad = None

~\anaconda3\lib\site-packages\torch\ in backward(self, gradient, retain_graph, create_graph)
    183                 products. Defaults to ``False``.
    184         """
--> 185         torch.autograd.backward(self, gradient, retain_graph, create_graph)
    187     def register_hook(self, hook):

~\anaconda3\lib\site-packages\torch\autograd\ in backward(tensors, grad_tensors, retain_graph, create_graph, grad_variables)
    123         retain_graph = create_graph
--> 125     Variable._execution_engine.run_backward(
    126         tensors, grad_tensors, retain_graph, create_graph,
    127         allow_unreachable=True)  # allow_unreachable flag

RuntimeError: Trying to backward through the graph a second time, but the saved intermediate results have already been freed. Specify retain_graph=True when calling backward the first time.

I looked into different solutions by googling, and someone suggested using the .detach_() function, but I don’t know how to implement that in this code. Could someone help me out with this?
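For reference, here is a minimal snippet (unrelated to the lesson data) that raises the same RuntimeError, just to show it comes from calling backward twice on one graph:

```python
import torch

a = torch.tensor([1.0, 2.0], requires_grad=True)
loss = (a * a).sum()

loss.backward()          # first backward: fine, but the graph's buffers are freed afterwards
try:
    loss.backward()      # second backward on the same (already freed) graph
except RuntimeError as e:
    print("RuntimeError:", e)
```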

Much appreciated!

Hi, I found this discussion that might help you:

The function in itself looks good, so I would suspect the mistake comes earlier. Are you trying to save intermediate results? If so, the second call to backward could come from there.
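In case it helps, here is a self-contained sketch of the lesson’s loop with stand-in data. The quadratic f, the mse helper, and the synthetic speed data are my assumptions based on the book’s chapter 4 example, not your exact notebook. The key point is that apply_step rebuilds the graph with a fresh forward pass on every call, so backward is never run twice on the same graph:

```python
import torch

torch.manual_seed(42)

# Stand-in data (assumed: 20 time steps, noisy quadratic speed curve)
time = torch.arange(0, 20).float()
speed = torch.randn(20) * 3 + 0.75 * (time - 9.5) ** 2 + 1

def f(t, params):
    # Assumed model from the lesson: a quadratic in t
    a, b, c = params
    return a * (t ** 2) + b * t + c

def mse(preds, targets):
    return ((preds - targets) ** 2).mean()

lr = 1e-5
params = torch.randn(3).requires_grad_()

def apply_step(params, prn=True):
    preds = f(time, params)               # fresh forward pass builds a new graph
    loss = mse(preds, speed)
    loss.backward()                       # one backward per graph: no error
    params.data -= lr * params.grad.data  # in-place update, outside autograd
    params.grad = None                    # reset so gradients don't accumulate
    if prn: print(loss.item())
    return preds

for i in range(10):
    apply_step(params)
```

If you instead compute preds or loss once outside the loop and call backward on it repeatedly, you get exactly the error you posted.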