When running the cifar10-darknet notebook, I was getting this error:
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation
I had previously worked from the video and did not hit this issue. The error turns out to come from the line that is now commented out: in the video there was a discussion about saving memory by doing the addition in place, and x.add_ was introduced at that point. Using the original line (the one above the commented-out line) works.
class ResLayer(nn.Module):
    def __init__(self, ni):
        super().__init__()
        self.conv1 = conv_layer(ni, ni//2, ks=1)   # conv_layer is defined earlier in the notebook
        self.conv2 = conv_layer(ni//2, ni, ks=3)

    def forward(self, x):
        return x.add(self.conv2(self.conv1(x)))        # out-of-place add: works
        # return x.add_(self.conv2(self.conv1(x)))     # in-place add: triggers the RuntimeError
Update: as Nikhil suggests below, simply dropping the underscore after add makes it an out-of-place operation, which directly addresses the error message.
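For anyone who wants to see the failure outside the notebook, here is a minimal sketch in plain PyTorch (not the notebook code; an nn.Linear stands in for the conv layers) of why the in-place add breaks backprop: the layer saves its input x to compute its weight gradient, and x.add_ overwrites that saved tensor.

import torch
import torch.nn as nn

lin = nn.Linear(4, 4)
inp = torch.randn(1, 4, requires_grad=True)

# x plays the role of the activation entering ResLayer (it is not a leaf tensor)
x = lin(inp)
out = lin(x)            # this call saves x to compute its weight gradient

try:
    y = x.add_(out)     # in-place add modifies the saved x
    y.sum().backward()
except RuntimeError as e:
    print(e)            # "... has been modified by an inplace operation"

# Out-of-place version: x is left untouched, so backward succeeds
x = lin(inp)
y = x.add(lin(x))
y.sum().backward()
print("out-of-place add backpropagates fine")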