Error when passing a tensor to conv1d

Hello! I want to adapt the code from the GAN lecture to 1D data instead of 2D images. Each input is a 1D tensor with 500 entries that I want to pass to a conv1d. After putting the inputs into a mini-batch of size 10, x.size() gives torch.Size([10, 500]). However, conv1d raises an error requiring a 3D tensor, so I added one more dimension for the number of input channels (which is 1) with x = x[:,None,:]; now x.size() is torch.Size([10, 1, 500]), which is what I want. But when I pass it to the conv1d again I get this error: TypeError: argument 0 is not a Variable. I updated my PyTorch, conda and fastai (I did “git pull”), but I still get the same error. What should I do? Thank you!


(Apologies if this isn’t right; I’m a relative novice.)

If I recall correctly from the lectures, making something a Variable is what lets PyTorch keep track of the operations applied to it so that it can calculate the gradients for us. Have you tried wrapping your tensor in a Variable()?
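Something like this sketch, assuming an older (pre-0.4) PyTorch where layers expected Variable inputs. The Conv1d parameters here (8 output channels, kernel size 5) are placeholders I picked to match your shapes, not values from the lecture:

```python
import torch
import torch.nn as nn
from torch.autograd import Variable

x = torch.randn(10, 500)      # mini-batch of ten 1D signals
x = x[:, None, :]             # add the channel dimension -> (10, 1, 500)
x = Variable(x)               # wrap so autograd can track operations on it

# placeholder layer: 1 input channel, 8 output channels, kernel size 5
conv = nn.Conv1d(in_channels=1, out_channels=8, kernel_size=5)
out = conv(x)
print(out.size())             # torch.Size([10, 8, 496])
```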

Try to debug it for your use case with the example from the PyTorch documentation:

m = nn.Conv1d(16, 33, 3, stride=2)  # 16 input channels, 33 output channels, kernel size 3
input = torch.randn(20, 16, 50)     # (N, C_in, L) = (batch, channels, length)
output = m(input)

Input size (N, C_in, L): N is the batch size, C_in is the number of input channels, and L is the length of the signal sequence.

Your error makes me wonder whether you properly instantiated nn.Conv1d.

(As far as I know, as of PyTorch 1.0 the Variable() wrapper is no longer needed.)


@MicPie - looks like you’re right! Found this:

On setting .requires_grad = True, tensors start forming a backward graph that tracks every operation applied to them in order to calculate the gradients, using something called a dynamic computation graph (DCG) (explained further in the post).

In earlier versions of PyTorch, the torch.autograd.Variable class was used to create tensors that support gradient calculations and operation tracking, but as of PyTorch v0.4.0 the Variable class has been deprecated. torch.Tensor and torch.autograd.Variable are now the same class. More precisely, torch.Tensor is capable of tracking history and behaves like the old Variable.
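A minimal sketch of the modern (v0.4+) behaviour, where a plain Tensor tracks gradients itself and no Variable wrapping is needed (the values here are just a toy example):

```python
import torch

# A plain Tensor with requires_grad=True now does what Variable used to do.
x = torch.ones(3, requires_grad=True)
y = (x * 2).sum()   # every operation on x is recorded in the backward graph
y.backward()        # autograd walks the graph and populates x.grad
print(x.grad)       # tensor([2., 2., 2.])
```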

