TypeError: params argument given to the optimizer should be an iterable of Tensors or dicts, but got torch.FloatTensor

I’m trying to run the official RAdam implementation in a similar style to the Excel sheet that Jeremy put together. I’ve been able to get it working properly up to Adam.

This is what I’m running: https://gist.github.com/kevinbird15/4575936d0f3610c6e4871850d67ecdc5

and this is the error that is being returned:

-------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-35-1c025307cc35> in <module>
      1 theta = torch.tensor([1.0,1.0], requires_grad=True)
----> 2 optim = RAdam(theta, 1, betas=(0.9, 0.95), eps=1e-5, weight_decay=0)
      3 #optim = RAdam(model.parameters(), lr=args.lr, betas=(args.beta1, args.beta2), weight_decay=args.weight_decay)
      4 for x in x_list:
      5     if theta.grad is not None: theta.grad.zero_()

<ipython-input-34-d2883a2ccc2c> in __init__(self, params, lr, betas, eps, weight_decay, degenerated_to_sgd)
     21                     param['buffer'] = [[None, None, None] for _ in range(10)]
     22         defaults = dict(lr=lr, betas=betas, eps=eps, weight_decay=weight_decay, buffer=[[None, None, None] for _ in range(10)])
---> 23         super(RAdam, self).__init__(params, defaults)
     24 
     25     def __setstate__(self, state):

~/anaconda3/envs/fastai/lib/python3.8/site-packages/torch/optim/optimizer.py in __init__(self, params, defaults)
     35 
     36         if isinstance(params, torch.Tensor):
---> 37             raise TypeError("params argument given to the optimizer should be "
     38                             "an iterable of Tensors or dicts, but got " +
     39                             torch.typename(params))

TypeError: params argument given to the optimizer should be an iterable of Tensors or dicts, but got torch.FloatTensor

So it doesn’t look like the optimizer accepts a single tensor as params, but I’m not sure how to modify my small example so that theta acts as the weights instead of passing in a model. Any help would be much appreciated.

I found the answer right after posting this: I needed to wrap theta in brackets (i.e., pass a list of Tensors), so my line looks like this:

optim = RAdam([theta], 1, betas=(0.9, 0.95), eps=1e-5, weight_decay=0)
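
For anyone hitting the same error, here’s a minimal runnable sketch of the fixed setup. I’m using torch.optim.RAdam (added in PyTorch 1.10) as a stand-in for the gist’s RAdam implementation, and the x_list values and the loss are made-up placeholders just to produce gradients:

import torch

theta = torch.tensor([1.0, 1.0], requires_grad=True)

# The fix: pass a list of Tensors, not a bare Tensor.
# torch.optim.RAdam stands in here for the gist's RAdam class.
optim = torch.optim.RAdam([theta], lr=1, betas=(0.9, 0.95), eps=1e-5, weight_decay=0)

# Placeholder inputs and loss, just to drive a few update steps.
x_list = [torch.tensor([0.1, 2.0]), torch.tensor([-1.0, 0.5])]
for x in x_list:
    optim.zero_grad()                 # replaces the manual theta.grad.zero_()
    loss = ((theta * x) ** 2).sum()   # placeholder loss producing gradients
    loss.backward()
    optim.step()

print(theta)

The “or dicts” part of the error message refers to parameter groups: you can also pass a list of dicts like RAdam([{'params': [theta], 'lr': 0.1}]) to give each group its own options, which is what optimizers do under the hood when you pass model.parameters().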