If self.u is an Embedding, it should be a matrix of weights, not a function. I tried the following code:
import torch.nn as nn

def get_emb(ni, nf):
    # create an embedding layer and initialize its weights uniformly
    e = nn.Embedding(ni, nf)
    e.weight.data.uniform_(-0.01, 0.01)
    return e
items = cf.items
users = cf.users
(u, m) = [get_emb(*o) for o in [(len(users), n_factors), (len(items), n_factors)]]
type(u)
returns torch.nn.modules.sparse.Embedding.
Then calling
u(users)
throws an error:
--------------------------------
AttributeError                Traceback (most recent call last)
<ipython-input-29-b376461b23c3> in <module>()
----> 1 u(users)
~/anaconda2/envs/fastai/lib/python3.6/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
323 for hook in self._forward_pre_hooks.values():
324 hook(self, input)
--> 325 result = self.forward(*input, **kwargs)
326 for hook in self._forward_hooks.values():
327 hook_result = hook(self, input, result)
~/anaconda2/envs/fastai/lib/python3.6/site-packages/torch/nn/modules/sparse.py in forward(self, input)
101 input, self.weight,
102 padding_idx, self.max_norm, self.norm_type,
--> 103 self.scale_grad_by_freq, self.sparse
104 )
105
~/anaconda2/envs/fastai/lib/python3.6/site-packages/torch/nn/_functions/thnn/sparse.py in forward(cls, ctx, indices, weight, padding_idx, max_norm, norm_type, scale_grad_by_freq, sparse)
37 ctx.sparse = sparse
38
---> 39 assert indices.dim() <= 2
40 assert not ctx.needs_input_grad[0], "Embedding doesn't " \
41 "compute the gradient w.r.t. the indices"
AttributeError: 'numpy.ndarray' object has no attribute 'dim'
An Embedding isn’t actually a matrix of weights; it’s a layer (an nn.Module), and layers are callable. Calling self.u(users) returns a tensor containing the embedding for each of the users (that is, it has the same shape as users, with the size of each embedding tacked onto the end).
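For example, a standalone sketch with made-up sizes (not the notebook's actual data) showing the output shape:

```python
import torch
import torch.nn as nn

emb = nn.Embedding(10, 3)                 # 10 entities, 3 factors each
idx = torch.LongTensor([[0, 1], [2, 3]])  # a 2x2 batch of indices
out = emb(idx)
print(out.shape)                          # torch.Size([2, 2, 3]): indices shape + embedding size
```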
As for your error: I can’t actually get the notebook to run at the moment, but what is your users argument? The traceback suggests it’s a numpy array when it’s supposed to be a pytorch tensor — the assert indices.dim() <= 2 check fails because numpy arrays have no .dim() method.
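If so, converting the array to an integer tensor before the call should fix it. A minimal sketch (the users values and embedding sizes here are made up; on very old PyTorch versions you may also need to wrap the tensor in a Variable):

```python
import numpy as np
import torch
import torch.nn as nn

users = np.array([0, 1, 2])   # numpy indices, as in the error
u = nn.Embedding(5, 4)
u.weight.data.uniform_(-0.01, 0.01)

# nn.Embedding expects an integer tensor of indices, not a numpy array
users_t = torch.as_tensor(users, dtype=torch.long)
out = u(users_t)
print(out.shape)              # torch.Size([3, 4])
```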