Error message:
`RuntimeError: Expected tensor for argument #1 'indices' to have scalar type Long; but got torch.FloatTensor instead (while checking arguments for embedding)`
(when calling `learner.fit_one_cycle(1, 1e-3)`)

I use a concatenated model that includes both tabular and text data. There is no error when I drop the text data or use a separate model for the text data.

The last part of the collate function (x4 is the text data):

```python
x4, y = pad_collate(list(zip(x4, y)), pad_idx=1, pad_first=True)
x4 = to_data(x4)  # this line was not used at first; the error occurs in both cases
return (x1, x2, x3, x4), y
```
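For reference, here is a minimal sketch of the cast that is usually suggested for this kind of dtype mismatch (the function name and the dtype check are my assumptions, not fastai API):

```python
import torch

def ensure_long_indices(x4_padded):
    # Hypothetical helper: nn.Embedding requires its index tensor to be
    # int64 (torch.long), so cast the padded text batch if needed.
    if x4_padded.dtype != torch.long:
        x4_padded = x4_padded.long()  # FloatTensor -> LongTensor
    return x4_padded

batch = ensure_long_indices(torch.tensor([[1.0, 1.0, 5.0]]))
print(batch.dtype)  # torch.int64
```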

I've been trying to fix this bug myself for many hours without success. Some solutions I found online were `.data[0]` (which seems to only work for tensors holding a single value) and `.numpy()` (a numpy array can't be used here).

No one can help you without seeing the whole code you're using and the full error message. The problem could come from your data, your model, or your loss function, and we're not magicians.

The error message means that at some point PyTorch expected a tensor of type long (int64) for the embedding indices but got a tensor of type float, which could come from any of the things I listed earlier.
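To illustrate, here is a minimal standalone repro of that mismatch and its fix, independent of the fastai pipeline (the tensor shapes are made up for the example):

```python
import torch
import torch.nn as nn

emb = nn.Embedding(num_embeddings=10, embedding_dim=4)

# Passing float indices reproduces the RuntimeError from the question
try:
    emb(torch.tensor([[1.0, 2.0, 3.0]]))
    raised = False
except RuntimeError:
    raised = True

# Casting the index tensor to long (int64) makes the lookup work
out = emb(torch.tensor([[1.0, 2.0, 3.0]]).long())
print(raised, out.shape)  # True torch.Size([1, 3, 4])
```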