Moving inference to the CPU

Your model is on the CPU but your data is automatically put on the GPU; that is why you get this error. Either don't put the model on the CPU, or change the default device with defaults.device = torch.device('cpu') so the data stays on the CPU as well.
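For illustration, here is a minimal sketch in plain PyTorch (the model and input tensor are hypothetical placeholders) showing the general rule: the model and the data must live on the same device at inference time.

```python
import torch

# Hypothetical model and input just for illustration.
model = torch.nn.Linear(10, 2)

device = torch.device('cpu')        # or 'cuda' to keep everything on the GPU
model = model.to(device)            # put the model on the chosen device
x = torch.randn(1, 10).to(device)   # make sure the data follows it

model.eval()
with torch.no_grad():               # no gradients needed for inference
    out = model(x)
```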
