I trained the model on a GPU with a batch size of 2048, and the learner was saved with export. Looking at the fastai docs, the method load_model has a device parameter, but load_learner does not. I would like to know why that is.
With the code as it is now, the learner is not loaded onto the GPU, so predictions are computed on the CPU rather than the GPU. How should I save and load the model so that inference runs on the GPU? Should I use save_model and load_learner instead? And can I still use test_dl in that case?
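For reference, here is a minimal pure-Python sketch (not fastai's code, and no fastai import, since running the real thing needs an exported model file and a GPU) of how load_learner's cpu flag maps to the device predictions run on. The fastai calls in the comments are the usual v2 inference pattern; "export.pkl" and items are placeholder names.

```python
# In fastai itself the calls would be roughly:
#   learn = load_learner("export.pkl", cpu=False)  # cpu=False keeps the model on the GPU
#   dl = learn.dls.test_dl(items)                  # test_dl works the same either way
#   preds, _ = learn.get_preds(dl=dl)
def inference_device(cpu: bool, cuda_available: bool) -> str:
    """cpu=False only lands on the GPU when CUDA is actually available."""
    return "cuda" if (not cpu and cuda_available) else "cpu"

print(inference_device(cpu=False, cuda_available=True))   # 'cuda'
print(inference_device(cpu=True, cuda_available=True))    # 'cpu'
print(inference_device(cpu=False, cuda_available=False))  # 'cpu' (no CUDA to fall back on)
```

Note that cpu defaults to True in load_learner, which is why an exported learner ends up doing CPU inference unless you opt out explicitly.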
Thanks for replying, Zachary. load_learner doesn't have a device parameter; if I add one, I get this error: load_learner() got an unexpected keyword argument 'device'.
By looking at _set_device, I realized what the issue was: once the default device is set, predictions are computed on the chosen device. In load_learner the cpu parameter should be set to False, as @sinhak suggested.
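To make the default-device behaviour concrete, here is a small pure-Python sketch (my own illustration, not fastai's implementation) of the idea behind default_device and _set_device: callers that don't pass an explicit device fall back to a module-level default, so setting that default changes where predictions run. The function names mimic fastai's but the bodies are simplified stand-ins.

```python
# Module-level fallback, loosely like fastai's defaults.device.
_default_device = "cpu"

def default_device(use_cuda=None):
    """Get or set the fallback device (simplified stand-in for fastai's default_device)."""
    global _default_device
    if use_cuda is not None:
        _default_device = "cuda" if use_cuda else "cpu"
    return _default_device

def resolve_device(device=None):
    # The _set_device idea: an explicit device wins, otherwise fall back
    # to the module-level default.
    return device if device is not None else _default_device

default_device(use_cuda=True)
print(resolve_device())       # 'cuda' — predictions follow the default device
print(resolve_device("cpu"))  # 'cpu'  — an explicit argument still wins
```

This is why setting the default device (or passing cpu=False to load_learner) is enough: downstream code that doesn't specify a device inherits the default.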