Inference: storage has wrong size: expected 0 got 256

I recently trained a new model from an old one like so:

I used learn.export() to export my model and downloaded it to a separate server for production. When I use learn = load_learner(Path("path")) to load my model, I get:
RuntimeError: storage has wrong size: expected 0 got 256
Note that I cannot load it with load_learner on the server that trained it either.
Things I tried:
I made sure that the pytorch, torchvision, and fastai versions were the same on both servers.
I re-exported the model and transferred it to the server again.
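When comparing versions by hand it is easy to miss one package, so a small script can help. This is just a sketch, not part of the original post: it uses the standard library's importlib.metadata to report the installed versions of the packages whose pickled formats have to match; run it on both servers and diff the output. The package list is my assumption of what matters here.

```python
# Hypothetical helper: report installed versions of the packages that
# must match between the training and production servers.
from importlib.metadata import version, PackageNotFoundError

def report_versions(packages=("torch", "torchvision", "fastai")):
    """Return {package_name: version_string}; missing packages are flagged."""
    report = {}
    for name in packages:
        try:
            report[name] = version(name)
        except PackageNotFoundError:
            report[name] = "not installed"
    return report

if __name__ == "__main__":
    # Run on each server and compare the two outputs line by line.
    for name, ver in report_versions().items():
        print(f"{name}: {ver}")
```

A mismatch in any line between the two machines would explain a deserialization failure.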

I also trained another sample model, then used the same method to add a class to it, and I get the same error, just with a different "got" number.

I am having the same problem: on a local server the model works fine, but on a remote Linux server it gives the same error.
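Since the file works locally but not after being moved to the remote machine, it is worth ruling out a corrupted or truncated transfer before blaming versions; a truncated pickle can produce exactly this kind of "wrong size" storage error. This is my suggestion, not something from the thread: hash the exported file on both machines and compare. The "export.pkl" path in the usage note is a placeholder.

```python
# Compute a SHA-256 checksum of a file, reading in chunks so large
# model exports do not have to fit in memory.
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Run for example sha256_of("export.pkl") on both servers; if the hex digests differ, the file was damaged in transit and re-copying it (e.g. with rsync or scp) should be the fix.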

Are you really sure about the versions? I remember having a similar issue caused by a difference between fastai versions that produced different export formats, binary-incompatible from one version to the other.