Load Learner - AttributeError: Can't get attribute 'CrossEntropyLossFlat' on <module 'fastai.layers'

This issue started as a comment on another, similar thread, but I decided to give it its own topic. I have been working through the fastai book for the past few days and built a model following Chapter 2 of the fastbook (Lesson 2/3), initially with success. The model works, and at first exporting and loading caused no problems. However, after pushing it to GitHub and adapting it for Voila, I started consistently running into this error, and now I cannot load the learner at all:

AttributeError                            Traceback (most recent call last)
<ipython-input-5-7726f78efb69> in <module>
----> 1 learn_inf = load_learner(path/'export.pkl')

/opt/conda/envs/fastai/lib/python3.8/site-packages/fastai/learner.py in load_learner(fname, cpu)
551     "Load a `Learner` object in `fname`, optionally putting it on the `cpu`"
552     distrib_barrier()
--> 553     res = torch.load(fname, map_location='cpu' if cpu else None)
554     if hasattr(res, 'to_fp32'): res = res.to_fp32()
555     if cpu: res.dls.cpu()

/opt/conda/envs/fastai/lib/python3.8/site-packages/torch/serialization.py in load(f, map_location, pickle_module, **pickle_load_args)
592                     opened_file.seek(orig_position)
593                     return torch.jit.load(opened_file)
--> 594                 return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
595         return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)

/opt/conda/envs/fastai/lib/python3.8/site-packages/torch/serialization.py in _load(zip_file, map_location, pickle_module, pickle_file, **pickle_load_args)
851     unpickler = pickle_module.Unpickler(data_file, **pickle_load_args)
852     unpickler.persistent_load = persistent_load
--> 853     result = unpickler.load()
855     torch._utils._validate_loaded_sparse_tensors()

AttributeError: Can't get attribute 'CrossEntropyLossFlat' on <module 'fastai.layers' from '/opt/conda/envs/fastai/lib/python3.8/site-packages/fastai/layers.py'>

At this point I don’t know what else to do. I have scoured the forums, asked around, and tried digging into the dependency relationships myself, all with no luck. Mind you, all of this was built and tested in the past 24 hours, and the issue only arose a few hours ago, so this is not an outdated setup. Any and all help is appreciated.


Unpickling (which happens inside load_learner) requires everything used by the loaded object to be present in the current namespace. That means you should make sure you use the same imports and the same fastai version when you save the model and when you load it, and if you use any custom functions they must also be defined in both places.
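The failure mode can be reproduced with plain pickle, independent of fastai. This is a minimal sketch (the `get_label` function is a made-up stand-in for any custom labelling function): pickle serialises only a module-qualified *reference* to functions and classes, not their code, so the same name must resolve at load time.

```python
import pickle

def get_label(fname):           # stands in for a custom labelling function
    return fname.split("_")[0]

blob = pickle.dumps(get_label)  # stores only a reference to get_label

restored = pickle.loads(blob)   # works: the name still resolves
print(restored("cat_001.jpg"))  # -> cat

del get_label                   # simulate loading in a different namespace
try:
    pickle.loads(blob)          # the reference can no longer be resolved
except AttributeError as e:
    print(e)                    # Can't get attribute 'get_label' on <module ...>
```

The `Can't get attribute 'CrossEntropyLossFlat'` error in your traceback is the same mechanism: the pickle references a name in `fastai.layers` that the installed fastai version does not provide at that location.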

Can you share the code where you save the model and load the model with us, including the fastai versions?


Hey @orendar, it turned out that, for some reason, the fastai version used to train the model was different from the version used to load the learner. Resetting everything and making sure the versions matched solved the problem. (y)