Hi there,
I’m trying to load an exported learner, created on Google Colab, for inference on my local machine. This works with no problem on Colab, but when I try it locally with:
learn_inf = load_learner(path/'ct_pe_whole_model', cpu=True)
I get the following error:
ModuleNotFoundError                       Traceback (most recent call last)
in
      1 path = Path()
----> 2 learn_inf = load_learner(path/'ct_pe_whole_model', cpu=True)

~/anaconda3/envs/fastaiv2/lib/python3.6/site-packages/fastai/learner.py in load_learner(fname, cpu)
    545     "Load a `Learner` object in `fname`, optionally putting it on the `cpu`"
    546     distrib_barrier()
--> 547     res = torch.load(fname, map_location='cpu' if cpu else None)
    548     if hasattr(res, 'to_fp32'): res = res.to_fp32()
    549     if cpu: res.dls.cpu()

~/anaconda3/envs/fastaiv2/lib/python3.6/site-packages/torch/serialization.py in load(f, map_location, pickle_module, **pickle_load_args)
    582             opened_file.seek(orig_position)
    583             return torch.jit.load(opened_file)
--> 584         return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
    585     return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
    586

~/anaconda3/envs/fastaiv2/lib/python3.6/site-packages/torch/serialization.py in _load(zip_file, map_location, pickle_module, **pickle_load_args)
    840     unpickler = pickle_module.Unpickler(data_file, **pickle_load_args)
    841     unpickler.persistent_load = persistent_load
--> 842     result = unpickler.load()
    843
    844     return result

ModuleNotFoundError: No module named 'spacy.lookups'
Any idea what the issue might be? For what it’s worth, I can run load_learner on a vision learner with no problem.
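In case it helps with debugging, here’s a small sketch of how I could compare the two environments; my guess is my local spaCy install is older than (or different from) the one on Colab, so spacy.lookups isn’t importable. has_module is just a helper name I made up for this check:

```python
import importlib.util

def has_module(name):
    """Return True if `name` can be imported in the current environment."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # The parent package itself is missing (e.g. spacy not installed at all).
        return False

# Run this both on Colab and locally and compare the results:
print(has_module("spacy"))
print(has_module("spacy.lookups"))
```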
Any help would be much appreciated!