I’m using fastai v1 for a segmentation project and am trying to package a trained inference learner on its own. I put together a small Python file (Inference.py) that defines the custom loss functions I used in my model, along with a single class that calls load_learner and predicts on numpy input.
When I import Inference in a fresh Jupyter notebook and use the class to call load_learner, it works fine, but when I call it from another Python package (trying to integrate it into a broader project) it fails.
It seems that when load_learner unpickles the model, it can't find the custom loss functions I defined (it looks for them in __main__ instead), even though they are defined in Inference.py.
Has anyone else run into anything similar or have thoughts?
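For context, I think I can reproduce the underlying mechanism with just the stdlib (no fastai involved): pickle stores functions by reference as module + name, so a loss defined in a notebook is recorded as "__main__.combo_loss", and unpickling in a process whose __main__ doesn't have that name fails the same way. A minimal sketch (the forced __module__ assignment just mimics a function defined in a training notebook):

```python
import pickle
import sys
import types

def combo_loss(pred, target):
    # stand-in for the real custom loss; the body is irrelevant here
    return abs(pred - target)

# Mimic a loss defined in a training notebook/script: pickle will record it
# by reference as '__main__.combo_loss'.
combo_loss.__module__ = "__main__"
combo_loss.__qualname__ = "combo_loss"
main_mod = sys.modules.setdefault("__main__", types.ModuleType("__main__"))
main_mod.combo_loss = combo_loss        # must be resolvable while pickling
blob = pickle.dumps(combo_loss)

# A process whose __main__ lacks combo_loss reproduces the error:
del main_mod.combo_loss
err = None
try:
    pickle.loads(blob)
except AttributeError as e:
    err = e   # "Can't get attribute 'combo_loss' on <module '__main__' ...>"

# Re-attaching the symbol to __main__ before unpickling makes it load again:
main_mod.combo_loss = combo_loss
fn = pickle.loads(blob)
```

That matches the AttributeError below: torch.load delegates to pickle, so the export.pkl is asking for '__main__.combo_loss' no matter where Inference.py defines it.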
File "/packages/shatterstar/shatterstar/shatterstar.py", line 44, in detection_function_fastai_resnet_unet
lm = LoadModel(path_mdl, "export.pkl")
File "/packages/shatterstar/shatterstar/Inference.py", line 62, in __init__
self.learn = load_learner(path,mdl)
File "/anaconda3/envs/pFastai/lib/python3.6/site-packages/fastai/basic_train.py", line 614, in load_learner
state = torch.load(source, map_location='cpu') if defaults.device == torch.device('cpu') else torch.load(source)
File "/anaconda3/envs/pFastai/lib/python3.6/site-packages/torch/serialization.py", line 426, in load
return _load(f, map_location, pickle_module, **pickle_load_args)
File "/anaconda3/envs/pFastai/lib/python3.6/site-packages/torch/serialization.py", line 613, in _load
result = unpickler.load()
AttributeError: Can't get attribute 'combo_loss' on <module '__main__' from '/anaconda3/envs/pFastai/bin/shatterstar'>
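The workaround I'm considering, sketched below, is to re-export the custom losses into __main__ from the calling code before invoking load_learner, so the unpickler can resolve '__main__.combo_loss'. The combo_loss stand-in here is hypothetical; in the real project it would be imported from Inference.py instead:

```python
import sys

def combo_loss(pred, target):
    # hypothetical stand-in; in practice: from Inference import combo_loss
    return abs(pred - target)

# Attach the loss to __main__ so unpickling '__main__.combo_loss' succeeds.
sys.modules["__main__"].combo_loss = combo_loss

# lm = LoadModel(path_mdl, "export.pkl")  # load_learner should now resolve it
```

Not sure whether this is the cleanest fix, though, compared with re-exporting the learner so the pickle references Inference.combo_loss directly.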