Having this issue with my model when I run `learn.export()`:

```
Can't pickle local object 'Learner.get_preds.<locals>.<lambda>'
```

Full traceback:

```
AttributeError                            Traceback (most recent call last)
<ipython-input-108-fa5b61306ef3> in <module>
----> 1 learn.export()

/opt/conda/envs/fastai/lib/python3.7/site-packages/fastai2/learner.py in export(self, fname, pickle_protocol)
    503         #To avoid the warning that come from PyTorch about model not being checked
    504         warnings.simplefilter("ignore")
--> 505         torch.save(self, self.path/fname, pickle_protocol=pickle_protocol)
    506         self.create_opt()
    507         if state is not None: self.opt.load_state_dict(state)

/opt/conda/envs/fastai/lib/python3.7/site-packages/torch/serialization.py in save(obj, f, pickle_module, pickle_protocol, _use_new_zipfile_serialization)
    326
    327     with _open_file_like(f, 'wb') as opened_file:
--> 328         _legacy_save(obj, opened_file, pickle_module, pickle_protocol)
    329
    330

/opt/conda/envs/fastai/lib/python3.7/site-packages/torch/serialization.py in _legacy_save(obj, f, pickle_module, pickle_protocol)
    399     pickler = pickle_module.Pickler(f, protocol=pickle_protocol)
    400     pickler.persistent_id = persistent_id
--> 401     pickler.dump(obj)
    402
    403     serialized_storage_keys = sorted(serialized_storages.keys())

AttributeError: Can't pickle local object 'Learner.get_preds.<locals>.<lambda>'
```
muellerzr (Zachary Mueller)
April 4, 2020, 11:36pm
As said many times before, we need to know how to reproduce it. How did you generate your data, your learner, etc.?
In this case the learner is this. I used resnet101 this time, but I am having the same issue with resnet18 as well. Now that I have restarted the notebook, it works.
lgvaz (Lucas Goulart Vazquez)
April 5, 2020, 12:47am
You cannot pickle lambda functions. Use the normal function syntax instead: `def func(...):`
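A minimal sketch of why this matters (not from the original posts): `pickle` serializes functions by reference to their qualified name, so an anonymous lambda cannot be looked up again at load time, while a named function defined at module top level can.

```python
import pickle

# A lambda's qualified name is '<lambda>', which pickle cannot resolve
# when reloading, so serializing it fails.
square_lambda = lambda x: x ** 2

# A function defined with `def` at module top level is pickled by
# reference to its name, so it round-trips fine.
def square(x):
    return x ** 2

try:
    pickle.dumps(square_lambda)
except (pickle.PicklingError, AttributeError) as e:
    print("lambda failed:", type(e).__name__)
```

The same rule applies to any callable stored on an object you want to serialize: replace lambdas with named functions before calling anything that pickles the object.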
Thanks @lgvaz, but I don’t see any lambda functions in my notebook. Restarting and clearing the output solved the issue; I probably ran some lambda function in the past.
abharani (abhishek bharani)
April 5, 2020, 3:25am
Same issue for me as well, and it is not getting resolved after restarting. I just re-ran the lesson2 notebook with my keys. Could you please help me resolve this issue?
abharani (abhishek bharani)
April 5, 2020, 3:34am
Yes, I did restart and clear the output.
Post your entire code here so we can help you better and try to reproduce it on our end.
abharani (abhishek bharani)
April 5, 2020, 3:39am
Well, it’s the 02_production-personal.ipynb notebook code, except I just added my keys. Here is the public link to the notebook:
https://www.paperspace.com/teajcojhg/notebook/pr3a0489d
I was trying to export the model from the pet breeds notebook as well, and I got a similar issue:

```
AttributeError                            Traceback (most recent call last)
<ipython-input-19-fa5b61306ef3> in <module>
----> 1 learn.export()

/opt/conda/envs/fastai/lib/python3.7/site-packages/fastai2/learner.py in export(self, fname, pickle_protocol)
    503         #To avoid the warning that come from PyTorch about model not being checked
    504         warnings.simplefilter("ignore")
--> 505         torch.save(self, self.path/fname, pickle_protocol=pickle_protocol)
    506         self.create_opt()
    507         if state is not None: self.opt.load_state_dict(state)

/opt/conda/envs/fastai/lib/python3.7/site-packages/torch/serialization.py in save(obj, f, pickle_module, pickle_protocol, _use_new_zipfile_serialization)
    326
    327     with _open_file_like(f, 'wb') as opened_file:
--> 328         _legacy_save(obj, opened_file, pickle_module, pickle_protocol)
    329
    330

/opt/conda/envs/fastai/lib/python3.7/site-packages/torch/serialization.py in _legacy_save(obj, f, pickle_module, pickle_protocol)
    399     pickler = pickle_module.Pickler(f, protocol=pickle_protocol)
    400     pickler.persistent_id = persistent_id
--> 401     pickler.dump(obj)
    402
    403     serialized_storage_keys = sorted(serialized_storages.keys())

AttributeError: Can't pickle local object 'Learner.get_preds.<locals>.<lambda>'
```
I am not using lambda myself, but it seems that `Learner.get_preds` is using one.
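That reading matches the error message: a lambda created inside a method body is a "local object", and if a reference to it survives on the object (as the traceback suggests happened with the Learner here), pickling the whole object fails. A small self-contained reproduction, with a toy class whose names are illustrative rather than fastai's actual internals:

```python
import pickle

class Toy:
    def get_preds(self):
        # This lambda is a local object: its qualified name is
        # 'Toy.get_preds.<locals>.<lambda>', which pickle cannot resolve.
        # Storing it on self makes the whole instance unpicklable.
        self.loss_func = lambda p, t: (p - t) ** 2

toy = Toy()
toy.get_preds()          # stores the local lambda on the instance

try:
    pickle.dumps(toy)    # now fails, just like torch.save on the Learner
except (AttributeError, pickle.PicklingError) as e:
    print(e)             # e.g. "Can't pickle local object 'Toy.get_preds.<locals>.<lambda>'"
```

This is why restarting or clearing state can "fix" it: once the attribute holding the lambda is gone, the object pickles again.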
[EDIT - 05/12/2020] I cc @sgugger because I did not receive an answer to my question about the increasing size of the pkl file when the batch size is larger. However, I think that it is a significant problem.

Hello. I am using this thread for another issue (at least, a question I have) regarding `learn.export()`. The size of the pkl file created by `learn.export()` depends on the batch size (at least in my tests, it grows with large batch sizes). I do not understand why. Any idea?

My notebook is online on GitHub, and the following table shows my results (sizes of pkl files for batch sizes from 8 to 512). The smallest pkl file (bs from 8 to 128) has a size of 52 MB and the biggest (bs of 512) of 272 MB.
It’s probably because the `.xb`, `.yb` and `.pred` attributes, which contain the last batch (inputs, targets, predictions), are saved with the Learner. I’m dealing with the edits on the book right now, so I don’t have time to investigate more though.
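A toy sketch of that mechanism in plain Python (no fastai; the `xb` attribute name just mirrors the explanation above): an object that caches its last batch grows its pickled size with the batch, and clearing the cache before saving restores it.

```python
import pickle

class ToyLearner:
    def __init__(self):
        self.weights = [0.0] * 1000   # stands in for the model parameters
        self.xb = None                # cache of the last input batch

    def get_preds(self, batch):
        self.xb = batch               # the last batch sticks around on the object
        return [x * 2 for x in batch]

learn = ToyLearner()
before = len(pickle.dumps(learn))

learn.get_preds(list(range(100_000)))   # a large "batch" gets cached
with_batch = len(pickle.dumps(learn))

learn.xb = None                         # clear the cache before "exporting"
after = len(pickle.dumps(learn))

print(before, with_batch, after)        # with_batch is much larger than the others
```

This would explain why the pkl size tracks the batch size even though the model weights are identical: the serialized file contains the weights plus whatever batch happened to be cached last.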
Thank you Sylvain for your answer. In order not to forget it, I opened an issue in the fastai v2 GitHub repo.
Hello Sylvain. In my Ubuntu installation of fastai2, I ran `git pull` to get the updated files you changed, and I got them. Afterwards I ran my notebook about batch size vs. `learn.export()` pkl file again, and there was no change:

bs = 512 | export.pkl = 285 MB
bs = 8   | export.pkl = 54.2 MB

Can you re-open this issue?