Export learner for multi-label classification fails

I’m having trouble exporting a learner; I consistently get this error:

TypeError                                 Traceback (most recent call last)
<ipython-input-28-4a59c4bc697c> in <module>()
      5                    EarlyStoppingCallback(learn, monitor='valid_loss', min_delta=0.01, patience=3)]
      6 learn.fit_one_cycle(epochs, slice(lr))
----> 7 learn.export(f'{PATH_OUTPUT}/{model_name}_export2.pkl')
      8 

5 frames
/usr/local/lib/python3.6/dist-packages/torch/serialization.py in _save(obj, f, pickle_module, pickle_protocol)
    295     pickler = pickle_module.Pickler(f, protocol=pickle_protocol)
    296     pickler.persistent_id = persistent_id
--> 297     pickler.dump(obj)
    298 
    299     serialized_storage_keys = sorted(serialized_storages.keys())

TypeError: cannot serialize '_io.TextIOWrapper' object

The error is related to the callbacks: with the callbacks attached, export fails, and after removing them I can export the model without any errors. I’m using fast.ai version 1.0.53. What I’m ultimately trying to do is export the best-performing model.

A bit more context: I’m building a multi-label classification model, and these are the key parts of the code:

from fastai.vision import *   # fastai 1.0.53
from functools import partial
import pandas as pd

tfms = get_transforms(
    flip_vert=False,
    max_rotate=45,
    max_lighting=0.4,
    max_zoom=1.3,
    max_warp=0.1,
    p_affine=.75,
)
df = pd.read_csv(f'{PATH_DATA}/{csv_name}')
data = ImageDataBunch.from_csv(
    path=PATH_DATA, 
    folder='test', 
    csv_labels=csv_name,
    label_col=labels,
    valid_pct=0.2,
    ds_tfms=tfms, 
    size=IMAGE_SZ, 
    bs=bs).normalize(imagenet_stats)

acc_05 = partial(accuracy_thresh, thresh=0.5)
f_score = partial(fbeta, thresh=0.5)
learn = cnn_learner(data, arch, metrics=[acc_05, f_score])

lr = 1e-2
learn.callbacks = [ShowGraph(learn), 
                   SaveModelCallback(learn, name=f'{PATH_OUTPUT}/{model_name}_best_model_stage1'), 
                   CSVLogger(learn, filename=f'{model_name}_history_stage1'),
                   EarlyStoppingCallback(learn, monitor='valid_loss', min_delta=0.01, patience=3)]
learn.fit_one_cycle(epochs, slice(lr))

learn.export(f'{PATH_OUTPUT}/{model_name}_export2.pkl')

CSVLogger is the callback causing the issue: it keeps an open file handle, so it’s not picklable. You should use it via callback_fns instead: pass callback_fns=[partial(CSVLogger, filename=f'{model_name}_history_stage1')] in your call to Learner and your issue will disappear.
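
For reference, a minimal sketch of that change applied to the code above (a sketch only, assuming the fastai 1.0.x API, where callback_fns entries are instantiated with the Learner at fit time; the other callbacks are moved over the same way):

from functools import partial
from fastai.vision import *   # fastai 1.0.x: cnn_learner, CSVLogger, ShowGraph, etc.

# Callbacks passed via callback_fns are created per fit() call rather than being
# stored on the Learner, so the logger's open file never gets pickled by export().
learn = cnn_learner(
    data, arch, metrics=[acc_05, f_score],
    callback_fns=[
        ShowGraph,
        partial(SaveModelCallback, name=f'{model_name}_best_model_stage1'),
        partial(CSVLogger, filename=f'{model_name}_history_stage1'),
        partial(EarlyStoppingCallback, monitor='valid_loss', min_delta=0.01, patience=3),
    ])

learn.fit_one_cycle(epochs, slice(lr))
learn.export(f'{PATH_OUTPUT}/{model_name}_export2.pkl')   # no longer raises the TypeError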

@sgugger What a blunder. I didn’t notice the change in the API (the code used to work some months ago). Thank you so much for the fast response.

On fastai v2 I got around this by removing the logger before exporting:
learn.remove_cb(cb.CSVLogger)
learn.export(model_file + '.pkl')
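
A slightly fuller sketch of that workaround, assuming learn is an existing fastai v2 Learner created with the usual fastai.vision.all imports (the log filename here is hypothetical):

from fastai.vision.all import *   # provides CSVLogger and the Learner methods used below

learn.add_cb(CSVLogger(fname=f'{model_name}_history.csv'))  # hypothetical filename
learn.fit_one_cycle(epochs, lr)

learn.remove_cb(CSVLogger)          # passing the class works; a callback instance does too
learn.export(model_file + '.pkl')   # nothing holding an open file gets pickled
learn.add_cb(CSVLogger(fname=f'{model_name}_history.csv'))  # re-attach to keep logging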

Thanks @nburnett
Worked for me without referring to the cb object: learn.remove_cb(CSVLogger).
