How to save a DataBunch object

Hey everyone,
I tried saving my DataBunch object, but it gives me a ctypes error. Does anyone know how we can save an object detection DataBunch containing images and bbox labels?
Here is the code I tried for saving the DataBunch:

train, valid = ObjectItemListSlide(train_images), ObjectItemListSlide(valid_images)
item_list = ItemLists(".", train, valid)
lls = item_list.label_from_func(lambda x: x.y, label_cls=SlideObjectCategoryList)
lls = lls.transform(tfms, tfm_y=True, size=patch_size)
data = lls.databunch(bs=batch_size, collate_fn=bb_pad_collate, num_workers=0).normalize()
data.save("test_data_bs8.pkl")

The error:

ValueError                                Traceback (most recent call last)
<ipython-input-22-88cd1dd3e50b> in <module>
----> 1 data.save("test_data_bs8.pkl")

/usr/local/lib/python3.7/dist-packages/fastai/basic_data.py in save(self, file)
    153             warn("Serializing the `DataBunch` only works when you created it using the data block API.")
    154             return
--> 155         try_save(self.label_list, self.path, file)
    156 
    157     def add_test(self, items:Iterator, label:Any=None, tfms=None, tfm_y=None)->None:

/usr/local/lib/python3.7/dist-packages/fastai/torch_core.py in try_save(state, path, file)
    414             #To avoid the warning that come from PyTorch about model not being checked
    415             warnings.simplefilter("ignore")
--> 416             torch.save(state, target)
    417     except OSError as e:
    418         raise Exception(f"{e}\n Can't write {path/file}. Pass an absolute writable pathlib obj `fname`.")

/usr/local/lib/python3.7/dist-packages/torch/serialization.py in save(obj, f, pickle_module, pickle_protocol, _use_new_zipfile_serialization)
    378         if _use_new_zipfile_serialization:
    379             with _open_zipfile_writer(opened_file) as opened_zipfile:
--> 380                 _save(obj, opened_zipfile, pickle_module, pickle_protocol)
    381                 return
    382         _legacy_save(obj, opened_file, pickle_module, pickle_protocol)

/usr/local/lib/python3.7/dist-packages/torch/serialization.py in _save(obj, zip_file, pickle_module, pickle_protocol)
    587     pickler = pickle_module.Pickler(data_buf, protocol=pickle_protocol)
    588     pickler.persistent_id = persistent_id
--> 589     pickler.dump(obj)
    590     data_value = data_buf.getvalue()
    591     zip_file.write_record('data.pkl', data_value, len(data_value))

ValueError: ctypes objects containing pointers cannot be pickled

I also tried torch.save, but it gives me the same error.
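
One workaround I'm considering (just a sketch, assuming the ctypes pointers come from the open slide handles held inside the ObjectItemListSlide items): instead of pickling the whole DataBunch, save only the lightweight pieces (the train/valid slide lists with their bbox labels) and rebuild the DataBunch from them on load.

import pickle

# train_images / valid_images are the same inputs used above; whether they
# pickle cleanly depends on whether the slide containers keep an open
# ctypes handle or only store the file path and annotations.
with open("slide_split.pkl", "wb") as f:
    pickle.dump({"train": train_images, "valid": valid_images}, f)

# Later, in a fresh session: reload the split and rebuild the DataBunch
# exactly as before.
with open("slide_split.pkl", "rb") as f:
    split = pickle.load(f)

train, valid = ObjectItemListSlide(split["train"]), ObjectItemListSlide(split["valid"])
item_list = ItemLists(".", train, valid)
lls = item_list.label_from_func(lambda x: x.y, label_cls=SlideObjectCategoryList)
lls = lls.transform(tfms, tfm_y=True, size=patch_size)
data = lls.databunch(bs=batch_size, collate_fn=bb_pad_collate, num_workers=0).normalize()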
How can I save my data so that I get the same data batches each time I run my model?
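
If the DataBunch itself simply can't be pickled, another idea (again only a sketch, not something I've verified) is to skip saving it entirely and instead fix the random seeds before rebuilding it, so the split and the batch order come out the same on every run:

import random
import numpy as np
import torch

def set_seed(seed: int = 42):
    # Seed all RNGs that fastai / PyTorch may use for splitting and shuffling.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)

set_seed(42)
# ...then rebuild `data` with the same code as above; with num_workers=0
# the batches should be reproducible across runs.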