Multiprocessing error with databunch

I am trying to run RoBERTa in fastai from inside a celery task. It fails with an error when creating the databunch.

Code:

```python
data = (RobertaTextList.from_df(df, ".", cols=feat_cols, processor=processor)
        .label_from_df(cols=label_cols, label_cls=CategoryList)
        .databunch(pad_first=False, pad_idx=0))
```

`RobertaTextList` is a custom class:

```python
class RobertaTextList(TextList):
    _bunch = RobertaDataBunch
    _label_cls = TextList
```

Error:

```
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/celery/app/", line 385, in trace_task
R = retval = fun(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/celery/app/", line 650, in protected_call
return*args, **kwargs)
File "", line 179, in run_alerts
alerts = alerts_roberta(alert_df)
File "/", line 166, in alerts_roberta
.label_from_df(cols=label_cols, label_cls=CategoryList)
File "/usr/local/lib/python3.7/site-packages/fastai/", line 484, in _inner
File "/usr/local/lib/python3.7/site-packages/fastai/", line 538, in process
for ds,n in zip(self.lists, ['train','valid','test']): ds.process(xp, yp, name=n)
File "/usr/local/lib/python3.7/site-packages/fastai/", line 718, in process
File "/usr/local/lib/python3.7/site-packages/fastai/", line 84, in process
for p in self.processor: p.process(self)
File "/usr/local/lib/python3.7/site-packages/fastai/text/", line 297, in process
tokens += self.tokenizer.process_all(ds.items[i:i+self.chunksize])
File "/usr/local/lib/python3.7/site-packages/fastai/text/", line 120, in process_all
return sum(, partition_by_cores(texts, self.n_cpus)), [])
File "/usr/local/lib/python3.7/concurrent/futures/", line 671, in map
File "/usr/local/lib/python3.7/concurrent/futures/", line 587, in map
fs = [self.submit(fn, *args) for args in zip(*iterables)]
File "/usr/local/lib/python3.7/concurrent/futures/", line 587, in
fs = [self.submit(fn, *args) for args in zip(*iterables)]
File "/usr/local/lib/python3.7/concurrent/futures/", line 641, in submit
File "/usr/local/lib/python3.7/concurrent/futures/", line 583, in _start_queue_management_thread
File "/usr/local/lib/python3.7/concurrent/futures/", line 607, in _adjust_process_count
File "/usr/local/lib/python3.7/multiprocessing/", line 110, in start
'daemonic processes are not allowed to have children'
AssertionError: daemonic processes are not allowed to have children
```
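For context, my understanding is that celery's default prefork pool runs tasks in daemon processes, and the traceback shows fastai's tokenizer (`process_all`) trying to start a `ProcessPoolExecutor` inside one, which Python's `multiprocessing` forbids. The restriction itself is reproducible with plain Python, independent of fastai or celery (a minimal sketch; `daemon_worker` and `grandchild` are names I made up for the demo):

```python
import multiprocessing as mp

def grandchild():
    pass

def daemon_worker(q):
    # A celery prefork worker runs as a daemon process; starting another
    # process from inside one triggers the same AssertionError.
    try:
        p = mp.Process(target=grandchild)
        p.start()  # raises here when called from a daemonic process
        p.join()
        q.put("child started fine")
    except AssertionError as e:
        q.put(str(e))

def demo():
    q = mp.Queue()
    d = mp.Process(target=daemon_worker, args=(q,), daemon=True)
    d.start()
    msg = q.get()  # wait for the worker to report what happened
    d.join()
    return msg

if __name__ == "__main__":
    print(demo())  # -> daemonic processes are not allowed to have children
```

So it seems any fix has to stop fastai from spawning worker processes while inside the celery task, rather than anything specific to the Roberta classes.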

Has anyone hit this when running fastai text processing inside a celery worker? Can someone please help me with this?