Possible bug in databunch save

I have recently been building language models. I like to prep the databunch once and then save it for reuse. I've noticed that the bptt value I pass when building the databunch isn't preserved across save/load. For example:

data = TextLMDataBunch.from_df(path, corpus_train, corpus_valid, bs=bs, tokenizer=tok,
                               text_cols='text', min_freq=1, bptt=100,
                               include_bos=False, include_eos=False)
bb = data.one_batch()
bb[0].size()
torch.Size([64, 100])   # sequence length matches the bptt=100 I passed

data.save('databunch-hyper-bptt-100.pkl')
data = load_data(path, 'databunch-hyper-bptt-100.pkl')
bb = data.one_batch()
bb[0].size()
torch.Size([64, 70])    # back to the default bptt of 70

Has anyone else noticed this behavior? Is it possible to specify the bptt value after load_data? I don't see a way to access or set it on the loaded databunch. My guess is that data.save() only serializes the label lists, so dataloader settings like bptt are dropped and load_data rebuilds the dataloaders with the default bptt=70.
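One workaround I'd expect to work (an untested sketch, based on my reading that load_data forwards extra keyword arguments through to the databunch constructor; path and bs are from my example above) is to just re-pass bptt at load time:

from fastai.text import load_data

# Untested sketch: load_data(**kwargs) forwards extras to the databunch
# constructor, so re-specifying bptt here should rebuild the dataloaders
# with bptt=100 instead of the default 70.
data = load_data(path, 'databunch-hyper-bptt-100.pkl', bs=bs, bptt=100)
bb = data.one_batch()
bb[0].size()   # expecting torch.Size([64, 100]) if the kwargs are forwarded

Of course this means remembering the bptt separately from the pickle, which is why it feels like it should be saved with the databunch in the first place.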

I am currently running fastai version 1.0.58.