Doing Batch Inference on Unknown Test Batch at Runtime

Hi all,
I’m currently working on batch inference, and the way I’ve been doing it is to set up a databunch and pass it to load_learner like:
learn = load_learner(path=Path('/.'), file='export', test=ImageList.from_folder(f'testing/'))
per the instructions at https://docs.fast.ai/tutorial.inference.html#A-classification-problem.
Is there a way for me to load the learner once, like this:
learn = load_learner(path=Path('/.'), file='export')
and then attach a test databunch to it at a later point?
This would make my inference much faster, since I wouldn’t need to reload the learner from disk for each test databunch; it could just stay in memory. It would also allow batching, as opposed to the regular learn.predict() method, which doesn’t seem to allow batching.
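
To make it concrete, here’s a rough sketch of the pattern I’m hoping for. The load_learner and get_preds calls are from the inference tutorial; the learn.data.add_test call is just my guess at how you’d attach new test data to an already-loaded learner, which is essentially what I’m asking about:

```python
from fastai.vision import *  # load_learner, ImageList, DatasetType
from pathlib import Path

# Load the exported learner from disk a single time and keep it in memory
learn = load_learner(path=Path('/.'), file='export')

# Later, for each new batch of images: attach a fresh test set and run
# batched inference. add_test is a guess on my part -- whether this (or
# something like it) is the right way to swap in a new test databunch
# is exactly my question.
learn.data.add_test(ImageList.from_folder('testing/'))
preds, _ = learn.get_preds(ds_type=DatasetType.Test)
```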
Thanks