Input length for sequence models: how to train QA models on longer texts?

In most code I have seen, we specify a fixed input length, pad sequences that are shorter than it, and truncate sequences that are longer. When doing QA with sequence models, how can we handle longer texts? If we truncate the input, what happens when the answer lies beyond the specified length?

I have used AllenNLP, and that library can extract answers from long texts. I tried asking questions over an entire video transcript and it gave correct results (or at least it was able to return results) with no preprocessing on my part; I just passed the raw string to the model. So is there a technique I am not aware of for training sequence models on longer texts, or for running inference on them?
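To be concrete, this is the kind of fixed-length preprocessing I mean (a minimal sketch in plain Python; `PAD_ID`, `MAX_LEN`, and the token ids are illustrative values, not from any particular library):

```python
PAD_ID = 0   # illustrative padding token id
MAX_LEN = 8  # illustrative fixed input length

def pad_or_truncate(token_ids, max_len=MAX_LEN, pad_id=PAD_ID):
    """Force a token-id sequence to exactly max_len items."""
    if len(token_ids) >= max_len:
        # Truncate: everything past max_len is lost,
        # including an answer span that occurs late in the text.
        return token_ids[:max_len]
    # Pad on the right up to max_len.
    return token_ids + [pad_id] * (max_len - len(token_ids))

short = [5, 6, 7]
long_seq = list(range(1, 13))  # 12 tokens, longer than MAX_LEN

print(pad_or_truncate(short))     # padded out to length 8
print(pad_or_truncate(long_seq))  # truncated to length 8; tokens 9..12 dropped
```

My question is about the truncation branch: if the answer sits in the dropped tail, this scheme can never recover it.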
Thank you.