LSTM Sequence length

Greetings to everyone!

I want to ask whether there is an optimal sequence length for an LSTM network in general, or specifically for time-series prediction problems?
I read about the vanishing and exploding gradient problems that RNNs suffer from on very long sequences, which LSTMs were designed to mitigate and do so to a certain extent.
I also heard about techniques for handling very long sequences with LSTMs and RNNs in general, such as truncating sequences, summarizing sequences, truncated backpropagation through time, or even using an Encoder-Decoder architecture.
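To make the question concrete, here is a minimal sketch of one of those techniques, sequence truncation: cutting (or padding) each series to a fixed window before feeding it to an LSTM. The window size of 3 is purely illustrative, not a recommended value, and `truncate_or_pad` is a hypothetical helper, not part of any library.

```python
def truncate_or_pad(seq, length, pad_value=0.0):
    """Keep the most recent `length` steps; left-pad shorter series."""
    if len(seq) >= length:
        return seq[-length:]  # keep only the latest observations
    return [pad_value] * (length - len(seq)) + list(seq)

series = [[1, 2, 3, 4, 5], [7, 8]]
window = 3  # illustrative fixed sequence length
batch = [truncate_or_pad(s, window) for s in series]
# batch == [[3, 4, 5], [0.0, 7, 8]]
```

This is essentially what utilities like Keras's `pad_sequences` do; the open question is how to choose `window` well.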
I am asking because I could not find a research paper on this, only a blog post stating that the optimal sequence length is between 10 and 30, here:

Thanks in advance!
Have a nice day!