Time Series Forecasting

Everything seems to work again. I tested it on Colab and there it seems fine. Thanks for the patience.
@jwithing @swell

Awesome, I tried it out and it works great. With N-BEATS, is it possible to use multivariate time series? I've tried looking for examples and haven't really found any.

Can you share an example of how you did it on some dummy dataset? Thanks in advance.

https://colab.research.google.com/github/takotab/fastseq/blob/master/nbs/index.ipynb


No, N-BEATS is not designed for multivariate series.

I'm working on another approach (much more basic) around M5, but it's still a work in progress.

Thanks! I look forward to checking it out

Hi, I am amazed to see a forecasting implementation in fastai2. I ran the 'index' / 'overview' notebook of fastseq without any problem, but I couldn't understand:

  1. Train - Validation - Test split strategy
  2. Why is season = lookback + horizon taken as a hyper-parameter for nbeats_learner?
  3. Why doesn't the loss in the lr_find plot increase rapidly after a certain learning rate (lr)?
  4. Is learn.fit_flat_cos a different learning-rate strategy compared to fit_one_cycle?

It would be very helpful if you included these pieces of information in the overview documentation.

Thank you.


Hi, very good questions. I'll try to answer them as well as possible in the docs. For now, here are a couple of quick answers and links to keep you moving:

Priyatham10:

Train - Validation - Test split strategy

What little documentation there is, you can find here. I'll try to find time to add a few more examples.

Priyatham10:

Why is season = lookback + horizon taken as a hyper-parameter for nbeats_learner?

season is the maximum period for the SeasonalityBlock. There are more examples in the link. The default setting worked best for my data, but it does help to tweak that one.
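
To make the role of season more concrete, here is a minimal NumPy sketch of a Fourier basis of the kind the N-BEATS SeasonalityBlock projects onto. This is purely illustrative, not the fastseq API; the variable names and example values are assumptions:

```python
import numpy as np

# Minimal sketch (not the fastseq API): a Fourier basis of the kind the
# N-BEATS SeasonalityBlock projects its theta coefficients onto.
# `season` caps the longest period a block can model, which is why
# season = lookback + horizon (the whole window) is a reasonable default.
lookback, horizon = 70, 14          # example values, assumed
season = lookback + horizon

t = np.arange(lookback + horizon) / season     # time scaled to [0, 1)
harmonics = np.arange(1, season // 2 + 1)      # frequencies 1 .. season/2
basis = np.concatenate([
    np.cos(2 * np.pi * harmonics[:, None] * t[None, :]),
    np.sin(2 * np.pi * harmonics[:, None] * t[None, :]),
])  # shape: (2 * n_harmonics, lookback + horizon)

# A seasonality block outputs theta of shape (batch, 2 * n_harmonics);
# theta @ basis gives the waveform, split into backcast and forecast at `lookback`.
print(basis.shape)  # (84, 84) for these example values
```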

Priyatham10:

Why doesn't the loss in the lr_find plot increase rapidly after a certain learning rate (lr)?

No idea. I could speculate but I have not investigated the matter.

Priyatham10:

Is learn.fit_flat_cos a different learning-rate strategy compared to fit_one_cycle?

Originally I also used fit_one_cycle, but given the success of Mish with fit_flat_cos on the Imagenette/Imagewoof leaderboards, I decided to give it a shot. It did do better, but it introduced more dependencies than I was willing to put up with, so in the end I removed that part; I just forgot to remove it from the README. I'm not sure if fit_flat_cos still helps with ReLU as the activation. Here is a link to the official documentation:
https://dev.fast.ai/callback.schedule#Learner.fit_flat_cos
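
Roughly, fit_one_cycle warms the learning rate up and then anneals it down, while fit_flat_cos holds it flat and only anneals at the end. Here is a small sketch of the two shapes, approximating fastai's default pct_start values; it is an illustration, not fastai's internal implementation:

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative sketch of the two schedule shapes (not fastai internals).
steps = np.linspace(0, 1, 200)
lr_max = 1e-3

def one_cycle(p, pct_start=0.25):
    # fit_one_cycle: cosine warm-up to lr_max, then cosine anneal towards ~0
    if p < pct_start:
        q = p / pct_start
        return lr_max / 25 + (lr_max - lr_max / 25) * (1 - np.cos(np.pi * q)) / 2
    q = (p - pct_start) / (1 - pct_start)
    return lr_max * (1 + np.cos(np.pi * q)) / 2

def flat_cos(p, pct_start=0.75):
    # fit_flat_cos: hold lr_max flat, then cosine anneal towards ~0
    if p < pct_start:
        return lr_max
    q = (p - pct_start) / (1 - pct_start)
    return lr_max * (1 + np.cos(np.pi * q)) / 2

plt.plot(steps, [one_cycle(p) for p in steps], label="fit_one_cycle")
plt.plot(steps, [flat_cos(p) for p in steps], label="fit_flat_cos")
plt.xlabel("fraction of training")
plt.ylabel("learning rate")
plt.legend()
plt.show()
```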


Hi,

Thank you so much for the explanations. I tried the same approach on the airline-passengers dataset, but the results are not satisfying. Is the architecture implementation in fastseq currently capable of giving the best results? I thought work was in progress to reach state-of-the-art results. If a state-of-the-art implementation for forecasting already exists, please point me in that direction.

Thank you and stay safe…

I think so, but it also depends on the parameters. It helps to play with the seasonality parameter, but the others also impact the results.

Could you share a Colab of your implementation?


Okay, fine. I am sharing the Colab link. Please tell me where I went wrong with the hyper-parameters so that we can get good results.
Thank you…
https://colab.research.google.com/drive/10IZubH3_KYQdZVpIjMcugkbip3RwDCBU


Hi,
We have the N-BEATS paper. But, can we have a good comprehensive explanation of

  1. Why it outperforms the top performer of M4 competition?
  2. It’s architecture insights that made so we can understand the logical explanations and then we can link to the math behind it?

Or you can have a look at this Kaggle notebook, which has everything you need:
https://www.kaggle.com/init27/fastai-v3-lesson-6-rossman

Thank you very much @farid for giving a great overview of how fast.ai can be used with time series data :wink: :+1: Very helpful!