Time Series Forecasting

Well, I’m still doing something wrong. Somehow I am not installing the correct packages. I think this is the issue, because I get an error about not having nbdev. So I add !pip install nbdev, then get another package error, and so on until I get to TSDataLoader, which can’t be installed with !pip install.

!curl -s https://course.fast.ai/setup/colab | bash

from google.colab import drive
drive.mount('/content/gdrive', force_remount=True)
root_dir = "/content/gdrive/My Drive/"
base_dir = root_dir + 'siteprediction/'

!pip install fastai2

!git clone https://github.com/takotab/fastseq.git
%cd fastseq
!pip install -e .

from fastai import *
from fastai2.basics import *
from fastseq.all import *
from fastseq.nbeats.model import *
from fastseq.nbeats.learner import *
from fastseq.nbeats.callbacks import *

Do you have any idea why the training/validation loss is NaN? Even though I tried to fill in the missing values manually, without going through FillMissing, I still got NaN.
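
NaN losses usually come from NaN/inf values or an unscaled target somewhere in the data (or from a too-high learning rate). Here is a minimal pandas sanity check you could run before building the dataloaders; the file and column names below are just placeholders for your data:

import numpy as np
import pandas as pd

df = pd.read_csv("site_data.csv")   # hypothetical file and column names

# NaN or inf anywhere in the inputs or target will turn the loss into NaN.
print(df.isna().sum())
print(np.isinf(df.select_dtypes("number")).sum())

# Fill remaining gaps in the target before building the dataloaders.
df["target"] = df["target"].ffill().bfill()
df = df.dropna()

# An unscaled or extreme-valued target (or a too-high learning rate) can also
# blow the loss up, so a quick look at the range is worth it.
print(df["target"].describe())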

Can someone point me to an example or blog post for multivariate time series forecasting using fastai, where we can pass in other categorical columns like day of week as well …

I looked at the fastseq example, but that is a univariate example. I have 2 months of data and I need to predict the next fifteen days.
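
In the meantime, one way to get calendar features like day of week is to derive them with pandas before handing the data to a tabular-style model. This is only a sketch with made-up file and column names, not fastseq-specific code:

import pandas as pd

df = pd.read_csv("sales.csv", parse_dates=["date"])   # hypothetical file/columns

# Derive categorical calendar features from the timestamp; these can then be
# passed to a tabular model as categorical columns.
df["day_of_week"] = df["date"].dt.dayofweek
df["month"] = df["date"].dt.month
df["is_weekend"] = (df["day_of_week"] >= 5).astype(int)

# With ~2 months of history and a 15-day horizon, a simple time-based split:
cutoff = df["date"].max() - pd.Timedelta(days=15)
train_df = df[df["date"] <= cutoff]
valid_df = df[df["date"] > cutoff]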

I’m having the same issue where I can’t get past the TSDataLoader import. Have you had any luck fixing it?

You might check out Amazon Labs’ time series forecasting repo called GluonTS.

GluonTS uses Amazon’s MXNet (instead of PyTorch or TensorFlow). They have implemented many state-of-the-art architectures (DeepFactor, DeepAR, DeepState, GP Forecaster, GP Var, LSTNet, N-BEATS, NPTS, Prophet, R Forecast, seq2seq, Simple FeedForward, Transformer, Trivial, and WaveNet). Many of them (DeepFactor, DeepAR, DeepState) can also use categorical data (covariates) and produce probabilistic forecasts.
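
As a rough illustration (not copy-paste code: import paths and argument names move around between GluonTS releases, and the data here is dummy), a DeepAR setup with a static categorical feature looks roughly like this:

# Trainer lives in gluonts.mx.trainer in newer releases; adjust to your version.
import numpy as np
from gluonts.dataset.common import ListDataset
from gluonts.model.deepar import DeepAREstimator
from gluonts.trainer import Trainer

series = np.random.rand(60)                  # dummy daily series (~2 months)
prediction_length = 15

train_ds = ListDataset(
    [{"start": "2020-01-01",
      "target": series[:-prediction_length],
      "feat_static_cat": [0]}],              # e.g. a store / site id
    freq="D",
)

estimator = DeepAREstimator(
    freq="D",
    prediction_length=prediction_length,
    use_feat_static_cat=True,
    cardinality=[1],                         # number of categories per static feature
    trainer=Trainer(epochs=10),
)

predictor = estimator.train(train_ds)
forecast = next(iter(predictor.predict(train_ds)))
print(forecast.mean)                         # probabilistic output: mean, quantiles, samples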

@takotab indicates that he needed to do some more work to get it supported on Colab. I’m sure he’ll get to it!

Everything seems to work again. I tested it on Colab and there it seems fine. Thanks for the patience.
@jwithing @swell

Awesome, I tried it out and it works great. With N-BEATS, is it possible to use multivariate time series? I’ve tried looking for examples and haven’t really found any.

Can you share an example of how you did it on some dummy dataset? Thanks in advance.

https://colab.research.google.com/github/takotab/fastseq/blob/master/nbs/index.ipynb

No, N-BEATS is not designed for multivariate data.

I’m working on another approach (much more basic) around M5, but it’s still a work in progress.

Thanks! I look forward to checking it out

Hi, I am amazed to see a forecasting implementation in fastai2. I ran the ‘index’ or ‘overview’ notebook of fastseq without any problem. But I couldn’t understand:

  1. The train - validation - test split strategy
  2. Why is season = lookback+horizon taken as a hyper-parameter for nbeats_learner?
  3. Why does the lr_find plot not show the loss increasing rapidly after a certain learning rate (lr)?
  4. Is learn.fit_flat_cos a different learning strategy compared to fit_one_cycle?

It would be very helpful if you included these pieces of information in the documentation of the overview.

Thank you.

Hi, very good questions. I’ll try to answer them as best as possible in the docs. For now, here are a couple of quick links to keep you moving:

Priyatham10:

Train - Validation - Test split strategy

What little documentation there is, you can find here. I’ll try to find time to add a few more examples.

Priyatham10:

Why is season = lookback+horizon taken as a hyper-parameter for nbeats_learner?

Season is the maximum period for the SeasonalityBlock. There are more examples in the link. The default setting worked best for my data, but it does help to tweak that one.
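
To illustrate what season controls: in the N-BEATS paper, the seasonality block projects its coefficients onto a Fourier basis, so the period effectively bounds the longest cycle the block can model. Here is a small numpy sketch of that idea (the paper’s formulation, not the exact fastseq code):

import numpy as np

def seasonality_basis(theta_dim, length):
    # Fourier basis as in the N-BEATS paper: the block's theta coefficients
    # weight these sine/cosine waves to form the seasonal back/forecast.
    t = np.arange(length) / length
    harmonics = np.arange(theta_dim // 2) + 1
    cos = np.cos(2 * np.pi * harmonics[:, None] * t[None, :])
    sin = np.sin(2 * np.pi * harmonics[:, None] * t[None, :])
    return np.concatenate([cos, sin], axis=0)  # shape (theta_dim, length)

# With season = lookback + horizon, the basis spans the whole window the model
# sees, so a full seasonal cycle can be represented across backcast + forecast.
basis = seasonality_basis(theta_dim=8, length=35 + 7)  # e.g. lookback=35, horizon=7
print(basis.shape)  # (8, 42)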

Priyatham10:

Why does the lr_find plot not show the loss increasing rapidly after a certain learning rate (lr)?

No idea. I could speculate but I have not investigated the matter.

Priyatham10:

Is learn.fit_flat_cos a different learning strategy compared to fit_one_cycle?

Originally I also used fit_one_cycle, but with the success of Mish with fit_flat_cos (see the Imagenette/Imagewoof leaderboards) I decided to give it a shot. It did do better, but it introduced more dependencies than I was willing to put up with, so in the end I removed that part. I forgot to remove it from the README, though. I’m not sure if it still helps with ReLU as the activation. Here is a link to the official documentation:
https://dev.fast.ai/callback.schedule#Learner.fit_flat_cos
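
For comparison, both schedules are built-in fastai2 Learner methods; learn below stands in for whatever learner you already built (e.g. with nbeats_learner), which is assumed rather than constructed here:

# One-cycle: the learning rate warms up and then anneals back down.
learn.fit_one_cycle(10, lr_max=3e-3)

# Flat + cosine: the learning rate stays flat for most of training
# (pct_start defaults to 0.75) and then decays with a cosine schedule.
learn.fit_flat_cos(10, lr=3e-3)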

Hi,

Thank you so much for the explanations. I tried the same approach on the airline-passengers dataset, but the results are not satisfying. Is the architecture implementation here in fastseq currently capable of giving the best results? I thought work was in progress to reach state-of-the-art results. If a state-of-the-art implementation for forecasting already exists, please point me in that direction.

Thank you and stay safe…

I think so, but it also depends on the parameters. It helps to play with the seasonality parameter, but the others also impact the results.

Could you share a Colab of your implementation?

Okay, fine. I am sharing the Colab link. Please tell me where I’m going wrong with the hyper-parameters so that we can get good results.
Thank you…
https://colab.research.google.com/drive/10IZubH3_KYQdZVpIjMcugkbip3RwDCBU

Hi,
We have the N-BEATS paper. But can we have a good, comprehensive explanation of:

  1. Why does it outperform the top performer of the M4 competition?
  2. What architectural insights make it work, so that we can understand the logic and then link it to the math behind it? (A rough sketch of the block structure follows below.)
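
For what it’s worth, here is a compressed PyTorch sketch of the doubly-residual idea from the paper: each block predicts a backcast that is subtracted from its input and a forecast that is added to the running total. This is only the generic block, not the fastseq implementation or the interpretable trend/seasonality stacks:

import torch
from torch import nn

class GenericBlock(nn.Module):
    # One N-BEATS block: an MLP that emits a backcast (the part of the input
    # window it explains) and a forecast (its contribution to the prediction).
    def __init__(self, lookback, horizon, width=256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(lookback, width), nn.ReLU(),
            nn.Linear(width, width), nn.ReLU(),
        )
        self.backcast = nn.Linear(width, lookback)
        self.forecast = nn.Linear(width, horizon)

    def forward(self, x):
        h = self.mlp(x)
        return self.backcast(h), self.forecast(h)

class NBeatsSketch(nn.Module):
    # Doubly-residual stacking: each block subtracts what it explained from
    # the input and adds its forecast to the running output.
    def __init__(self, lookback, horizon, n_blocks=4):
        super().__init__()
        self.blocks = nn.ModuleList(
            [GenericBlock(lookback, horizon) for _ in range(n_blocks)])

    def forward(self, x):
        forecast = 0
        for block in self.blocks:
            backcast, block_forecast = block(x)
            x = x - backcast              # residual input for the next block
            forecast = forecast + block_forecast
        return forecast

model = NBeatsSketch(lookback=35, horizon=7)
print(model(torch.randn(8, 35)).shape)    # torch.Size([8, 7])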

Or you can have a look at this Kaggle notebook, which has everything you need:
https://www.kaggle.com/init27/fastai-v3-lesson-6-rossman
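
That notebook uses fastai v1; a rough fastai2-style equivalent of the same tabular approach (categorical calendar features plus embeddings) could look like the following, with made-up file and column names:

from fastai2.tabular.all import *
import pandas as pd

df = pd.read_csv("sales.csv", parse_dates=["date"])   # hypothetical file/columns
df["day_of_week"] = df["date"].dt.dayofweek
df["month"] = df["date"].dt.month

# Time-based split: the last 15 days become the validation set.
valid_idx = list(df.index[df["date"] > df["date"].max() - pd.Timedelta(days=15)])

dls = TabularDataLoaders.from_df(
    df, path=".",
    procs=[Categorify, FillMissing, Normalize],
    cat_names=["day_of_week", "month", "store_id"],   # store_id is made up
    cont_names=["promo_days"],                        # as is promo_days
    y_names="sales",
    y_block=RegressionBlock(),
    valid_idx=valid_idx,
)

learn = tabular_learner(dls, layers=[200, 100], metrics=rmse)
learn.fit_one_cycle(5, 3e-3)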

Thank you very much @farid for giving a great overview of how fast.ai can be used with time series data 😉 👍 Very helpful!