Time series/ sequential data study group

I’d like to share with you a new self-supervised callback I’ve added to the tsai library. It’s called TSBERT.

It allows you to pretrain any time series model in a self-supervised manner, i.e., without labels. You can then fine-tune or train it on a labeled dataset. It’s based on the “A Transformer-based Framework for Multivariate Time Series Representation Learning” paper.
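The core pretraining objective in that paper is denoising: random spans of the input series are masked, and the model is trained to reconstruct the hidden values, with the loss computed only on the masked positions. Here's a minimal NumPy sketch of that idea; the function names, segment length, and masking ratio are my own illustrative choices, not tsai's API:

```python
import numpy as np

rng = np.random.default_rng(0)

def mask_segments(x, mask_ratio=0.15, seg_len=3):
    """Hide random contiguous segments per variable (set to 0).
    Returns the masked input and a boolean mask of hidden positions."""
    n_vars, n_steps = x.shape
    mask = np.zeros_like(x, dtype=bool)
    n_segs = max(1, int(mask_ratio * n_steps / seg_len))
    for v in range(n_vars):
        for _ in range(n_segs):
            start = rng.integers(0, n_steps - seg_len + 1)
            mask[v, start:start + seg_len] = True
    return np.where(mask, 0.0, x), mask

def masked_mse(pred, target, mask):
    """Reconstruction loss computed only on the masked positions."""
    return float(((pred - target) ** 2)[mask].mean())

x = rng.normal(size=(2, 50))      # toy series: 2 variables, 50 time steps
x_in, mask = mask_segments(x)
# during pretraining, a model maps x_in -> reconstruction; as a stand-in
# we score the masked input itself, so the loss is the masked values' MSE
loss = masked_mse(x_in, x, mask)
```

Because the loss only looks at masked positions, the model can't succeed by copying its input; it has to learn the structure of the series, which is what makes the learned representation useful for downstream fine-tuning.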

I’ve tested it on a few datasets and it seems to work pretty well. Here are some results:

[screenshots: results on the tested datasets]

I’ve also added a notebook to demonstrate how it works.

This implementation can be used with any time series model (whether a transformer or not). In the notebook, for example, I’ve used InceptionTime.

I’d encourage you to give it a try. It’s very easy to use!
