| Project Name | Stars | Packages Using This | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language | Description |
|---|---|---|---|---|---|---|---|---|---|
| Sktime | 6,496 | | 6 hours ago | | | 714 | bsd-3-clause | Python | A unified framework for machine learning with time series |
| Darts | 5,921 | 7 | a day ago | 25 | June 22, 2022 | 234 | apache-2.0 | Python | A Python library for user-friendly forecasting and anomaly detection on time series |
| Autogluon | 5,771 | | 2 days ago | | | 241 | apache-2.0 | Python | AutoGluon: AutoML for image, text, time series, and tabular data |
| Gluonts | 3,571 | 7 | 7 hours ago | 58 | June 30, 2022 | 359 | apache-2.0 | Python | Probabilistic time series modeling in Python |
| Tsai | 3,495 | 1 | a day ago | 41 | April 19, 2022 | 32 | apache-2.0 | Jupyter Notebook | State-of-the-art deep learning library for time series and sequences in PyTorch / fastai |
| Informer2020 | 3,421 | | 2 months ago | | | 39 | apache-2.0 | Python | The GitHub repository for the paper "Informer", accepted by AAAI 2021 |
| Neural_prophet | 2,954 | | 4 days ago | 7 | March 22, 2022 | 39 | mit | Python | NeuralProphet: a simple forecasting package |
| Merlion | 2,921 | | 2 months ago | 14 | June 28, 2022 | 14 | bsd-3-clause | Python | Merlion: a machine learning framework for time series intelligence |
| Pytorch Forecasting | 2,856 | 4 | a day ago | 33 | May 23, 2022 | 390 | mit | Python | Time series forecasting with PyTorch |
| Statsforecast | 2,583 | 5 | 14 hours ago | 11 | June 27, 2022 | 78 | apache-2.0 | Python | Lightning ⚡️ fast forecasting with statistical and econometric models |
This repo is the official PyTorch implementation of LTSF-Linear: "Are Transformers Effective for Time Series Forecasting?".
[2022/11/23] Accepted to AAAI 2023 with three strong accepts! We also release a benchmark for long-term time series forecasting for further research.
[2022/08/25] We update our paper with a comprehensive analysis of why existing LTSF-Transformers do not work well on the LTSF problem!
[2022/08/25] Besides DLinear, we are excited to add two more Linear models to the paper and this repo. Now we have an LTSF-Linear family!
[2022/08/25] We update some scripts of LTSF-Linear.
Besides LTSF-Linear, we provide five significant forecasting Transformers to reproduce the results in the paper.
We provide all experiment script files in `./scripts`:
| Files | Interpretation |
| ------------- | -------------------------------------------------------|
| EXP-LongForecasting | Long-term Time Series Forecasting Task |
| EXP-LookBackWindow | Study the impact of different look-back window sizes |
| EXP-Embedding | Study the effects of different embedding strategies |
This code is built on the code base of Autoformer. We greatly appreciate the following GitHub repos for their valuable code bases and datasets:
- The implementation of Autoformer, Informer, and Transformer is from thuml/Autoformer
- The implementation of FEDformer is from MAZiqing/FEDformer
- The implementation of Pyraformer is from alipay/Pyraformer
LTSF-Linear is a set of linear models (Linear, NLinear, and DLinear). Although LTSF-Linear is simple, it has some compelling characteristics; a minimal sketch of the core idea follows.
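As an illustration, here is a minimal PyTorch sketch of DLinear, which decomposes the input into a moving-average trend and a seasonal remainder and maps each part to the forecast horizon with a single linear layer. This is a simplified version, not the repo's exact code (the actual implementation pads the moving average by replicating the series ends and supports a per-channel `individual` mode):

```python
import torch
import torch.nn as nn

class DLinearSketch(nn.Module):
    """Simplified DLinear: trend/seasonal decomposition + two linear layers."""

    def __init__(self, seq_len: int, pred_len: int, kernel_size: int = 25):
        super().__init__()
        # A moving average extracts the trend; an odd kernel_size with this
        # padding keeps the pooled sequence at its original length.
        self.moving_avg = nn.AvgPool1d(kernel_size, stride=1,
                                       padding=kernel_size // 2,
                                       count_include_pad=False)
        self.linear_trend = nn.Linear(seq_len, pred_len)
        self.linear_seasonal = nn.Linear(seq_len, pred_len)

    def forward(self, x):                       # x: [batch, seq_len, channels]
        x = x.permute(0, 2, 1)                  # -> [batch, channels, seq_len]
        trend = self.moving_avg(x)              # smooth trend component
        seasonal = x - trend                    # remainder around the trend
        y = self.linear_trend(trend) + self.linear_seasonal(seasonal)
        return y.permute(0, 2, 1)               # -> [batch, pred_len, channels]

model = DLinearSketch(seq_len=336, pred_len=96)
print(model(torch.randn(8, 336, 7)).shape)      # torch.Size([8, 96, 7])
```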
Univariate and multivariate forecasting: LTSF-Linear outperforms all Transformer-based methods by a large margin (see the result tables in the paper).
Comparison of method efficiency with a look-back window size of 96 and 720 forecasting steps on the Electricity dataset. MACs are the number of multiply-accumulate operations. We use DLinear for the comparison since it has double the cost of the other models in the LTSF-Linear family. The inference time is averaged over 5 runs.
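This efficiency is easy to sanity-check by hand. Here is a back-of-the-envelope MAC count for the linear models, assuming the 321-channel Electricity dataset and one weight matrix of shape [720, 96] applied to every channel (both numbers are illustrative assumptions, not values taken from the paper's table):

```python
# Rough MAC count for a per-channel linear forecaster on Electricity
# (assumed: 321 channels, look-back window 96, forecast horizon 720).
seq_len, pred_len, channels = 96, 720, 321
macs_linear = seq_len * pred_len * channels   # one Linear layer over the window
macs_dlinear = 2 * macs_linear                # DLinear: trend + seasonal layers
print(f"Linear:  {macs_linear / 1e9:.3f} GMACs")   # 0.022 GMACs
print(f"DLinear: {macs_dlinear / 1e9:.3f} GMACs")  # 0.044 GMACs per forecast
```

By comparison, Transformer-based forecasters spend orders of magnitude more computation on their attention and feed-forward layers.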
First, please make sure you have installed Conda. Then, our environment can be installed by:
```bash
conda create -n LTSF_Linear python=3.6.9
conda activate LTSF_Linear
pip install -r requirements.txt
```
You can obtain all nine benchmark datasets from the Google Drive link provided in Autoformer. All the datasets are well pre-processed and can be used easily.

```bash
mkdir dataset
```

Please put them in the `./dataset` directory.
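For reference, the directory should end up looking roughly like this (the file names below are the usual Autoformer benchmark files and are listed as an assumption; check the `--data_path` values in the scripts for the exact names expected):

```
dataset/
├── ETTh1.csv
├── ETTh2.csv
├── ETTm1.csv
├── ETTm2.csv
├── electricity.csv
├── exchange_rate.csv
├── traffic.csv
├── weather.csv
└── national_illness.csv
```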
- In `scripts/`, we provide the model implementations of DLinear/Autoformer/Informer/Transformer.
- In `FEDformer/scripts/`, we provide the FEDformer implementation.
- In `Pyraformer/scripts/`, we provide the Pyraformer implementation.

For example:
To train LTSF-Linear on the Exchange-Rate dataset, you can use the script `scripts/EXP-LongForecasting/Linear/exchange_rate.sh`:

```bash
sh scripts/EXP-LongForecasting/Linear/exchange_rate.sh
```
This trains DLinear by default; the results will be written to `logs/LongForecasting`. You can specify the model name (Linear, DLinear, or NLinear) in the script.
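Each script is a thin wrapper around the training entry point `run_longExp.py`. A typical invocation looks roughly like the following (the flag values here are illustrative, not the exact settings in the script; open the `.sh` file for the real configuration):

```bash
python -u run_longExp.py \
  --is_training 1 \
  --root_path ./dataset/ \
  --data_path exchange_rate.csv \
  --model DLinear \
  --data custom \
  --seq_len 336 \
  --pred_len 96
```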
All scripts for running LTSF-Linear on the long-term forecasting task are in `scripts/EXP-LongForecasting/Linear/`; you can run them in a similar way. The default look-back window in the scripts is 336; LTSF-Linear generally achieves better results with a longer look-back window, as discussed in the paper.
The scripts for the look-back window study and long-term forecasting with FEDformer and Pyraformer are in `FEDformer/scripts` and `Pyraformer/scripts`, respectively. To run them, you need to first `cd FEDformer` or `cd Pyraformer`. Then, you can use `sh` to run them in a similar way. Logs will be stored in `logs/`.
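For example (the script name is a placeholder; substitute any script from the corresponding directory):

```bash
cd FEDformer
sh scripts/<experiment>.sh    # e.g. a long-forecasting or look-back-window script
cd ..
```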
Each experiment in `scripts/EXP-LongForecasting/Linear/` takes 5-20 minutes. For the other Transformer scripts, since we put all related experiments in one script file, running them directly will take 8 hours to 1 day. You can keep the experiments you are interested in and comment out the others.
As shown in our paper, the weights of LTSF-Linear can reveal some characteristics of the data, e.g., its periodicity. As an example, we provide the weight visualization of DLinear in `weight_plot.py`. To run the visualization, you need to input the model path (model_name) of DLinear (the model directory in `./checkpoint` by default). To obtain smooth and clear patterns, you can use the initialization we provide in the file of the linear models.
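The visualization itself is simple: each linear layer stores a [pred_len, seq_len] weight matrix, and rendering it as a heatmap exposes periodic structure in the data. A minimal sketch of this idea (the checkpoint path and key filter below are assumptions; `weight_plot.py` in this repo is the authoritative version):

```python
import torch
import matplotlib.pyplot as plt

# Assumed checkpoint layout; point this at your trained DLinear model.
state = torch.load('./checkpoint/<model_name>/checkpoint.pth', map_location='cpu')

for name, w in state.items():
    if 'weight' in name and w.dim() == 2:       # [pred_len, seq_len] matrices
        plt.figure()
        plt.imshow(w.numpy(), aspect='auto', cmap='viridis')
        plt.xlabel('look-back window position')
        plt.ylabel('forecast step')
        plt.title(name)
        plt.savefig(f"{name.replace('.', '_')}.png")
        plt.close()
```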
If you find this repository useful for your work, please consider citing it as follows:
```bibtex
@inproceedings{Zeng2022AreTE,
  title={Are Transformers Effective for Time Series Forecasting?},
  author={Ailing Zeng and Muxi Chen and Lei Zhang and Qiang Xu},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  year={2023}
}
```
Please remember to cite all the datasets and compared methods if you use them in your experiments.