## Detailed Introduction
SeqFusion is a novel framework that collects and fuses diverse pre-trained models (PTMs) sequentially for zero-shot forecasting, without requiring diverse pre-training data.
Based on the temporal characteristics of the target time series, SeqFusion selects the most suitable PTMs, performs sequential predictions, and fuses all the predictions while using minimal data to protect privacy. Experiments demonstrate that SeqFusion achieves competitive accuracy in zero-shot forecasting compared to state-of-the-art methods.
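At a high level, the select-predict-fuse loop can be sketched as follows. This is only an illustration: the model zoo, the embedding-distance selection criterion, and the toy forecasters below are all hypothetical stand-ins; the actual selection metric and fusion rule are described in the paper.

```python
import numpy as np

def select_ptm(model_zoo, context):
    """Pick the PTM whose stored embedding is closest to a simple
    feature vector of the context window (hypothetical criterion)."""
    feats = np.array([context.mean(), context.std()])
    scores = {name: float(np.linalg.norm(feats - emb))
              for name, (emb, _) in model_zoo.items()}
    return min(scores, key=scores.get)

def seqfusion_forecast(model_zoo, context, horizon, window=8, step=4):
    """Sequentially predict `horizon` steps: at each step select the most
    suitable PTM for the current context, predict `step` values, append
    them to the history, and repeat."""
    history = list(context)
    preds = []
    while len(preds) < horizon:
        ctx = np.array(history[-window:])
        _, model = model_zoo[select_ptm(model_zoo, ctx)]
        chunk = list(model(ctx, step))
        preds.extend(chunk)
        history.extend(chunk)
    return np.array(preds[:horizon])

# Two toy "pre-trained models": a last-value repeater and a mean repeater.
zoo = {
    "ptm_last": (np.array([1.0, 0.0]), lambda ctx, h: [ctx[-1]] * h),
    "ptm_mean": (np.array([0.0, 1.0]), lambda ctx, h: [ctx.mean()] * h),
}
forecast = seqfusion_forecast(zoo, np.arange(16, dtype=float), horizon=6)
print(len(forecast))  # 6
```

The key design point the sketch mirrors is that only lightweight PTMs and a short context window are needed at inference time; no target-domain training data is touched.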
In this repo, you will find:
- SOTA zero-shot forecasting performance achieved with a few lightweight pre-trained models.
- An implementation of pre-trained model selection for time-series forecasting, with user-friendly inference.
- Freedom to customize the application scenarios of SeqFusion!
## Table of Contents
- Zero-shot Forecasting Performance
- Code Implementation
- Reproduce for Other Baseline Methods
- Contributing
## Zero-shot Forecasting Performance

Performance comparison (MSE) of three kinds of baseline approaches and SeqFusion on 7 multivariate datasets. The best-performing results are denoted in bold.
| Methods | Resource Type | ECL | ETTh1 | ETTh2 | Exchange | Illness | Traffic | Weather | Memory Storage (MB) (Data + Model) |
|---|---|---|---|---|---|---|---|---|---|
| Last | - | 0.7360 | 0.7640 | 0.2639 | 0.0217 | 4.7867 | 2.2498 | 1.4799 | - |
| Mean | - | 0.6755 | 0.6134 | 0.30376 | 0.0376 | 4.8981 | 1.3565 | 1.4063 | - |
| SeasonalNaive | - | 0.6091 | 0.8539 | 0.3315 | 0.0272 | 6.0760 | 1.2227 | 1.6105 | - |
| Arima | In-Task Data | 3.6648 | 0.6389 | 1.0048 | 10.1624 | 5.8628 | 2.4790 | 3.1264 | 0.01 + 30.27 |
| Prophet | In-Task Data | 10.2358 | 6.1366 | 10.1677 | 229.8594 | 9.1147 | 3.8610 | 2.9049 | 0.01 + 3.270 |
| Transformer | In-Task Data | 1.3429 | 0.6875 | 0.9457 | 1.5532 | 5.0526 | 1.9362 | 2.1727 | 0.01 + 64.06 |
| Autoformer | In-Task Data | 0.8861 | 0.8519 | 0.5835 | 0.1950 | 4.5547 | 1.4316 | 1.7660 | 0.01 + 65.88 |
| FEDformer | In-Task Data | 0.9156 | 0.7561 | 0.4061 | 0.2478 | 4.6087 | 1.5551 | 1.6792 | 0.01 + 66.93 |
| Informer | In-Task Data | 1.3743 | 0.7870 | 0.8497 | 1.5969 | 5.3082 | 2.1612 | 2.3070 | 0.01 + 67.07 |
| DLinear | In-Task Data | 0.6942 | 0.6732 | 0.3470 | 0.0559 | 3.5083 | 1.3655 | 1.4644 | 0.01 + 0.55 |
| PatchTST | In-Task Data | 0.6184 | 0.7333 | 0.4006 | 0.0544 | 3.9034 | 1.1661 | 1.4877 | 0.01 + 27.17 |
| iTransformer | In-Task Data | 0.6067 | 0.7183 | 0.3345 | 0.0315 | 3.5232 | 1.1306 | 1.5676 | 0.01 + 26.15 |
| Meta-N-BEATS | Pre-Train Data | 0.7576 | 0.7175 | 0.0469 | 4.6405 | 2.2361 | 1.4648 | 1.4488 | 1.70 + 95.85 |
| GPT4TS | Pre-Train Data | 0.7548 | 0.6961 | 0.3397 | 0.0226 | 3.7603 | 1.4777 | 1.4777 | 1.70 + 74.83 |
| ForecastPFN | Pre-Train Data | 0.9511 | 1.1851 | 0.5144 | 0.0579 | 4.8880 | 1.7894 | 1.8770 | * + 23.50 |
| SeqFusion | PTMs | 0.6029 | 0.6001 | 0.2450 | 0.0217 | 3.4956 | 1.4889 | 1.4488 | 0.02 + 23.10 |
Table: Performance comparison of various zero-shot forecasting methods across different datasets, including classical statistical methods, deep learning models trained on 50 in-task timesteps of data, and zero-shot methods requiring pre-training data.
More results can be found in the paper.
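All numbers in the table are MSE. As a quick reference, the sketch below shows how MSE is computed for a forecast, using the `Last` baseline from the table (which simply repeats the final observed value); the series values are made up for illustration.

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error, the metric reported in the table above."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean((y_true - y_pred) ** 2))

# Toy example: the 'Last' baseline repeats the final observed value.
history = [0.8, 1.1, 0.9, 1.0]
future = [1.2, 0.9, 1.1]
last_forecast = [history[-1]] * len(future)
print(round(mse(future, last_forecast), 4))  # prints 0.02
```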
## Code Implementation

- Set up the environment (please make sure the torch version is compatible with your GPU):

  ```shell
  git clone https://github.com/Tingji2419/SeqFusion.git
  cd SeqFusion
  pip install -r requirements.txt
  ```
- Download the data and pre-trained model zoo for SeqFusion here, then unzip them directly.
- Run the code for SeqFusion:

  ```shell
  bash command_benchmark1.sh
  ```
- Collect the SeqFusion results:

  ```shell
  bash check_result.sh
  ```

  The results will be displayed on the screen.
## Reproduce for Other Baseline Methods

Coming soon.
## Contributing

SeqFusion is under active development, and we warmly welcome any contributions that enhance its capabilities. Whether you have insights on pre-trained models, data, or innovative ranking methods, we invite you to join us in making SeqFusion even better.