This is the official repository for "Unveiling the Potential of Text in High-Dimensional Time Series Forecasting", accepted by the NeurIPS 2024 Workshop on Time Series in the Age of Large Models (TSALM). [Paper]
🌟 If you find this work helpful, please consider starring this repository and citing our research:
@inproceedings{xin2024textfusionhts,
author = {Zhou, Xin and Wang, Weiqing and Qu, Shilin and Zhang, Zhiqiang and Bergmeir, Christoph},
title = {Unveiling the Potential of Text in High-Dimensional Time Series Forecasting},
booktitle = {Proceedings of the NeurIPS 2024 Workshop on Time Series in the Age of Large Models (TSALM)},
year = {2024}
}
Please download the pre-processed Wiki-People and News datasets from [Google Drive], then place the downloaded contents under the corresponding folders in /datasets
- Clone this repository
git clone [email protected]:xinzzzhou/TextFusionHTS.git
cd TextFusionHTS
- Configure the environment
conda create --name textfusionhts python=3.8
conda activate textfusionhts
pip3 install -r requirements.txt
pip3 install torch==1.12.1+cu113 torchvision==0.13.1+cu113 torchaudio==0.12.1 --extra-index-url https://download.pytorch.org/whl/cu113
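Optionally, the following quick check (not part of the original setup, just a convenience) confirms that the pinned CUDA 11.3 build of PyTorch is installed and can see a GPU:

```python
# Optional sanity check: verify the pinned PyTorch build and GPU visibility.
import torch

print("torch version:", torch.__version__)        # expected: 1.12.1+cu113
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
```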
- Download the datasets and place them under the corresponding folders in /datasets
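As a convenience, here is a minimal sketch that checks the data landed where expected; the subfolder names wiki and news are assumptions and should be replaced with the folder names the repository's data loaders actually use:

```python
# Hypothetical check that the downloaded data sits under ./datasets.
# The subfolder names ("wiki", "news") are assumptions -- adjust them to the
# folders expected by the data loaders in this repository.
from pathlib import Path

for name in ["wiki", "news"]:
    folder = Path("datasets") / name
    print(f"{folder}: {'found' if folder.is_dir() else 'missing'}")
```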
- Embedding learning. Note that this step requires logging in to Hugging Face; replace the xxxxxxxxxx placeholder with your own Hugging Face token.
python emb_learning.py
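For reference, a minimal sketch of how a Hugging Face token can be supplied programmatically via huggingface_hub; emb_learning.py may instead read the token from its own configuration, so treat this only as an illustration:

```python
# Illustration only: authenticate with Hugging Face before loading gated models.
# emb_learning.py may read the token elsewhere; replace the placeholder with
# your own access token.
from huggingface_hub import login

login(token="xxxxxxxxxx")  # your Hugging Face token
```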
- Train and test the model. For demonstration purposes, we provide two main.py files under the root folder.
python main_wiki.py
drop_last affects the number of data windows at the end of the split. To achieve a fair comparison, we did not use drop_last for testing.
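The toy PyTorch sketch below (illustrative only, not the repository's data pipeline) shows how drop_last changes the number of batches when the dataset size is not a multiple of the batch size:

```python
# Toy example: 10 samples, batch size 3.
# drop_last=True discards the final incomplete batch; drop_last=False keeps it,
# so every window is evaluated.
import torch
from torch.utils.data import DataLoader, TensorDataset

data = TensorDataset(torch.arange(10).float().unsqueeze(-1))

for drop_last in (True, False):
    loader = DataLoader(data, batch_size=3, drop_last=drop_last)
    print(f"drop_last={drop_last}: {len(loader)} batches")
# drop_last=True  -> 3 batches (last sample dropped)
# drop_last=False -> 4 batches (final batch has 1 sample)
```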
- We gratefully acknowledge Google for providing a travel grant that enabled attendance at NeurIPS 2024 and facilitated valuable academic exchange.
- Our implementation uses the Time-Series-Library as its code base, extensively modified for our purposes. We thank the authors for sharing their implementations and related resources.