Document Type
Conference Proceeding
Publication Date
2023
Abstract
Transformers have achieved great success on the task of time series long sequence forecasting (TLSF) in recent years. However, existing research has pointed out that over-parameterized deep learning models tend to favor low frequencies and can struggle to capture high-frequency information in regression fitting tasks, a phenomenon known as spectral bias. Yet the effect of such bias on the TLSF problem, an auto-regressive problem with a long forecasting horizon, has not been explored. In this work, we take the first step toward investigating spectral bias in the TLSF task for state-of-the-art models. Specifically, we carefully examine three existing time series Transformers on the TLSF task with both synthetic and real-world data and visualize their behavior in the frequency spectrum. We show that spectral bias exists in the TLSF problem. Surprisingly, our experiments demonstrate that a model's bias behavior, whether it favors high or low frequencies, is heavily influenced by the design of the individual Transformer.
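Illustrative sketch (not from the paper): one common way to inspect a forecaster's spectral behavior, as described in the abstract, is to compare the magnitude spectrum of its predictions against the ground truth via the FFT. The series, the attenuated "prediction," and all names below are hypothetical examples for illustration only.

import numpy as np

def magnitude_spectrum(x, sample_rate=1.0):
    """Return one-sided frequency bins and magnitudes of a 1-D series."""
    x = np.asarray(x, dtype=float)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / sample_rate)
    mags = np.abs(np.fft.rfft(x)) / len(x)
    return freqs, mags

# Synthetic series: a low-frequency trend plus a high-frequency component.
t = np.arange(512)
truth = np.sin(2 * np.pi * 0.01 * t) + 0.3 * np.sin(2 * np.pi * 0.2 * t)
# A hypothetical "forecast" that attenuates the high-frequency component,
# mimicking the low-frequency preference the abstract refers to.
pred = np.sin(2 * np.pi * 0.01 * t) + 0.05 * np.sin(2 * np.pi * 0.2 * t)

f, mag_truth = magnitude_spectrum(truth)
_, mag_pred = magnitude_spectrum(pred)
for freq, mt, mp in zip(f, mag_truth, mag_pred):
    if mt > 1e-3:  # report only the dominant frequency components
        print(f"freq={freq:.3f}  truth={mt:.3f}  pred={mp:.3f}")

Comparing the two spectra in this way highlights which frequency bands a model reproduces faithfully and which it suppresses, which is the kind of evidence the study uses to characterize spectral bias.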
Recommended Citation
Ackaah-Gyasi, Kofi Nketia, Sergio Valdez, Yifeng Gao, and Li Zhang. 2023. “Exploring Spectral Bias in Time Series Long Sequence Forecasting.”