Document Type

Conference Proceeding

Publication Date



Transformers have achieved great success on the task of time series long sequence forecasting (TLSF) in recent years. However, existing research has pointed out that over-parameterized deep learning models tend to favor low frequencies and can struggle to capture high-frequency information in regression tasks, a phenomenon known as spectral bias. Yet the effect of such bias on the TLSF problem, an auto-regressive problem with a long forecasting horizon, has not been explored. In this work, we take the first step toward investigating spectral bias in the TLSF task for state-of-the-art models. Specifically, we carefully examine three existing time series Transformers on the TLSF task with both synthetic and real-world data and visualize their behavior in the frequency domain. We show that spectral bias exists in the TLSF problem. Surprisingly, our experiments demonstrate that a model's bias behavior, i.e., whether it favors high or low frequencies, is heavily influenced by the design of the individual Transformer.
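The spectrum visualization described above can be illustrated with a minimal sketch (not the paper's code; the signal, the attenuation factor, and the use of NumPy's FFT are assumptions for illustration): comparing the amplitude spectra of a ground-truth series and a forecast reveals which frequency bands a model under-fits.

```python
import numpy as np

# Toy illustration of diagnosing spectral bias in a forecast.
n = 512
t = np.arange(n)
# Ground truth: a low-frequency component (bin 4) plus a weaker
# high-frequency component (bin 60).
truth = np.sin(2 * np.pi * 4 * t / n) + 0.5 * np.sin(2 * np.pi * 60 * t / n)
# Hypothetical forecast that reproduces the low-frequency component well
# but attenuates the high-frequency one -- the signature of spectral bias
# toward low frequencies.
forecast = np.sin(2 * np.pi * 4 * t / n) + 0.1 * np.sin(2 * np.pi * 60 * t / n)

# Normalized amplitude spectra via the real FFT.
amp_truth = np.abs(np.fft.rfft(truth)) / n
amp_fore = np.abs(np.fft.rfft(forecast)) / n

# Per-frequency amplitude error: a peak at a high bin indicates the
# model is missing high-frequency content.
err = amp_truth - amp_fore
print(int(np.argmax(err)))  # -> 60, the under-fitted high-frequency bin
```

A model biased toward high frequencies would instead show the largest error at the low-frequency bins, which is how the per-model bias behavior can be compared.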
