"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

February 01, 2023

Research paper Read - Are Transformers Effective for Time Series Forecasting?


  • Time series are ubiquitous in today’s data-driven world.
  • Given historical data, time series forecasting (TSF) is a long-standing task with a wide range of applications.
  • Long-term time series forecasting (LTSF)
  • The paper introduces simple one-layer linear models, named LTSF-Linear, as a baseline for comparison.
  • The main power of Transformers comes from their multi-head self-attention mechanism.
  • Extensive experiments on nine widely-used benchmark datasets that cover various real-life applications: traffic, energy, economics, weather, and disease predictions.
  • LTSF-Linear outperforms existing complex Transformer-based models in all cases, and often by a large margin (20% ∼ 50%).
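To make the baseline concrete, here is a minimal sketch of what a one-layer linear forecaster looks like: a single weight matrix mapping a lookback window of L past values directly to T future values. The toy data, window sizes, and least-squares fit below are illustrative assumptions, not the paper's exact setup (the authors train with gradient descent on benchmark datasets).

```python
import numpy as np

rng = np.random.default_rng(0)

# toy univariate series: linear trend + daily-style seasonality + noise (hypothetical data)
t = np.arange(600, dtype=float)
series = 0.05 * t + np.sin(2 * np.pi * t / 24) + 0.1 * rng.standard_normal(t.size)

L, T = 96, 24  # lookback window length and forecast horizon (illustrative choices)

# build (window -> horizon) training pairs by sliding over the series
X = np.stack([series[i:i + L] for i in range(t.size - L - T)])
Y = np.stack([series[i + L:i + L + T] for i in range(t.size - L - T)])

# "model" = one linear layer: W maps the L past values directly to the T future values
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# predict the last T steps from the preceding L observations
forecast = series[-L - T:-T] @ W
```

Despite having no attention, no recurrence, and no nonlinearity, a map of this shape is exactly the kind of baseline the paper shows can beat far heavier Transformer models on the LTSF benchmarks.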

  • The difference between the original sequence and the trend component is regarded as the seasonal component
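The trend/seasonal split described above can be sketched in a few lines: a moving average extracts the trend, and the residual is taken as the seasonal component. The kernel size and edge-padding scheme here are illustrative assumptions (a padded moving average of this style appears in decomposition-based models such as Autoformer and DLinear).

```python
import numpy as np

def decompose(series, kernel_size=25):
    """Split a 1-D series into (trend, seasonal) via a moving average."""
    # pad the edges by repeating the boundary values so the trend
    # has the same length as the input
    pad = (kernel_size - 1) // 2
    padded = np.concatenate([
        np.repeat(series[0], pad),
        series,
        np.repeat(series[-1], kernel_size - 1 - pad),
    ])
    # moving-average trend component
    trend = np.convolve(padded, np.ones(kernel_size) / kernel_size, mode="valid")
    # the remainder is regarded as the seasonal component
    seasonal = series - trend
    return trend, seasonal
```

By construction, `trend + seasonal` reconstructs the original series exactly, which is what makes this decomposition safe to apply before modeling each component separately.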

  • Transformer-based methods: FEDformer [31], Autoformer [28], Informer [30], Pyraformer [18], and LogTrans [16].
  • LTSF-Linear outperforms FEDformer in most cases, by 20% ∼ 50% improvements on multivariate forecasting.
  • FEDformer employs classical time series analysis techniques such as frequency processing, which brings in a time series inductive bias and improves temporal feature extraction.

  • Recurrent neural networks (RNNs) based methods (e.g., [21]) summarize the past information compactly in internal memory states and recursively update themselves for forecasting.
  • Convolutional neural networks (CNNs) based methods (e.g., [3]), wherein convolutional filters are used to capture local temporal features.

Keep Exploring!!!
