TimeMixer-ME: A Dual-Memory Enhanced Architecture for Time Series Forecasting
DOI: https://doi.org/10.54097/g0504662
Keywords: Dual Memory-enhanced Architecture, Cognitive Science, Adaptive Multi-scale Modeling, Short-term Prediction
Abstract
Time series forecasting, a research field with broad practical applications, faces a persistent challenge: improving a model's overall predictive capability while preserving short-term prediction accuracy. To address this challenge, this work proposes TimeMixer-ME, a dual-memory enhanced architecture that integrates the dual memory system theory from cognitive science with deep learning techniques to achieve adaptive multi-scale modeling of time series. In experiments on multiple weather datasets from Southern Xinjiang, TimeMixer-ME delivers strong predictive performance: compared with the baseline TimeMixer model, it reduces MSE by 2%-2.4% and MAE by 3.7%-4.2%. These results demonstrate the model's effectiveness in short-term prediction tasks and offer a new approach to balancing long-term and short-term dependencies in time series forecasting.
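The abstract does not detail how the dual memory system is realized, so the following is purely an illustrative sketch of the general idea, not the paper's method: the function name, the exponential-decay memories, and the error-weighted gating below are all assumptions. It pairs a fast-updating short-term memory with a slow-updating long-term memory and blends them adaptively for one-step-ahead forecasting.

```python
import numpy as np

def dual_memory_forecast(series, alpha_short=0.6, alpha_long=0.05):
    """One-step-ahead forecasts from a fast (short-term) and a slow
    (long-term) exponential memory, blended by their recent accuracy.
    Illustrative only; not the TimeMixer-ME architecture itself."""
    short = long_ = series[0]
    err_s = err_l = 1e-8  # running absolute error of each memory
    preds = []
    for x in series[1:]:
        # gate: weight each memory by the inverse of its running error
        w_s = err_l / (err_s + err_l)
        preds.append(w_s * short + (1 - w_s) * long_)
        # update running errors, then both memories
        err_s = 0.9 * err_s + 0.1 * abs(x - short)
        err_l = 0.9 * err_l + 0.1 * abs(x - long_)
        short = alpha_short * x + (1 - alpha_short) * short
        long_ = alpha_long * x + (1 - alpha_long) * long_
    return np.array(preds)
```

The gating step is where "adaptive" enters: whichever memory has tracked the recent signal better dominates the forecast, so the model leans on short-term dynamics when the series moves quickly and on the slow memory when it is stable.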
References
[1] Box, G. E. P., Gwilym M. Jenkins, Gregory C. Reinsel and Greta M. Ljung. “Time Series Analysis: Forecasting and Control.” The Statistician 27 (1978): 265-265.
[2] Hyndman, Rob J and George Athanasopoulos. “Forecasting: principles and practice.” (2013).
[3] Hochreiter, Sepp and Jürgen Schmidhuber. “Long Short-Term Memory.” Neural Computation 9 (1997): 1735-1780.
[4] Vaswani, Ashish, Noam M. Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser and Illia Polosukhin. “Attention is All you Need.” Neural Information Processing Systems (2017).
[5] Zhou, Tian, Ziqing Ma, Qingsong Wen, Xue Wang, Liang Sun and Rong Jin. “FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting.” International Conference on Machine Learning (2022).
[6] Zeng, Ailing, Mu-Hwa Chen, L. Zhang and Qiang Xu. “Are Transformers Effective for Time Series Forecasting?” AAAI Conference on Artificial Intelligence (2022).
[7] Liu, Yong, Haixu Wu, Jianmin Wang and Mingsheng Long. “Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting.” Neural Information Processing Systems (2022).
[8] Wu, Haixu, Teng Hu, Yong Liu, Hang Zhou, Jianmin Wang and Mingsheng Long. “TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis.” arXiv preprint arXiv:2210.02186 (2022).
[9] Shabani, Amin, Amir Hossein Sayyad Abdi, Li Meng and Tristan Sylvain. “Scaleformer: Iterative Multi-scale Refining Transformers for Time Series Forecasting.” arXiv preprint arXiv:2206.04038 (2022).
[10] Lin, Lequan, Zhengkun Li, Ruikun Li, Xuliang Li and Junbin Gao. “Diffusion models for time-series applications: a survey.” Frontiers of Information Technology & Electronic Engineering 25 (2023): 19-41.
[11] Liu, Yong, Chenyu Li, Jianmin Wang and Mingsheng Long. “Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors.” arXiv preprint arXiv:2305.18803 (2023).
[12] Lee, Hyunwoo, Seungmin Jin, Hyeshin Chu, Hong Sik Lim and Sungahn Ko. “Learning to Remember Patterns: Pattern Matching Memory Networks for Traffic Forecasting.” arXiv preprint arXiv:2110.10380 (2021).
[13] Liu, Xiangyue, Xinqi Lyu, Xiangchi Zhang, Jianliang Gao and Jiamin Chen. “Memory Augmented Graph Learning Networks for Multivariate Time Series Forecasting.” Proceedings of the 31st ACM International Conference on Information & Knowledge Management (2022).
[14] Zhang, Yunhao and Junchi Yan. “Crossformer: Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting.” International Conference on Learning Representations (2023).
[15] Wang, Shiyu, Haixu Wu, Xiao Long Shi, Tengge Hu, Huakun Luo, Lintao Ma, James Y. Zhang and Jun Zhou. “TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting.” arXiv preprint arXiv:2405.14616 (2024).
[16] Xu, Luwen, Jiwei Qin, Dezhi Sun, Yuanyuan Liao and Jiong Zheng. “PFformer: A Time-Series Forecasting Model for Short-Term Precipitation Forecasting.” IEEE Access 12 (2024): 130948-130961.
[17] Woo, Gerald, Chenghao Liu, Akshat Kumar, Caiming Xiong, Silvio Savarese and Doyen Sahoo. “Unified Training of Universal Time Series Forecasting Transformers.” arXiv preprint arXiv:2402.02592 (2024).
License
Copyright (c) 2025 Frontiers in Computing and Intelligent Systems

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

