Multi-Step Vector Output Prediction of Time Series Using EMA LSTM
DOI: https://doi.org/10.15575/join.v8i1.1037
Keywords: Deep learning, EMA, LSTM, Multi-step vector output, Time series prediction
Abstract
This paper proposes Exponential Moving Average Long Short-Term Memory (EMA LSTM), a method for multi-step vector output prediction of time series data using deep learning. The method combines an LSTM network with exponential moving average (EMA) smoothing to reduce noise in the data and improve prediction accuracy. The performance of EMA LSTM is compared against commonly used deep learning models, including LSTM, GRU, RNN, and CNN, and the results are evaluated using statistical tests. The dataset consists of several years of daily stock market prices; the models take the previous 60, 90, or 120 days as input and predict the next 20 or 30 days. The results show that EMA LSTM outperforms the other models, achieving lower RMSE and MAPE values. The study has implications for real-world applications such as stock market forecasting and climate prediction, and highlights the importance of careful data preprocessing for improving the performance of deep learning models.
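To make the pipeline described in the abstract concrete, the sketch below (Python with TensorFlow/Keras) illustrates the general idea: EMA smoothing of the price series, sliding-window construction of 60-day inputs with 20-day vector targets, and an LSTM whose dense output layer predicts the whole horizon in one shot. The smoothing factor, layer sizes, and training settings are assumptions for illustration only, not the authors' published configuration.

```python
# Minimal sketch (assumptions): the paper does not publish code, so the smoothing
# factor, window slicing, layer sizes, and training settings below are illustrative.
import numpy as np
import tensorflow as tf


def ema_smooth(series: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """Exponential moving average: s_t = alpha * x_t + (1 - alpha) * s_{t-1}."""
    smoothed = np.empty(len(series), dtype=float)
    smoothed[0] = series[0]
    for t in range(1, len(series)):
        smoothed[t] = alpha * series[t] + (1 - alpha) * smoothed[t - 1]
    return smoothed


def make_windows(series: np.ndarray, lookback: int, horizon: int):
    """Slice a 1-D series into (lookback, 1) inputs and (horizon,) vector targets."""
    X, y = [], []
    for i in range(len(series) - lookback - horizon + 1):
        X.append(series[i : i + lookback])
        y.append(series[i + lookback : i + lookback + horizon])
    return np.asarray(X)[..., np.newaxis], np.asarray(y)


# One configuration from the abstract: 60 past days in, 20 future days out.
lookback, horizon = 60, 20
prices = np.cumsum(np.random.randn(1000)) + 100.0  # placeholder for real closing prices
X, y = make_windows(ema_smooth(prices), lookback, horizon)

# Multi-step *vector output*: a single Dense(horizon) head emits all future steps
# at once, rather than predicting one step and feeding it back recursively.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(lookback, 1)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(horizon),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
forecast = model.predict(X[-1:])  # shape (1, horizon): the next 20 days
```

In practice the smoothed series would typically also be scaled (e.g., min-max normalized) before training, and RMSE and MAPE would be computed on a held-out test split.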
License
Copyright (c) 2023 Mohammad Diqi, Ahmad Sahal, Farida Nur Aini

This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.