PSO-Based Hyperparameter Tuning of CNN for Multivariate Time-Series Analysis


  • Agung Bella Putra Utama Department of Electrical Engineering, Universitas Negeri Malang, Indonesia
  • Aji Prasetya Wibawa Department of Electrical Engineering, Universitas Negeri Malang, Indonesia
  • Muladi Muladi Department of Electrical Engineering, Universitas Negeri Malang, Indonesia
  • Andrew Nafalski UniSA Education Futures, School of Engineering, University of South Australia, Australia



Keywords: CNN, Hyperparameter tuning, Multivariate time-series, PSO


Convolutional Neural Network (CNN) is an effective Deep Learning (DL) algorithm that solves various image identification problems. The use of CNN for time-series data analysis is emerging. CNN learns filters, representations of repeated patterns in the series, and uses them to forecast future values. Network performance, however, depends on the hyperparameter settings. This study optimizes the CNN architecture through hyperparameter tuning with Particle Swarm Optimization (PSO), termed PSO-CNN. The proposed method was evaluated on multivariate time-series data from an electronic journal visitor dataset. The main difference between applying CNN to images and to time series lies in the input given to the model: here it processes sequences of numbers rather than pixels. The proposed method achieved the lowest RMSE (1.386) with 178 neurons in the fully connected layer and 2 hidden layers. The experimental results show that PSO-CNN generates an architecture with better performance than an ordinary CNN.
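The tuning loop described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: a basic PSO searches a two-dimensional hyperparameter space (fully connected neurons, number of hidden layers), and `fake_rmse` is a hypothetical stand-in for the validation RMSE one would obtain by actually training a CNN at each candidate point; its minimum is placed near the paper's reported optimum (178 neurons, 2 layers, RMSE 1.386) only for illustration. All function and parameter names here are assumptions.

```python
import random

def pso_minimize(objective, bounds, n_particles=20, n_iters=60,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `objective` over the box `bounds` with a basic PSO.

    bounds: list of (lo, hi) per hyperparameter dimension.
    Returns (best_position, best_value).
    """
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialize particle positions randomly inside the box, velocities at zero.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Standard velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move the particle and clamp it back inside the search box.
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Hypothetical stand-in for "train the CNN, return validation RMSE":
# a smooth bowl whose minimum sits near the reported optimum.
def fake_rmse(hp):
    neurons, layers = hp
    return 1.386 + ((neurons - 178) / 100) ** 2 + (layers - 2) ** 2

best, rmse = pso_minimize(fake_rmse, bounds=[(16, 512), (1, 5)])
```

In a real setting, `fake_rmse` would be replaced by a function that builds and trains a CNN with the candidate hyperparameters (rounding continuous positions to integer neuron and layer counts) and returns its validation RMSE, which is the expensive step PSO is trying to call as few times as possible.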



