Hyperparameter tuning of deep neural network in time series forecasting

Xiang, Kelly Pang Li and Syafrina, Abdul Halim and Nur Haizum, Abd Rahman (2024) Hyperparameter tuning of deep neural network in time series forecasting. Menemui Matematik (Discovering Mathematics), 46 (1). pp. 47-73. ISSN 0126-9003. (Published)

Pdf: 2024 Hyperparameter Tuning of Deep Neural Network in Time Series Forecasting.pdf
Available under License Creative Commons Attribution Non-commercial.
Abstract

A Deep Artificial Neural Network (DANN) is an Artificial Neural Network (ANN) with multiple hidden layers, making it a 'deep' form of ANN. As such, DANNs fall under the broader Deep Neural Network (DNN) category and are widely used in time series forecasting. The performance of a DANN depends heavily on the choice of hyperparameters, and selecting them at random may increase forecasting error. Hence, this study aims to optimize the performance of DANN in time series forecasting by tuning two important hyperparameters: the number of epochs and the batch size. A grid search is performed over DANNs trained with 1, 10, 20, 50 and 100 epochs and batch sizes of 32 and 64, yielding different hyperparameter combinations. The performance of each model is evaluated and compared using the mean square error (MSE) and mean absolute error (MAE). In addition, the mean absolute percentage error (MAPE) is used to compare the performance of the DANN model on high-frequency and low-frequency time series data. Both simulated and real-life data are used to assess the DANN model. The results show that more than one epoch is needed for good performance. Specifically, analysis of the simulated data consistently suggests that 10 epochs give optimal results. Similarly, 10 epochs are optimal for low-frequency real-life data, whereas high-frequency real-life data favour 100 epochs. The findings also indicate that batch sizes of 32 and 64 are each optimal in different combinations. Hence, this study suggests that hyperparameter tuning is crucial before starting the learning process: it ensures the selection of appropriate hyperparameter values, which significantly affect the learning outcome of a DNN model and lead to improved forecast accuracy.
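The grid-search procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: it substitutes a simple linear autoregressive model trained by mini-batch gradient descent for the paper's DANN, uses synthetic data, and all function names, the learning rate, and the window length are assumptions made for the example.

```python
# Hedged sketch of a grid search over epochs and batch size for one-step
# time series forecasting, scored by MSE and MAE as in the abstract.
import itertools
import numpy as np

def make_series(n=300, seed=0):
    # Simulated data: a noisy sine wave stands in for the paper's series.
    rng = np.random.default_rng(seed)
    t = np.arange(n)
    return np.sin(0.1 * t) + 0.1 * rng.standard_normal(n)

def make_windows(series, lag=5):
    # Turn the series into (lagged inputs, next value) pairs.
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    return X, y

def train(X, y, epochs, batch_size, lr=0.01, seed=0):
    # Mini-batch gradient descent on a linear model (placeholder for a DANN).
    rng = np.random.default_rng(seed)
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        idx = rng.permutation(len(X))
        for start in range(0, len(X), batch_size):
            sl = idx[start:start + batch_size]
            err = X[sl] @ w + b - y[sl]
            w -= lr * X[sl].T @ err / len(sl)
            b -= lr * err.mean()
    return w, b

def evaluate(w, b, X, y):
    # MSE and MAE on held-out data; MAPE is omitted here because the
    # synthetic series crosses zero, which makes MAPE unstable.
    pred = X @ w + b
    return float(np.mean((pred - y) ** 2)), float(np.mean(np.abs(pred - y)))

series = make_series()
X, y = make_windows(series)
split = int(0.8 * len(X))

# Grid over the epoch counts and batch sizes listed in the abstract.
results = {}
for epochs, batch in itertools.product([1, 10, 20, 50, 100], [32, 64]):
    w, b = train(X[:split], y[:split], epochs, batch)
    results[(epochs, batch)] = evaluate(w, b, X[split:], y[split:])

best = min(results, key=lambda k: results[k][0])  # lowest test MSE
```

For a real DANN the `train`/`evaluate` pair would be replaced by fitting and scoring a multi-layer network, but the outer loop, trying every (epochs, batch size) pair and keeping the combination with the lowest error, is the same.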

Item Type: Article
Uncontrolled Keywords: batch size, Deep Artificial Neural Network, epoch, forecasting, hyperparameter
Subjects: Q Science > QA Mathematics
Faculty/Division: Center for Mathematical Science
Depositing User: Miss Amelia Binti Hasan
Date Deposited: 05 Sep 2024 01:13
Last Modified: 05 Sep 2024 04:08
URI: http://umpir.ump.edu.my/id/eprint/42505