An Approach to the Optimization of Gated Recurrent Units with a Greedy Algorithm
Abstract
This study focuses on enhancing the performance of a stacked Gated Recurrent Unit (GRU) model for time series processing, specifically stock price prediction. Its main contribution is the integration of a greedy algorithm for optimizing hyperparameters such as the look-back period, number of epochs, batch size, and number of units in each GRU layer. The model is applied to historical stock data from Apple Inc., demonstrating its effectiveness in predicting stock prices. The methodology proceeds through a sequence of steps: data loading, preprocessing, dataset splitting, model construction, and evaluation. The greedy algorithm iteratively adjusts the hyperparameters to minimize the Root Mean Squared Error (RMSE), thereby refining the model's predictive accuracy. The results show that the integrated greedy algorithm significantly improves the model's accuracy in predicting stock prices, indicating its potential in other scenarios that require precise time series forecasting.
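As a minimal illustration of the procedure the abstract outlines, the sketch below implements a greedy, one-hyperparameter-at-a-time search over the look-back period, number of epochs, batch size, and GRU units, keeping whichever candidate value lowers the validation RMSE, $\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(y_i - \hat{y}_i)^2}$. This is a hypothetical reconstruction, not the paper's actual code: the helper names (make_windows, rmse_for, greedy_search), the candidate grids, and the synthetic sine series standing in for min-max-scaled Apple closing prices are all assumptions introduced for the example.

```python
# Hypothetical sketch: greedy hyperparameter search for a stacked GRU,
# scored by validation RMSE. Not the paper's original implementation.
import numpy as np
import tensorflow as tf


def make_windows(series, look_back):
    """Slice a 1-D series into (samples, look_back, 1) inputs and next-step targets."""
    X, y = [], []
    for i in range(len(series) - look_back):
        X.append(series[i:i + look_back])
        y.append(series[i + look_back])
    return np.array(X)[..., np.newaxis], np.array(y)


def rmse_for(params, series):
    """Train a two-layer stacked GRU with the given params; return validation RMSE."""
    X, y = make_windows(series, params["look_back"])
    split = int(0.8 * len(X))  # chronological split: no shuffling of time series
    X_tr, y_tr, X_va, y_va = X[:split], y[:split], X[split:], y[split:]
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(params["look_back"], 1)),
        tf.keras.layers.GRU(params["units"], return_sequences=True),
        tf.keras.layers.GRU(params["units"]),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X_tr, y_tr, epochs=params["epochs"],
              batch_size=params["batch_size"], verbose=0)
    preds = model.predict(X_va, verbose=0).ravel()
    return float(np.sqrt(np.mean((preds - y_va) ** 2)))


def greedy_search(series, grid):
    """Tune one hyperparameter at a time, holding the current best values fixed."""
    best = {k: v[0] for k, v in grid.items()}      # start from the first candidates
    best_rmse = rmse_for(best, series)
    for name, candidates in grid.items():          # one greedy pass per hyperparameter
        for value in candidates[1:]:
            trial = dict(best, **{name: value})
            score = rmse_for(trial, series)
            if score < best_rmse:                  # keep the locally best value
                best, best_rmse = trial, score
    return best, best_rmse


if __name__ == "__main__":
    t = np.arange(2000, dtype="float32")
    series = np.sin(0.02 * t) * 0.5 + 0.5          # synthetic stand-in for scaled prices
    grid = {                                       # assumed candidate values
        "look_back": [20, 40, 60],
        "epochs": [5, 10],
        "batch_size": [16, 32, 64],
        "units": [32, 64],
    }
    params, score = greedy_search(series, grid)
    print("best params:", params, "validation RMSE:", round(score, 4))
```

Note that this greedy scheme evaluates only a handful of configurations per pass rather than the full Cartesian grid, which is what makes it cheap enough to wrap around repeated GRU training runs; the trade-off is that it can settle on a locally rather than globally optimal combination.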