A Comparative Analysis of Deep Neural Networks and Gradient Boosting Algorithms in Long-Term Wind Power Forecasting

  • Luka Ivanovic, Electrical Engineering Institute Nikola Tesla
  • Saša Milić, Electrical Engineering Institute Nikola Tesla
  • Živko Sokolović, Electrical Engineering Institute Nikola Tesla
  • Aleksandar Rakić, Department of Signals and Systems, School of Electrical Engineering
Keywords: Machine Learning (ML), Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), Gradient Boosting Algorithm (GBM), XGBoost, Wind Farm, Power Generation

Abstract


A vital step toward a sustainable future is the integration of renewable energy sources into the power grid. Wind energy is particularly significant because of its broad availability and minimal environmental impact. This paper presents a comparative analysis of recurrent neural network (RNN) algorithms and gradient boosting machines (GBMs) applied to time-series data for the regression problem of estimating the active power generated by a wind farm (WF). GBM algorithms combine several machine learning models (decision trees, random forests, etc.) to produce a powerful prediction model. In addition to the conventional RNN, the paper covers long short-term memory (LSTM) and gated recurrent unit (GRU) networks as cutting-edge models for time-series analysis and prediction. A comprehensive evaluation was carried out on a large wind power generation data set.
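To illustrate the gradient-boosting idea mentioned above (successively fitting weak learners to the residuals of the current ensemble), here is a minimal pure-Python sketch using depth-1 decision stumps on hypothetical wind-speed/power toy data. This is a didactic simplification, not the paper's XGBoost configuration or its actual data set:

```python
def fit_stump(x, residuals):
    """Fit a depth-1 regression tree (stump) on a single feature:
    choose the threshold minimizing the squared error of the two leaf means."""
    best = None
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    # Bind split parameters as defaults so each stump is a standalone predictor.
    return lambda xi, t=t, lm=lm, rm=rm: lm if xi <= t else rm

def gradient_boost(x, y, n_rounds=50, lr=0.1):
    """Gradient boosting for squared loss: each new stump fits the residuals
    left by the ensemble built so far, scaled by a learning rate."""
    base = sum(y) / len(y)          # initialize with the mean prediction
    pred = [base] * len(y)
    stumps = []
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        pred = [pi + lr * stump(xi) for pi, xi in zip(pred, x)]
    return lambda xi: base + lr * sum(s(xi) for s in stumps)

# Hypothetical toy data: wind speed (m/s) -> active power (MW),
# mimicking the step-like shape of a turbine power curve.
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
y = [0.0, 0.1, 0.3, 1.0, 2.2, 3.9, 4.0, 4.1]
model = gradient_boost(x, y)
```

Production GBM libraries such as XGBoost add regularization, deeper trees, and efficient split finding, but the residual-fitting loop above is the core mechanism the abstract refers to.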

References

Z. Tian, "A State-Of-The-Art Review on Wind Power Deterministic Prediction," Wind Engineering, vol. 45, pp. 1374–1392, 2021.

I. H. Sarker, "Deep Learning: A Comprehensive Overview on Techniques, Taxonomy, Applications and Research Directions," SN Computer Science, vol. 2, 420, 2021.

A. Zhang, Z. C. Lipton, M. Li, A. J. Smola, Dive into Deep Learning. 2023. [Online]. Available: https://d2l.ai/

P. J. Werbos, "Backpropagation through time: what it does and how to do it," Proceedings of the IEEE, vol. 78, no. 10, pp. 1550-1560, 1990.

I. Danish, A. Abbas, "A Deep Dive into Neural Networks: Architectures, Training Techniques, and Practical Implementations," Journal of Environmental Sciences and Technology, vol. 2, pp. 61–71, 2023. https://doi.org/10.13140/RG.2.2.14866.84162

G. Van Houdt, C. Mosquera, G. Nápoles, "A Review on the Long Short-Term Memory Model," Artificial Intelligence Review, vol. 53, pp. 5929−5955, 2020.

Y. Freund, R. E. Schapire, "A Short Introduction to Boosting," Journal of Japanese Society for Artificial Intelligence, vol. 14, pp. 771−780, 1999.

J. H. Friedman, "Greedy function approximation: a Gradient Boosting machine," The Annals of Statistics, vol. 29, pp. 1189–1232, 2001.

T. Chen, C. Guestrin, "XGBoost: A scalable tree boosting system," in Proc. 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '16), New York, NY, USA, Aug. 2016, pp. 785–794.

A. Mobarak, A. Y. Owda, M. Owda, "Electrical Load Forecasting Using LSTM, GRU, and RNN Algorithms," Energies, vol. 16, 2283, 2023.

M. Chen et al., "XGBoost-Based Algorithm Interpretation and Application on Post-Fault Transient Stability Status Prediction of Power System," IEEE Access, vol. 7, pp. 13149–13158, 2019.

Data available: https://www.kaggle.com/datasets/mubashirrahim/wind-power-generation-data-forecasting/data

J. Chung et al., "Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling," 2014. [Online]. Available: https://doi.org/10.48550/arXiv.1412.3555

Published
2024/10/17
Section
Original Scientific Paper