Open Access
ARTICLE
Stability Prediction in Smart Grid Using PSO Optimized XGBoost Algorithm with Dynamic Inertia Weight Updation
Department of Software Engineering College of Computer Engineering and Sciences in Al-Kharj, Prince Sattam Bin Abdulaziz University, Al-Kharj, 11942, Saudi Arabia
* Corresponding Author: Adel Binbusayyis. Email:
(This article belongs to the Special Issue: Advanced Artificial Intelligence and Machine Learning Methods Applied to Energy Systems)
Computer Modeling in Engineering & Sciences 2025, 142(1), 909-931. https://doi.org/10.32604/cmes.2024.058202
Received 06 September 2024; Accepted 06 November 2024; Issue published 17 December 2024
Abstract
Prediction of stability in the SG (Smart Grid) is essential for maintaining a consistent and reliable power supply in grid infrastructure. Analyzing fluctuations in the power generation and consumption patterns of smart cities assists in effectively managing a continuous power supply in the grid. It also helps avert overloading and permits effective energy storage. Although many traditional techniques have predicted the consumption rate to preserve stability, prediction accuracy still requires enhancement with minimized loss. To overcome the shortcomings of existing studies, this paper predicts stability from the smart grid stability prediction dataset using machine learning algorithms. To accomplish this, pre-processing is performed first to handle missing values, since mishandled missing values produce biased models, and feature scaling is applied to normalize the independent data features. The pre-processed data are then split for training and testing. Following that, regression is performed using a Modified PSO (Particle Swarm Optimization)-optimized XGBoost technique with a dynamic inertia weight update, which analyzes variables such as gamma (G), reaction time (tau1–tau4), and power balance (p1–p4) to provide effective future stability prediction in the SG. Since PSO attains an optimal solution by adjusting particle positions through dynamic inertia weights, it is integrated with XGBoost, which is chosen for its scalability and fast computation. The hyperparameters of XGBoost are fine-tuned during training to achieve promising prediction outcomes. Regression results are measured through evaluation metrics such as MSE (Mean Square Error) of 0.011312781, MAE (Mean Absolute Error) of 0.008596322, RMSE (Root Mean Square Error) of 0.010636156, and MAPE (Mean Absolute Percentage Error) of 0.0052, which demonstrate the efficacy of the system.
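The abstract does not give the exact search space or objective used in the paper; as an illustration only, the dynamic-inertia-weight PSO update it describes can be sketched as below. All parameter values (w_max, w_min, c1, c2, swarm size) and the toy objective are assumptions; in the paper's setting the objective would be the validation error of an XGBoost model evaluated at the candidate hyperparameters.

```python
import numpy as np

def pso_minimize(objective, bounds, n_particles=20, n_iter=50,
                 w_max=0.9, w_min=0.4, c1=2.0, c2=2.0, seed=0):
    """PSO with a linearly decreasing (dynamic) inertia weight.

    The inertia weight is updated every iteration:
        w = w_max - (w_max - w_min) * t / n_iter
    so the swarm explores early (large w) and exploits late (small w).
    """
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    dim = len(bounds)

    pos = rng.uniform(lo, hi, size=(n_particles, dim))   # particle positions
    vel = np.zeros((n_particles, dim))                   # particle velocities
    pbest = pos.copy()                                   # personal bests
    pbest_val = np.array([objective(p) for p in pos])
    g = pbest[pbest_val.argmin()].copy()                 # global best
    g_val = pbest_val.min()

    for t in range(n_iter):
        w = w_max - (w_max - w_min) * t / n_iter         # dynamic inertia weight
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo, hi)                 # keep inside the bounds
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        if pbest_val.min() < g_val:
            g_val = pbest_val.min()
            g = pbest[pbest_val.argmin()].copy()
    return g, g_val
```

To tune XGBoost, `objective` would map a position vector to hyperparameters (e.g., learning rate, max depth, gamma), train a model, and return a validation MSE; here a quadratic stand-in keeps the sketch self-contained.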
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.