Open Access
ARTICLE
Slime Mold Optimizer for Transformer Parameters Identification with Experimental Validation
1 Department of Electrical Engineering, College of Engineering, Taif University, P.O. Box 11099, Taif, 21944, Saudi Arabia
2 Electrical Engineering Department, Faculty of Engineering, Al-Azhar University, Cairo, 11651, Egypt
3 Electrical Engineering Department, Faculty of Engineering, Northern Border University, Arar, 1321, Saudi Arabia
* Corresponding Author: Salah K. Elsayed. Email:
Intelligent Automation & Soft Computing 2021, 28(3), 639-651. https://doi.org/10.32604/iasc.2021.016464
Received 03 January 2021; Accepted 23 February 2021; Issue published 20 April 2021
Abstract
The problem of parameter identification for the transformer equivalent circuit can be solved by optimizing a nonlinear formulation. The objective function minimizes the sum of squared relative errors between the calculated and measured values of currents, powers, and secondary voltage during the transformer load test, subject to a set of parameter constraints. The authors of this paper propose applying a new and efficient stochastic optimizer, the slime mold optimization algorithm (SMOA), to identify the parameters of the transformer equivalent circuit. The experimental load-test measurements of single- and three-phase transformers are entered into a MATLAB code that extracts the transformer parameters by minimizing the objective function. Experimental verification of SMOA for parameter estimation of single- and three-phase transformers demonstrates its capability and accuracy in estimating these parameters. SMOA offers high performance and stability in determining the optimal parameters that yield precise transformer performance. The parameter identification results obtained with SMOA are compared with those of three other optimization algorithms, namely the atom search optimizer, the interior search algorithm, and the sunflower optimizer. The comparisons are performed fairly in terms of the attained objective function value and show that SMOA outperforms the other contemporary algorithms at this task.
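The objective described above can be illustrated with a minimal sketch. The function below computes the sum of squared relative errors between calculated and measured load-test quantities; the array contents and their ordering are hypothetical placeholders, not the paper's actual measurements.

```python
import numpy as np

def objective(calc, meas):
    """Sum of squared relative errors between calculated and measured
    load-test quantities (e.g., currents, powers, secondary voltage).
    calc, meas: 1-D sequences of equal length; meas entries must be nonzero."""
    calc = np.asarray(calc, dtype=float)
    meas = np.asarray(meas, dtype=float)
    return float(np.sum(((calc - meas) / meas) ** 2))

# Hypothetical illustration: perfect agreement yields a zero objective,
# which is the value the optimizer drives toward.
print(objective([2.0, 100.0, 230.0], [2.0, 100.0, 230.0]))
```

An optimizer such as SMOA would evaluate this function on the circuit quantities computed from each candidate parameter set and keep the set with the smallest value.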
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.