Age and Gender Classification Using Backpropagation and Bagging Algorithms
1 Research and Innovation Department, Skyline University College, P. O. Box 1797, Sharjah, UAE
2 IT Department, Al-Huson University College, Al-Balqa Applied University, P. O. Box 50, Irbid, Jordan
3 Prince Abdullah Ben Ghazi Faculty of Information and Communication Technology, Al-Balqa Applied University, Al-Salt, Jordan
4 School of Information Technology, Skyline University College, P. O. Box 1797, Sharjah, United Arab Emirates
5 Department of Information Security, Faculty of Information Technology, University of Petra, Amman, Jordan
6 Department of Computer Science and Information Engineering, Asia University, Taizhong, Taiwan
7 Department of Computer Science, King Abdulaziz University, Jeddah, Saudi Arabia
* Corresponding Author: Ammar Almomani. Email:
Computers, Materials & Continua 2023, 74(2), 3045-3062. https://doi.org/10.32604/cmc.2023.030567
Received 29 March 2022; Accepted 17 June 2022; Issue published 31 October 2022
Abstract
Voice classification is important for building more intelligent systems that support student examinations, criminal identification, and security applications. The main aim of this research is to develop a system able to predict and classify a speaker's gender, age, and accent. To that end, a new system called Classifying Voice Gender, Age, and Accent (CVGAA) is proposed. Backpropagation and bagging algorithms are designed to improve voice recognition systems by incorporating sensory voice features, such as rhythm-based features, used to train the model to distinguish between the two gender categories. The proposed system achieves high accuracy compared with other algorithms applied to this problem: on the gender identification data, the adaptive backpropagation algorithm reached 98% accuracy and the bagging algorithm 98.10%. Bagging also achieved the best accuracy among all algorithms on the remaining tasks, with 55.39% for age classification on the Common Voice dataset and 78.94% for accent classification on the speech accent dataset.
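To illustrate the two learning schemes named in the abstract, the following is a minimal sketch (not the authors' implementation) of a backpropagation-trained multilayer perceptron and a bagging ensemble applied to a stand-in gender classification task. The synthetic feature matrix, the scikit-learn estimators, and all hyperparameter values are assumptions chosen for illustration, not the settings reported in the paper.

```python
# Minimal sketch (not the authors' code): gender classification from acoustic
# features using a backpropagation-trained MLP and a bagging ensemble.
# Synthetic data stands in for the acoustic/rhythm-based feature vectors
# described in the paper.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import BaggingClassifier
from sklearn.metrics import accuracy_score

# Stand-in for ~3,000 voice samples, each described by 20 acoustic features,
# labelled 0 = female, 1 = male.
X, y = make_classification(n_samples=3000, n_features=20, n_informative=12,
                           random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=42)

# Feature scaling helps the gradient-based (backpropagation) training converge.
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Backpropagation: a multilayer perceptron trained by gradient descent.
mlp = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=42)
mlp.fit(X_train, y_train)
print("MLP accuracy:", accuracy_score(y_test, mlp.predict(X_test)))

# Bagging: an ensemble of decision trees, each fit on a bootstrap resample.
bag = BaggingClassifier(n_estimators=50, random_state=42)
bag.fit(X_train, y_train)
print("Bagging accuracy:", accuracy_score(y_test, bag.predict(X_test)))
```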
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.