Open Access

ARTICLE

Loss Aware Feature Attention Mechanism for Class and Feature Imbalance Issue

by Yuewei Wu1, Ruiling Fu1, Tongtong Xing1, Fulian Yin1,2,*

1 College of Information and Communication Engineering, Communication University of China, Beijing, 100024, China
2 State Key Laboratory of Media Convergence and Communication, Communication University of China, Beijing, 100024, China

* Corresponding Author: Fulian Yin.

Computers, Materials & Continua 2025, 82(1), 751-775. https://doi.org/10.32604/cmc.2024.057606

Abstract

In the Internet era, recommendation systems play a crucial role in helping users find relevant information in large datasets. Class imbalance severely degrades data quality and therefore reduces the performance of recommendation systems: under imbalance, machine learning algorithms tend to classify every input into the positive (majority) class in order to achieve high prediction accuracy. Imbalance can be categorized along several dimensions, such as features and classes, yet most studies consider only class imbalance. In this paper, we propose a recommendation system that integrates multiple networks to adapt to a large number of imbalanced features and that handles highly skewed, imbalanced datasets through its loss function. To address feature imbalance, we propose a loss aware feature attention mechanism (LAFAM). The network incorporates an attention mechanism and uses multiple sub-networks to classify and learn features; for better results, it learns the weights of the sub-networks and assigns higher weights to important features. To address class imbalance, we propose suppression loss, which favors the negative-class loss by penalizing the positive-class loss and pays more attention to sample points near the decision boundary. Experiments on two large-scale datasets verify that the proposed system substantially outperforms baseline methods.
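To make the class-imbalance idea concrete, the following is a minimal, illustrative sketch of a loss with the shape the abstract describes: the positive (majority) class contribution is suppressed, and samples near the decision boundary receive more weight than confidently classified ones. The focal-style modulating factor and the parameters `alpha` and `gamma` are assumptions for illustration, not the paper's actual suppression-loss formulation.

```python
import numpy as np

def suppression_style_loss(y_true, y_prob, alpha=0.25, gamma=2.0):
    """Illustrative imbalance-aware loss (NOT the paper's exact formula).

    alpha < 0.5 down-weights (suppresses) the positive/majority-class loss,
    favoring the negative class; the (.)**gamma factor shrinks the loss of
    confidently classified samples, so points near the decision boundary
    (y_prob close to 0.5) dominate the average.
    """
    eps = 1e-7
    y_prob = np.clip(y_prob, eps, 1.0 - eps)
    # Suppressed positive-class term and favored negative-class term.
    pos_term = -alpha * (1.0 - y_prob) ** gamma * np.log(y_prob)
    neg_term = -(1.0 - alpha) * y_prob ** gamma * np.log(1.0 - y_prob)
    return float(np.mean(y_true * pos_term + (1.0 - y_true) * neg_term))
```

Under these assumptions, a batch of confidently classified samples yields a much smaller loss than the same labels predicted near the 0.5 boundary, which is the behavior the abstract attributes to suppression loss.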

Keywords


Cite This Article

APA Style
Wu, Y., Fu, R., Xing, T., Yin, F. (2025). Loss aware feature attention mechanism for class and feature imbalance issue. Computers, Materials & Continua, 82(1), 751-775. https://doi.org/10.32604/cmc.2024.057606
Vancouver Style
Wu Y, Fu R, Xing T, Yin F. Loss aware feature attention mechanism for class and feature imbalance issue. Comput Mater Contin. 2025;82(1):751-775. https://doi.org/10.32604/cmc.2024.057606
IEEE Style
Y. Wu, R. Fu, T. Xing, and F. Yin, “Loss Aware Feature Attention Mechanism for Class and Feature Imbalance Issue,” Comput. Mater. Contin., vol. 82, no. 1, pp. 751-775, 2025. https://doi.org/10.32604/cmc.2024.057606



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.