Open Access

ARTICLE

SEFormer: A Lightweight CNN-Transformer Based on Separable Multiscale Depthwise Convolution and Efficient Self-Attention for Rotating Machinery Fault Diagnosis

by Hongxing Wang1, Xilai Ju2, Hua Zhu1,*, Huafeng Li1,*

1 State Key Laboratory of Mechanics and Control for Aerospace Structures, Nanjing University of Aeronautics and Astronautics, Nanjing, 210016, China
2 School of Computer Science and Engineering, Nanyang Technological University, Singapore, 639798, Singapore

* Corresponding Authors: Hua Zhu and Huafeng Li

(This article belongs to the Special Issue: Industrial Big Data and Artificial Intelligence-Driven Intelligent Perception, Maintenance, and Decision Optimization in Industrial Systems)

Computers, Materials & Continua 2025, 82(1), 1417-1437. https://doi.org/10.32604/cmc.2024.058785

Abstract

Traditional data-driven fault diagnosis methods depend on expert experience to manually extract effective fault features from signals, which has certain limitations. Deep learning techniques, by contrast, have become a central focus of fault diagnosis research owing to their strong fault feature extraction ability and end-to-end diagnosis efficiency. Recently, research that combines the convolutional neural network (CNN) and the Transformer, exploiting their respective advantages in local and global feature extraction, has shown promise in fault diagnosis. However, the cross-channel convolution mechanism in the CNN and the self-attention calculations in the Transformer make the combined model excessively complex, resulting in high computational costs and limited industrial applicability. To tackle these challenges, this paper proposes a lightweight CNN-Transformer named SEFormer for rotating machinery fault diagnosis. First, a separable multiscale depthwise convolution block is designed to extract and integrate multiscale feature information from different channel dimensions of vibration signals. Then, an efficient self-attention block is developed to capture critical fine-grained features of the signal from a global perspective. Finally, experimental results on a planetary gearbox dataset and a motor roller bearing dataset show that the proposed framework balances robustness, generalization, and lightweight design better than recent state-of-the-art fault diagnosis models based on CNN and Transformer. This study presents a feasible strategy for developing a lightweight rotating machinery fault diagnosis framework aimed at economical deployment.
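To make the two building blocks named in the abstract concrete, the following is a minimal PyTorch sketch, not the authors' implementation: a separable multiscale depthwise convolution block (parallel depthwise convolutions at several kernel sizes fused by a pointwise convolution) and a reduced-cost self-attention block (keys and values shortened by a strided convolution before attention). All kernel sizes, channel counts, the reduction ratio, and class names are illustrative assumptions.

```python
# Hedged sketch only: illustrates the general ideas of separable multiscale
# depthwise convolution and spatially reduced self-attention for 1-D
# vibration signals; it does not reproduce the SEFormer architecture.
import torch
import torch.nn as nn


class SeparableMultiscaleDWConv(nn.Module):
    """Depthwise convolutions at several kernel sizes, fused by a pointwise conv."""

    def __init__(self, channels: int, kernel_sizes=(3, 7, 15)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv1d(channels, channels, k, padding=k // 2, groups=channels)
            for k in kernel_sizes
        )
        # Pointwise (1x1) convolution integrates the multiscale branches
        # across the channel dimension.
        self.pointwise = nn.Conv1d(channels * len(kernel_sizes), channels, 1)
        self.norm = nn.BatchNorm1d(channels)
        self.act = nn.GELU()

    def forward(self, x):                       # x: (batch, channels, length)
        multiscale = torch.cat([b(x) for b in self.branches], dim=1)
        return self.act(self.norm(self.pointwise(multiscale))) + x


class EfficientSelfAttention(nn.Module):
    """Self-attention whose keys/values are downsampled to cut the quadratic cost."""

    def __init__(self, dim: int, num_heads: int = 4, sr_ratio: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Strided convolution shortens the key/value sequence by sr_ratio.
        self.sr = nn.Conv1d(dim, dim, kernel_size=sr_ratio, stride=sr_ratio)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):                       # x: (batch, seq_len, dim)
        kv = self.sr(x.transpose(1, 2)).transpose(1, 2)   # (batch, seq_len/sr, dim)
        kv = self.norm(kv)
        out, _ = self.attn(x, kv, kv)
        return out + x


if __name__ == "__main__":
    signal = torch.randn(8, 32, 1024)           # batch of 1-D vibration segments
    local_feats = SeparableMultiscaleDWConv(32)(signal)            # (8, 32, 1024)
    global_feats = EfficientSelfAttention(32)(local_feats.transpose(1, 2))
    print(global_feats.shape)                   # torch.Size([8, 1024, 32])
```

In this sketch the depthwise branches capture local patterns at multiple receptive fields, while the reduced-key attention provides a cheaper global view; the actual SEFormer design should be taken from the paper itself.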

Graphic Abstract


Keywords


Cite This Article

APA Style
Wang, H., Ju, X., Zhu, H., Li, H. (2025). SEFormer: A lightweight CNN-Transformer based on separable multiscale depthwise convolution and efficient self-attention for rotating machinery fault diagnosis. Computers, Materials & Continua, 82(1), 1417-1437. https://doi.org/10.32604/cmc.2024.058785
Vancouver Style
Wang H, Ju X, Zhu H, Li H. SEFormer: A lightweight CNN-Transformer based on separable multiscale depthwise convolution and efficient self-attention for rotating machinery fault diagnosis. Comput Mater Contin. 2025;82(1):1417-1437. https://doi.org/10.32604/cmc.2024.058785
IEEE Style
H. Wang, X. Ju, H. Zhu, and H. Li, “SEFormer: A Lightweight CNN-Transformer Based on Separable Multiscale Depthwise Convolution and Efficient Self-Attention for Rotating Machinery Fault Diagnosis,” Comput. Mater. Contin., vol. 82, no. 1, pp. 1417-1437, 2025. https://doi.org/10.32604/cmc.2024.058785



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.