Gan Zhu¹, Yongtao Yu²,*, Xiaofan Deng¹, Yuanchen Dai³, Zhenyuan Li³
CMES - Computer Modeling in Engineering & Sciences, Vol. 145, No. 3, pp. 4317-4348, 2025, DOI: 10.32604/cmes.2025.074349
Published: 23 December 2025
Abstract: Existing deep learning Network Intrusion Detection Systems (NIDS) struggle to simultaneously capture fine-grained, multi-scale features and long-range temporal dependencies. To address this gap, this paper introduces TransNeSt, a hybrid architecture integrating a ResNeSt block (using split-attention for multi-scale feature representation) with a Transformer encoder (using self-attention for global temporal modeling). This integration of multi-scale and temporal attention was validated on four benchmarks: NSL-KDD, UNSW-NB15, CIC-IDS2017, and CICIOT2023. TransNeSt consistently outperformed its individual components and several state-of-the-art models, demonstrating significant quantitative gains. The model achieved high efficacy across all datasets, with F1-scores of 99.04% (NSL-KDD), 91.92% …
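The abstract describes the architecture only at a high level: a ResNeSt-style split-attention block for multi-scale feature extraction, feeding a Transformer encoder for global self-attention. As a rough illustration of that combination, below is a minimal PyTorch sketch. The 1-D treatment of tabular flow features, the radix-2 split attention, the hyper-parameters (d_model=64, 4 heads, 2 encoder layers), the mean pooling, and the names SplitAttention1d and TransNeStSketch are all assumptions made for illustration; none of these specifics are taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SplitAttention1d(nn.Module):
    """Simplified split-attention block (radix splits over 1-D features)."""

    def __init__(self, channels: int, radix: int = 2, reduction: int = 4):
        super().__init__()
        self.radix = radix
        inter = max(channels * radix // reduction, 8)
        # One grouped conv produces all radix splits at once.
        self.conv = nn.Conv1d(channels, channels * radix, kernel_size=3,
                              padding=1, groups=radix, bias=False)
        self.bn = nn.BatchNorm1d(channels * radix)
        self.fc1 = nn.Conv1d(channels, inter, kernel_size=1)
        self.fc2 = nn.Conv1d(inter, channels * radix, kernel_size=1)

    def forward(self, x):                        # x: (B, C, L)
        b, c, _ = x.shape
        splits = F.relu(self.bn(self.conv(x)))   # (B, C*radix, L)
        splits = splits.view(b, self.radix, c, -1)
        # Global pooling over splits and positions drives the attention weights.
        gap = splits.sum(dim=1).mean(dim=2, keepdim=True)   # (B, C, 1)
        att = self.fc2(F.relu(self.fc1(gap)))               # (B, C*radix, 1)
        # rSoftmax: normalize attention across the radix dimension per channel.
        att = att.view(b, self.radix, c, 1).softmax(dim=1)
        return (att * splits).sum(dim=1)          # (B, C, L)


class TransNeStSketch(nn.Module):
    """Hybrid: split-attention conv features -> Transformer encoder -> classifier."""

    def __init__(self, num_features: int, num_classes: int,
                 d_model: int = 64, nhead: int = 4, num_layers: int = 2):
        super().__init__()
        # Lift each scalar feature into a d_model-channel representation.
        self.embed = nn.Conv1d(1, d_model, kernel_size=1)
        self.split_att = SplitAttention1d(d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                               batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, x):                         # x: (B, num_features)
        x = self.embed(x.unsqueeze(1))            # (B, d_model, num_features)
        x = self.split_att(x)                     # multi-scale local attention
        x = self.encoder(x.transpose(1, 2))       # global self-attention over positions
        return self.head(x.mean(dim=1))           # pool positions, then classify


if __name__ == "__main__":
    # Smoke test with assumed sizes: 41 features (NSL-KDD), 5 classes.
    model = TransNeStSketch(num_features=41, num_classes=5)
    logits = model(torch.randn(8, 41))
    print(logits.shape)  # torch.Size([8, 5])
```

The smoke test assumes 41 input features (the NSL-KDD feature count) and 5 output classes; any tabular NIDS feature vector could be substituted. The ordering, local split-attention first and global self-attention second, follows the division of labor the abstract describes, but the paper's actual block layout may differ.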