Open Access
ARTICLE
Multi-Layer Reconstruction Errors Autoencoding and Density Estimate for Network Anomaly Detection
1 University of Electronic Science and Technology of China, Chengdu, 611730, China
2 Guangdong Weichen Information Technology Co., Ltd., Guangzhou, 510000, China
3 Chongqing Changan Automobile Corporation, Chongqing, 400000, China
* Corresponding Author: Ruikun Li. Email:
(This article belongs to the Special Issue: Intelligent Computing for Engineering Applications)
Computer Modeling in Engineering & Sciences 2021, 128(1), 381-398. https://doi.org/10.32604/cmes.2021.016264
Received 21 February 2021; Accepted 12 March 2021; Issue published 28 June 2021
Abstract
Anomaly detection is an important method for intrusion detection. In recent years, unsupervised methods have been widely researched because they do not require labeling. For example, a nonlinear autoencoder can use reconstruction errors to attain the discrimination threshold. This method is not effective when the model complexity is high or the data contains noise. Estimating the density of compressed features in a hidden layer can reduce the influence of noise on threshold selection, because the density of abnormal data in hidden layers is smaller than that of normal data. However, compressed features may lose some of the high-dimensional distribution information of the original data. In this paper, we present an efficient framework for unsupervised anomaly detection, which includes network data capturing, processing, feature extraction, and anomaly detection. We employ a deep autoencoder to obtain compressed features and multi-layer reconstruction errors, and feed both to a Gaussian mixture model to estimate the density. The proposed approach is trained and tested on multiple current intrusion detection datasets and real network scenes, and its performance indicators, namely accuracy, recall, and F1-score, are better than those of other autoencoder models.
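The pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: a two-stage PCA stands in for the deep autoencoder (so per-stage reconstruction errors play the role of multi-layer reconstruction errors), and all dimensions, component counts, and the 95th-percentile threshold are assumptions chosen for the sketch.

```python
# Sketch: compressed code + multi-layer reconstruction errors -> GMM density.
# PCA stages are a linear stand-in for the deep autoencoder; all numbers
# here are illustrative, not taken from the paper.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 20))                      # "normal" traffic features
X_test = np.vstack([rng.normal(size=(50, 20)),            # normal test points
                    rng.normal(3.0, 1.0, size=(5, 20))])  # 5 shifted anomalies

# Two encoding stages: 20 -> 10 -> 4, giving one reconstruction error per layer.
enc1 = PCA(n_components=10).fit(X_train)
enc2 = PCA(n_components=4).fit(enc1.transform(X_train))

def features(X):
    """Concatenate the compressed code with per-layer reconstruction errors."""
    z1 = enc1.transform(X)
    z2 = enc2.transform(z1)
    err1 = np.linalg.norm(X - enc1.inverse_transform(z1), axis=1, keepdims=True)
    err2 = np.linalg.norm(z1 - enc2.inverse_transform(z2), axis=1, keepdims=True)
    return np.hstack([z2, err1, err2])

# Fit the density model on features of normal data only.
gmm = GaussianMixture(n_components=2, random_state=0).fit(features(X_train))

# Low density = high "energy"; threshold at the 95th percentile of training energy.
train_energy = -gmm.score_samples(features(X_train))
threshold = np.percentile(train_energy, 95)
energy = -gmm.score_samples(features(X_test))
print("flagged:", int((energy > threshold).sum()))
```

Because anomalies both reconstruct poorly and fall in low-density regions of the code space, combining the two feature groups lets the GMM separate them even when either signal alone is noisy.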
This work is licensed under a Creative Commons Attribution 4.0 International License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.