CNN-AttBiLSTM Mechanism: A DDoS Attack Detection Method Based on Attention Mechanism and CNN-BiLSTM

Bibliographic Details
Main Authors: Junjie Zhao, Yongmin Liu, Qianlei Zhang, Xinying Zheng
Format: Article
Language: English
Published: IEEE 2023-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10323325/
Description
Summary: DDoS attacks occur frequently. This paper proposes a DDoS attack detection method that combines a self-attention mechanism with CNN-BiLSTM to address the high dimensionality and feature redundancy of raw traffic data, as well as the low accuracy and high false-positive rate of existing classification approaches. First, the random forest algorithm is combined with Pearson correlation analysis to select important features as model inputs, reducing the redundancy of the input data. Second, a one-dimensional convolutional neural network and a bidirectional long short-term memory (BiLSTM) network are used to extract spatial and temporal features, respectively, and the extracted features are fused in parallel ("parallelized") to obtain a fused feature representation. Next, an attention mechanism is introduced so that useful input features are fully expressed, assigning different weights according to the importance of each feature. Finally, a softmax classifier produces the classification results. To verify the effectiveness of the proposed method, binary and multiclass classification experiments were conducted on the CIC-IDS2017 and CIC-DDoS2019 datasets. The experimental results show that, compared with existing models, the proposed model achieves the highest accuracy, precision, recall, and F1-score, reaching 95.670%, 95.824%, 95.904%, and 95.864%, respectively.
ISSN: 2169-3536
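
The abstract describes a four-stage pipeline. As a concrete illustration of the first stage, below is a minimal Python sketch of random-forest feature ranking combined with Pearson correlation filtering; the function name select_features, the top_k budget, and the 0.9 correlation threshold are illustrative assumptions, not values taken from the paper.

```python
# Hypothetical sketch of the feature-selection stage: rank features with a
# random forest, then greedily drop redundant (highly correlated) features.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

def select_features(X: pd.DataFrame, y: np.ndarray,
                    top_k: int = 30, corr_threshold: float = 0.9) -> list[str]:
    # Rank all features by random-forest importance, most important first.
    rf = RandomForestClassifier(n_estimators=100, random_state=0)
    rf.fit(X, y)
    ranked = X.columns[np.argsort(rf.feature_importances_)[::-1]]

    # Keep a top-ranked feature only if its absolute Pearson correlation with
    # every already-kept feature stays below the threshold (reduces redundancy).
    corr = X.corr(method="pearson").abs()
    kept: list[str] = []
    for col in ranked:
        if all(corr.loc[col, k] < corr_threshold for k in kept):
            kept.append(col)
        if len(kept) == top_k:
            break
    return kept
```

Likewise, a minimal Keras sketch of the described architecture: parallel 1D-CNN and BiLSTM branches whose outputs are concatenated ("parallelized"), re-weighted by a simple soft-attention layer, and classified with softmax. All layer sizes and the exact attention formulation here are assumptions; the authors' configuration may differ.

```python
# Hypothetical sketch of the CNN-AttBiLSTM-style model described in the abstract.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_model(timesteps: int, n_features: int, n_classes: int) -> tf.keras.Model:
    inp = layers.Input(shape=(timesteps, n_features))

    # Spatial features via a one-dimensional CNN branch.
    cnn = layers.Conv1D(64, kernel_size=3, padding="same", activation="relu")(inp)
    cnn = layers.MaxPooling1D(pool_size=2)(cnn)
    cnn = layers.GlobalAveragePooling1D()(cnn)

    # Temporal features via a bidirectional LSTM branch.
    lstm = layers.Bidirectional(layers.LSTM(64))(inp)

    # Parallel fusion: concatenate the two feature vectors.
    fused = layers.Concatenate()([cnn, lstm])

    # Simple soft attention: learn a normalized weight per fused dimension
    # and rescale the fused features by those weights.
    weights = layers.Dense(fused.shape[-1], activation="softmax")(fused)
    attended = layers.Multiply()([fused, weights])

    # Softmax classifier over the attention-weighted features.
    out = layers.Dense(n_classes, activation="softmax")(attended)
    return models.Model(inp, out)
```

For example, build_model(timesteps=10, n_features=30, n_classes=2) would instantiate a binary detector over 10-step windows of 30 selected features; the window length and feature count are again placeholders.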