CrossTLNet: A Multitask-Learning-Empowered Neural Network with Temporal Convolutional Network–Long Short-Term Memory for Automatic Modulation Classification

Amidst the evolving landscape of non-cooperative communication, automatic modulation classification (AMC) stands as an essential pillar, enabling adaptive and reliable signal processing. Owing to advances in deep learning (DL), neural networks have found application in AMC. However, previous DL models suffer from inter-class confusion among high-order modulations. To address this issue, we propose a multitask-learning-empowered hybrid neural network, named CrossTLNet. Specifically, after a signal enters the model, it is first transformed into two task components: an in-phase/quadrature (I/Q) form and an amplitude/phase (A/P) form. For each task, we design a method that combines a temporal convolutional network (TCN) with a long short-term memory (LSTM) network to effectively capture long-term dependency features in high-order modulations. To enable interaction between these two different dimensional features, we introduce a cross-attention method, further enhancing the model's ability to distinguish signal features. Moreover, we design a simple and efficient knowledge distillation method to reduce the size of CrossTLNet, making it easier to deploy in real-time or resource-limited scenarios. Experimental results indicate that the proposed method achieves excellent AMC performance on public benchmarks, especially for high-order modulations.
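The record contains no source code, but three ideas from the abstract (the dual I/Q and amplitude/phase input representation, cross-attention between the two branch features, and knowledge distillation) can be sketched in a few lines. The snippet below is a minimal illustration under assumed shapes and hyperparameters, not the authors' implementation; `torch.nn.MultiheadAttention` and the temperature-scaled KL loss are generic stand-ins for the methods described in the paper.

```python
# Illustrative sketch only: none of this is the authors' code. Shapes, layer
# sizes, and hyperparameters are assumptions chosen for readability.
import numpy as np
import torch
import torch.nn as nn

# --- Dual input representation: I/Q -> A/P ---------------------------------
# A frame of 128 complex baseband samples stored as a (2, 128) array,
# row 0 = in-phase (I), row 1 = quadrature (Q).
rng = np.random.default_rng(0)
iq = rng.standard_normal((2, 128)).astype(np.float32)

amplitude = np.sqrt(iq[0] ** 2 + iq[1] ** 2)  # |s[n]|
phase = np.arctan2(iq[1], iq[0])              # arg(s[n]), in radians
ap = np.stack([amplitude, phase])             # (2, 128) A/P view of the same frame

# --- Cross-attention between the two branch features -----------------------
# Suppose each branch (e.g., a TCN followed by an LSTM) has produced a
# sequence of 64-dimensional features with hypothetical shape (batch, time, dim).
feat_iq = torch.randn(1, 128, 64)
feat_ap = torch.randn(1, 128, 64)

cross_attn = nn.MultiheadAttention(embed_dim=64, num_heads=4, batch_first=True)
# The I/Q features query the A/P features, letting one representation
# attend to, and borrow information from, the other.
fused, _ = cross_attn(query=feat_iq, key=feat_ap, value=feat_ap)

# --- Knowledge distillation (generic soft-target loss) ---------------------
# A standard temperature-scaled KL loss between teacher and student logits;
# the paper's specific distillation scheme may differ.
T = 4.0                              # temperature (illustrative)
student_logits = torch.randn(1, 11)  # 11 modulation classes, hypothetical
teacher_logits = torch.randn(1, 11)
kd_loss = nn.KLDivLoss(reduction="batchmean")(
    torch.log_softmax(student_logits / T, dim=-1),
    torch.softmax(teacher_logits / T, dim=-1),
) * (T * T)

print(ap.shape, fused.shape, kd_loss.item())
```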

Bibliographic Details
Main Authors: Gujiuxiang Gao, Xin Hu, Boyan Li, Weidong Wang, Fadhel M. Ghannouchi
Author Affiliations: School of Electronic Engineering, Beijing University of Posts and Telecommunications, Beijing 100876, China (Gao, Hu, Li, Wang); iRadio Lab, University of Calgary, Calgary, AB T2N 1N4, Canada (Ghannouchi)
Format: Article
Language: English
Published: MDPI AG, 2023-11-01
Series: Electronics, Vol. 12, Issue 22, Article 4668
ISSN: 2079-9292
DOI: 10.3390/electronics12224668
Subjects: automatic modulation classification; temporal convolutional network; long short-term memory network; cross-attention; multitask learning; knowledge distillation
Online Access: https://www.mdpi.com/2079-9292/12/22/4668