Modified state activation functions of deep learning-based SC-FDMA channel equalization system

Abstract The core task of a deep learning (DL) channel equalization and symbol detection system is to recover the user’s originally transmitted data. In general, the behavior and performance of deep artificial neural networks (DANNs) depend on three main aspects: the network structure, the learning algorithm, and the activation functions (AFs) used in each node of the network. Long short-term memory (LSTM) recurrent neural networks have shown some success in channel equalization and symbol detection, and the AFs used in a DANN play a significant role in how the learning algorithm converges. This article shows how replacing the tanh state activations of the LSTM units (block input and block output) with modified AFs can significantly boost the DL equalizer’s performance. In addition, the learning process of the DL model was optimized using two distinct error-measuring functions: the default cross-entropy loss and the sum of squared errors (SSE). The DL model’s performance with different AFs is compared using three learning algorithms: Adam, RMSProp, and SGdm. The findings demonstrate that the most commonly used AFs (the sigmoid and hyperbolic tangent functions) do not necessarily yield the best network behavior in channel equalization, while several less common AFs can outperform them. Furthermore, the results show that the recommended SSE loss function addresses the channel equalization problem better than the default cross-entropy loss.
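To make the state-activation idea above concrete, here is a minimal NumPy sketch of a single LSTM step in which the two tanh state activations (block input and block output) are exposed as a swappable `state_af` parameter. The softsign replacement shown is only an assumed example of a candidate AF, not necessarily one of the functions evaluated in the article.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softsign(x):
    # Candidate replacement for the tanh state AF (an illustrative assumption,
    # not necessarily one of the functions tested in the article).
    return x / (1.0 + np.abs(x))

def lstm_step(x, h_prev, c_prev, W, R, b, state_af=np.tanh):
    """One LSTM time step; `state_af` replaces the usual tanh on the
    block input and block output (the gates keep their sigmoid AFs)."""
    H = h_prev.size
    z = W @ x + R @ h_prev + b          # stacked pre-activations for i, f, g, o
    i = sigmoid(z[0:H])                 # input gate
    f = sigmoid(z[H:2 * H])             # forget gate
    g = state_af(z[2 * H:3 * H])        # block input (normally tanh)
    o = sigmoid(z[3 * H:4 * H])         # output gate
    c = f * c_prev + i * g              # cell-state update
    h = o * state_af(c)                 # block output (normally tanh)
    return h, c

# Toy usage with random weights: 4 input features, 8 hidden units.
rng = np.random.default_rng(0)
nx, nh = 4, 8
W = rng.standard_normal((4 * nh, nx))
R = rng.standard_normal((4 * nh, nh))
b = np.zeros(4 * nh)
h, c = np.zeros(nh), np.zeros(nh)
h, c = lstm_step(rng.standard_normal(nx), h, c, W, R, b, state_af=softsign)
```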


Bibliographic Details
Main Authors: Mohamed A. Mohamed, Hassan A. Hassan, Mohamed H. Essai, Hamada Esmaiel, Ahmed S. Mubarak, Osama A. Omer
Format: Article
Language: English
Published: SpringerOpen 2023-11-01
Series: EURASIP Journal on Wireless Communications and Networking
Subjects: Activation functions, Deep artificial neural networks, Deep learning, Channel equalization, Symbol detection, Long short-term memory
Online Access: https://doi.org/10.1186/s13638-023-02326-4
_version_ 1827604497346592768
author Mohamed A. Mohamed
Hassan A. Hassan
Mohamed H. Essai
Hamada Esmaiel
Ahmed S. Mubarak
Osama A. Omer
author_facet Mohamed A. Mohamed
Hassan A. Hassan
Mohamed H. Essai
Hamada Esmaiel
Ahmed S. Mubarak
Osama A. Omer
author_sort Mohamed A. Mohamed
collection DOAJ
description Abstract The core task of a deep learning (DL) channel equalization and symbol detection system is to recover the user’s originally transmitted data. In general, the behavior and performance of deep artificial neural networks (DANNs) depend on three main aspects: the network structure, the learning algorithm, and the activation functions (AFs) used in each node of the network. Long short-term memory (LSTM) recurrent neural networks have shown some success in channel equalization and symbol detection, and the AFs used in a DANN play a significant role in how the learning algorithm converges. This article shows how replacing the tanh state activations of the LSTM units (block input and block output) with modified AFs can significantly boost the DL equalizer’s performance. In addition, the learning process of the DL model was optimized using two distinct error-measuring functions: the default cross-entropy loss and the sum of squared errors (SSE). The DL model’s performance with different AFs is compared using three learning algorithms: Adam, RMSProp, and SGdm. The findings demonstrate that the most commonly used AFs (the sigmoid and hyperbolic tangent functions) do not necessarily yield the best network behavior in channel equalization, while several less common AFs can outperform them. Furthermore, the results show that the recommended SSE loss function addresses the channel equalization problem better than the default cross-entropy loss.
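As a sketch of the two error-measuring functions named in the description, the snippet below computes the default cross-entropy loss and the sum-of-squared-errors (SSE) loss on per-symbol class probabilities. The array shapes, the averaging convention, and the QPSK-sized example are illustrative assumptions rather than details taken from the article.

```python
import numpy as np

def cross_entropy(p, y_onehot, eps=1e-12):
    """Default classification loss over predicted per-symbol class probabilities."""
    return -np.mean(np.sum(y_onehot * np.log(p + eps), axis=1))

def sse(p, y_onehot):
    """Sum of squared errors, the alternative loss the article recommends."""
    return np.sum((p - y_onehot) ** 2)

# Toy example: 3 received symbols, 4 constellation classes (QPSK-sized, assumed).
p = np.array([[0.7, 0.1, 0.1, 0.1],
              [0.2, 0.6, 0.1, 0.1],
              [0.1, 0.1, 0.1, 0.7]])
y = np.eye(4)[[0, 1, 3]]               # true transmitted symbol classes (one-hot)
print("cross-entropy:", cross_entropy(p, y))
print("SSE:", sse(p, y))
```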
first_indexed 2024-03-09T06:01:56Z
format Article
id doaj.art-0df9b6200be04736b6248b7077029c39
institution Directory Open Access Journal
issn 1687-1499
language English
last_indexed 2024-03-09T06:01:56Z
publishDate 2023-11-01
publisher SpringerOpen
record_format Article
series EURASIP Journal on Wireless Communications and Networking
spelling doaj.art-0df9b6200be04736b6248b7077029c39
2023-12-03T12:07:30Z
eng
SpringerOpen
EURASIP Journal on Wireless Communications and Networking
1687-1499
2023-11-01
2023 1 1 26
10.1186/s13638-023-02326-4
Modified state activation functions of deep learning-based SC-FDMA channel equalization system
Mohamed A. Mohamed (Department of Electrical Engineering, Faculty of Engineering, Al-Azhar University)
Hassan A. Hassan (Department of Electrical Engineering, Faculty of Engineering, Al-Azhar University)
Mohamed H. Essai (Department of Electrical Engineering, Faculty of Engineering, Al-Azhar University)
Hamada Esmaiel (Department of Electrical Engineering, Faculty of Engineering, Aswan University)
Ahmed S. Mubarak (Department of Electrical Engineering, Faculty of Engineering, Aswan University)
Osama A. Omer (Department of Electrical Engineering, Faculty of Engineering, Aswan University)
https://doi.org/10.1186/s13638-023-02326-4
Activation functions
Deep artificial neural networks
Deep learning
Channel equalization
Symbol detection
Long short-term memory
spellingShingle Mohamed A. Mohamed
Hassan A. Hassan
Mohamed H. Essai
Hamada Esmaiel
Ahmed S. Mubarak
Osama A. Omer
Modified state activation functions of deep learning-based SC-FDMA channel equalization system
EURASIP Journal on Wireless Communications and Networking
Activation functions
Deep artificial neural networks
Deep learning
Channel equalization
Symbol detection
Long short-term memory
title Modified state activation functions of deep learning-based SC-FDMA channel equalization system
title_full Modified state activation functions of deep learning-based SC-FDMA channel equalization system
title_fullStr Modified state activation functions of deep learning-based SC-FDMA channel equalization system
title_full_unstemmed Modified state activation functions of deep learning-based SC-FDMA channel equalization system
title_short Modified state activation functions of deep learning-based SC-FDMA channel equalization system
title_sort modified state activation functions of deep learning based sc fdma channel equalization system
topic Activation functions
Deep artificial neural networks
Deep learning
Channel equalization
Symbol detection
Long short-term memory
url https://doi.org/10.1186/s13638-023-02326-4
work_keys_str_mv AT mohamedamohamed modifiedstateactivationfunctionsofdeeplearningbasedscfdmachannelequalizationsystem
AT hassanahassan modifiedstateactivationfunctionsofdeeplearningbasedscfdmachannelequalizationsystem
AT mohamedhessai modifiedstateactivationfunctionsofdeeplearningbasedscfdmachannelequalizationsystem
AT hamadaesmaiel modifiedstateactivationfunctionsofdeeplearningbasedscfdmachannelequalizationsystem
AT ahmedsmubarak modifiedstateactivationfunctionsofdeeplearningbasedscfdmachannelequalizationsystem
AT osamaaomer modifiedstateactivationfunctionsofdeeplearningbasedscfdmachannelequalizationsystem