Research on Computational Intelligence Algorithm in LTE Channel Estimation

Bibliographic Details
Main Authors: Siraj Pathan, Sanjay Singh, Mujib Tamboli, Sunil Pathak
Format: Article
Language: English
Published: Universidad Autónoma de Bucaramanga, 2022-12-01
Series: Revista Colombiana de Computación
Online Access: http://revistas.unab.edu.co/index.php/rcc/article/view/4308
Description
Summary: Because data traffic is growing rapidly with advances in the Internet of Things, precisely modelling and predicting the Long-Term Evolution (LTE) channel is critical for applications such as video streaming, efficient bandwidth use, and power management. In this research, we propose a model based on a Computational Intelligence (CI) algorithm that can enhance channel estimation from the received signal. Two algorithms are considered. In contrast to previous work that focused solely on estimating the channel with the traditional Minimum Mean Square Error (MMSE) and Least Squares (LS) algorithms, we apply 1) a Genetic Algorithm (GA) and 2) Particle Swarm Optimization (PSO) to discrete and continuous LTE drive-test data. We consider LTE in the 5.8 GHz band in particular. The designed model aims to improve channel estimation by lowering the mean square error of LS and the complexity of MMSE. Pilots are placed at random and transmitted with the data to gather channel information, which aids the receiver in decoding and estimating the channel using LS, MMSE, Taguchi GA, and PSO. The Bit Error Rate (BER), Signal-to-Noise Ratio, and Mean Square Error of the CI-based model have been estimated. Compared with the MMSE and LS algorithms, the proposed model achieves BER gains of 2.4 dB and 5.4 dB, respectively.
ISSN: 1657-2831, 2539-2115