In cellular wireless communication systems, channel estimation (CE) is a key technique used in Orthogonal Frequency Division Multiplexing (OFDM). The most common methods are decision-directed channel estimation, pilot-assisted channel estimation (PACE) and blind channel estimation. Among them, PACE is the most widely used and offers the steadiest performance. Applying deep learning (DL) methods to CE has attracted increasing interest from researchers over the past three years. The main objective of this paper is to assess the efficiency of DL-based CE compared to conventional PACE techniques, namely the least-squares (LS) and minimum mean-square error (MMSE) estimators. A simulation environment is used to evaluate OFDM performance under different channel models, and a DL process that learns the channel from training data is employed to obtain the estimated channel impulse response. Two channel models are used in the comparison: the Tapped Delay Line (TDL) and Clustered Delay Line (CDL) models. The performance is evaluated under different parameters, including the number of pilots (64 or 8), the number of subcarriers (64), the cyclic prefix length (16 or 0 samples) and the carrier frequency (4 GHz), through computer simulation in MATLAB. The simulation results show that the trained DL estimator outperforms the LS and MMSE estimators in estimating the channel and detecting the transmitted symbols, although the complexity of the proposed LSTM estimator exceeds that of the equivalent LS estimator. Furthermore, the DL estimator remains effective across different pilot densities and cyclic prefix lengths.
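The LS baseline referred to above estimates the channel at pilot subcarriers by dividing the received pilot symbols by the known transmitted pilots, then interpolating across the remaining subcarriers. The following is a minimal illustrative sketch in Python/NumPy, not the paper's MATLAB implementation; the pilot spacing, toy 4-tap channel and 20 dB SNR are assumptions chosen only to make the example self-contained:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (64 subcarriers and 8 pilots match the paper's
# setup; the even pilot spacing is an assumption).
n_sc = 64
pilot_idx = np.arange(0, n_sc, 8)                 # 8 evenly spaced pilots
pilots = np.ones(len(pilot_idx), dtype=complex)   # known pilot symbols

# Toy frequency-selective channel: FFT of a short random impulse response.
h_time = (rng.standard_normal(4) + 1j * rng.standard_normal(4)) / np.sqrt(8)
H_true = np.fft.fft(h_time, n_sc)

# Received pilot subcarriers with additive white Gaussian noise at 20 dB SNR.
snr_db = 20
noise_var = 10 ** (-snr_db / 10)
noise = np.sqrt(noise_var / 2) * (rng.standard_normal(len(pilot_idx))
                                  + 1j * rng.standard_normal(len(pilot_idx)))
Y_p = H_true[pilot_idx] * pilots + noise

# LS estimate at the pilots: H_LS = Y / X, then linear interpolation
# (real and imaginary parts separately; np.interp holds the edge values
# constant beyond the last pilot).
H_ls_pilots = Y_p / pilots
k = np.arange(n_sc)
H_ls = (np.interp(k, pilot_idx, H_ls_pilots.real)
        + 1j * np.interp(k, pilot_idx, H_ls_pilots.imag))

mse = np.mean(np.abs(H_ls - H_true) ** 2)
print(f"LS estimation MSE at {snr_db} dB SNR: {mse:.4f}")
```

The MMSE estimator improves on this by weighting the LS estimate with the channel's correlation statistics and the noise variance, which is why it performs better but requires prior channel knowledge; the paper's DL (LSTM) estimator learns that mapping from training data instead.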