Abstract:
The performance of Long Term Evolution (LTE) systems can be improved by optimizing the downlink channel allocation, but this optimization relies on the channel quality indicator (CQI) fed back by the user equipment (UE). However, because the channel is time-varying and the CQI feedback is delayed by T, the fed-back CQI does not always represent the UE's current channel state, which degrades the allocation performance. In this paper, we propose a wavelet transform-support vector machine (WT-SVM) method to predict the current effective signal-to-noise ratio (effective SNR) and the CQI mapped from it. We first confirm the chaotic behavior of the effective SNR. The WT is then used to decompose the effective SNR time series, and the C-C method is adopted to reconstruct the phase space of each decomposed component. After the penalty parameter and kernel function coefficient are selected by particle swarm optimization (PSO), the effective SNR is predicted by the SVM. Simulations are then performed, and the prediction accuracy of the proposed scheme is compared with that of three existing methods: history-information prediction, autoregression, and a back-propagation neural network. Finally, the CQI mapped from the predicted effective SNR is used for downlink channel allocation in the Vienna LTE Simulator to verify the effectiveness of the proposed scheme in improving throughput and decreasing the block error rate (BLER). The results demonstrate that WT-SVM yields the smallest prediction error. Consequently, the WT-SVM-based system improves the average throughput by 10.84% and decreases the average BLER by 30.66% relative to history-information prediction when T = 5 TTIs.
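To make the prediction pipeline concrete, the following is a minimal Python sketch of the WT-SVM idea summarized above, not the authors' implementation: the effective SNR series is decomposed into additive wavelet components, each component is forecast by an RBF support-vector regressor trained on lagged samples, and the component forecasts are summed. The wavelet ('db4'), lag length, horizon, and the random search used here in place of the PSO hyperparameter selection are all illustrative assumptions.

    import numpy as np
    import pywt
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)

    def wavelet_components(series, wavelet="db4", level=3):
        """Split the series into additive components, one per wavelet sub-band."""
        coeffs = pywt.wavedec(series, wavelet, level=level)
        comps = []
        for i in range(len(coeffs)):
            masked = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
            comps.append(pywt.waverec(masked, wavelet)[: len(series)])
        return comps

    def lagged(series, lag, horizon):
        """Map 'lag' past samples to the value 'horizon' steps ahead."""
        X = np.array([series[i : i + lag] for i in range(len(series) - lag - horizon + 1)])
        y = series[lag + horizon - 1 :]
        return X, y

    def fit_svr(X, y, n_trials=30):
        """Pick (C, gamma) for an RBF-SVR by random search (a stand-in for the PSO step)."""
        split = int(0.8 * len(X))
        best, best_err = None, np.inf
        for _ in range(n_trials):
            C, gamma = 10 ** rng.uniform(-1, 3), 10 ** rng.uniform(-3, 1)
            model = SVR(kernel="rbf", C=C, gamma=gamma).fit(X[:split], y[:split])
            err = np.mean((model.predict(X[split:]) - y[split:]) ** 2)
            if err < best_err:
                best, best_err = model, err
        return best

    def predict_effective_snr(snr_series, lag=8, horizon=5):
        """Forecast the effective SNR 'horizon' TTIs ahead by summing per-component forecasts."""
        forecast = 0.0
        for comp in wavelet_components(snr_series):
            X, y = lagged(comp, lag, horizon)
            model = fit_svr(X, y)
            forecast += model.predict(comp[-lag:].reshape(1, -1))[0]
        return forecast

    # Usage on a synthetic, slowly fading effective-SNR trace (dB):
    t = np.arange(512)
    snr = 15 + 5 * np.sin(2 * np.pi * t / 100) + rng.normal(0, 1, t.size)
    print(f"Predicted effective SNR 5 TTIs ahead: {predict_effective_snr(snr):.2f} dB")

The predicted effective SNR would then be mapped to a CQI index (e.g., via the simulator's SNR-to-CQI mapping) before being used for downlink channel allocation, as described above.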