I figured out that the large difference is caused by the input power of the practical signal being different from the power of the waveform created in MATLAB.
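The effect of such a power mismatch can be illustrated with a toy sketch (Python/numpy stand-in for the MATLAB workflow, not the actual LTE Toolbox code): if the signal power handed to an awgn()-style function is off by some amount, the realized SNR is off by the same amount, which directly biases the measured EVM. The 2 dB offset below is an arbitrary illustrative value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy complex baseband waveform (stand-in for the LTE test-model signal),
# normalized to roughly unit power
x = (rng.standard_normal(100000) + 1j * rng.standard_normal(100000)) / np.sqrt(2)

actual_p_dbw = 10 * np.log10(np.mean(np.abs(x) ** 2))  # ~0 dBW for this toy signal

target_snr_db = 21.441               # SNR reported by the analyzer
assumed_p_dbw = actual_p_dbw + 2.0   # hypothetical: power over-reported by 2 dB

# awgn()-style noise scaling: noise power is derived from the ASSUMED
# signal power, not the waveform's actual power
noise_p = 10 ** ((assumed_p_dbw - target_snr_db) / 10)
n = np.sqrt(noise_p / 2) * (rng.standard_normal(x.size)
                            + 1j * rng.standard_normal(x.size))

# The SNR actually realized in the noisy waveform
realized_snr_db = 10 * np.log10(np.mean(np.abs(x) ** 2)
                                / np.mean(np.abs(n) ** 2))
print(round(realized_snr_db - target_snr_db, 1))  # ~ -2.0: SNR off by the power error
```

So a signal-power value measured on the instrument but applied to a MATLAB-generated waveform with a different power makes the injected noise too strong or too weak, and the EVM computed from that waveform no longer matches the instrument's reading.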
Difference in LTE 10 MHz BW EVM measurement between the hPDSCHEVM function and a Keysight PXA N9030A signal analyzer
I generate an LTE-Advanced 10 MHz signal with test model 3.1 using the LTE Toolbox function lteTestModel(). The received signal is created by adding white Gaussian noise to the transmitted signal with the awgn() function. The input parameters of awgn() are the SNR (dB) and the signal power (dBW), whose values come from measurements made with a Keysight PXA N9030A signal analyzer running the 89601B VSA (vector signal analysis) software. When I measure the EVM with the hPDSCHEVM function of the LTE Toolbox, there is a difference between the average EVM calculated in MATLAB and the average EVM measured in the VSA. The real-life signal is likewise generated according to the LTE test model 3.1 specification.

For example, the signal analyzer reports an SNR of 21.441 dB, a 64-QAM PDSCH EVM of 8.00%, and a signal power of 0.741 dBm. I converted this signal power to dBW and passed it, together with the SNR, to awgn(). I then fed the resulting received waveform to hPDSCHEVM with channel-estimation configuration 'TestEVM' and the information of the test model 3.1 waveform. The EVM calculated by hPDSCHEVM is 9.639%.

What can be the cause of this difference? Is it caused by the 'TestEVM' channel estimation? Is it caused by the way the received waveform is generated?
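For reference, the unit conversion described above is a fixed 30 dB offset (1 W = 1000 mW), and together with the reported SNR it fixes the noise power that awgn() will inject. A small sketch using the question's numbers (Python stand-in for the MATLAB call; the values are the ones quoted above):

```python
# PXA-reported values from the question
p_dbm = 0.741        # measured signal power, dBm
snr_db = 21.441      # measured SNR, dB

# dBm -> dBW: subtract 30 dB, since 0 dBW = 30 dBm
p_dbw = p_dbm - 30.0

# Noise power (dBW) implied by this signal power and SNR;
# this is what an awgn()-style function would add
noise_dbw = p_dbw - snr_db

print(round(p_dbw, 3))      # -29.259
print(round(noise_dbw, 3))  # -50.7
```

Any error in the signal-power figure therefore propagates one-for-one into the injected noise power and hence into the EVM computed from the MATLAB waveform.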
Answers (0)