Error Vector Magnitude (EVM)

Error vector magnitude (EVM) is a measurement of demodulator performance in the presence of impairments. The soft symbol decisions obtained after decimating the recovered waveform at the demodulator output are compared against the ideal symbol locations. The resulting error vectors are then used to compute the root mean square (RMS) EVM over a measurement window of N demodulated symbols.

As shown in the following figure, the symbol decision generated by the demodulator is given by the measured vector $w$. However, the ideal symbol location (using the symbol map) is given by $v$. Therefore, the resulting error vector is the difference between the measured and ideal symbol vectors, $e = w - v$. The error vector for a received symbol is represented graphically in the following figure:

where

$v$ is the ideal symbol vector
$w$ is the measured symbol vector
$|w| - |v|$ is the magnitude error
$\theta$ is the phase error
$e = w - v$ is the error vector
$|e| / |v|$ is the EVM (computed numerically in the sketch below)
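The following Python sketch (not part of the Modulation Toolkit; the symbol values are arbitrary examples chosen for illustration) computes these per-symbol quantities, representing the I/Q vectors as complex numbers:

import numpy as np

# Per-symbol error quantities for one measured symbol against its ideal
# constellation point. The symbol values below are arbitrary examples.
v = 1 + 1j                               # ideal symbol vector (e.g., a QPSK point)
w = 1.05 + 0.90j                         # measured symbol vector from the demodulator

e = w - v                                # error vector
magnitude_error = abs(w) - abs(v)        # magnitude error
phase_error = np.angle(w) - np.angle(v)  # phase error, in radians
evm = abs(e) / abs(v)                    # per-symbol EVM, normalized by |v|

print("error vector      :", e)
print("magnitude error   :", round(float(magnitude_error), 4))
print("phase error (rad) :", round(float(phase_error), 4))
print("EVM               : {:.2f} %".format(100 * evm))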

EVM quantifies the impairment but does not necessarily reveal its nature. To remove the dependence on system gain distribution, EVM is normalized by $|v|$ and is expressed as a percentage. Analytically, the RMS EVM over a measurement window of N symbols is defined as

$$\mathrm{EVM}_{\mathrm{RMS}} = \sqrt{\frac{\sum_{j=1}^{N}\left[\left(I_j - \tilde{I}_j\right)^2 + \left(Q_j - \tilde{Q}_j\right)^2\right]}{\sum_{j=1}^{N}\left[\tilde{I}_j^{\,2} + \tilde{Q}_j^{\,2}\right]}}$$

where

$I_j$ is the I component of the j-th symbol received
$Q_j$ is the Q component of the j-th symbol received
$\tilde{I}_j$ is the ideal I component of the j-th symbol received
$\tilde{Q}_j$ is the ideal Q component of the j-th symbol received
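As an illustration of this formula (a sketch only, not the Modulation Toolkit implementation; the QPSK constellation and additive-noise impairment are assumptions made for the example), the RMS EVM over a window of N symbols can be computed as follows:

import numpy as np

# RMS EVM over a window of N symbols: the error power accumulated over the
# window, normalized by the ideal (reference) power over the same window.
rng = np.random.default_rng(0)

N = 1000
# Ideal symbols (the ideal I/Q values) drawn from a QPSK constellation.
ideal = rng.choice(np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]), size=N)

# Measured symbols: ideal symbols plus additive Gaussian noise
# (an impairment assumed only to make the example concrete).
noise = 0.05 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
measured = ideal + noise

error_power = np.sum(np.abs(measured - ideal) ** 2)  # sum of squared I and Q errors
reference_power = np.sum(np.abs(ideal) ** 2)         # sum of squared ideal I and Q values
evm_rms = np.sqrt(error_power / reference_power)

print("RMS EVM over {} symbols: {:.2f} %".format(N, 100 * evm_rms))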

EVM is related to the modulation error ratio (MER) and to the waveform quality factor ρ, and EVM and MER have a one-to-one relationship. EVM measures the vector difference between the measured and ideal signals, while ρ measures the correlation between the two signals.
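Assuming both quantities are normalized to the same average reference power, the one-to-one relationship between EVM and MER can be stated explicitly:

$$\mathrm{MER} = \frac{P_{\mathrm{reference}}}{P_{\mathrm{error}}} = \frac{1}{\mathrm{EVM}_{\mathrm{RMS}}^{2}}, \qquad \mathrm{MER}_{\mathrm{dB}} = -20\log_{10}\left(\mathrm{EVM}_{\mathrm{RMS}}\right)$$

For example, an RMS EVM of 0.0316 (3.16 %) corresponds to an MER of approximately 30 dB.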