Harmonic Distortion

NI PXI-5660

Harmonic distortion is a measure of the amount of power contained in the harmonics of a fundamental signal. Harmonic distortion is inherent to devices and systems that possess nonlinear characteristics—the more nonlinear the device, the greater its harmonic distortion.

Harmonic distortion can be expressed as a power ratio or as a percentage ratio. Use the following formula to express it as a power ratio:

PHD = Pharm – Pfund

where PHD is the power of the harmonic distortion in dBc, Pfund is the fundamental signal power in dB or dBm, and Pharm is the power of the harmonic of interest in dB or dBm.

Convert the powers to voltages to express harmonic distortion as a percentage ratio:

%HD = 100 × (Vharm / Vfund) = 100 × 10^(PHD / 20)

In some applications, harmonic distortion is measured as total harmonic distortion (THD), expressed as a percentage. This measurement involves a power summation of all the harmonics in the spectrum band, as defined by the following equation:

THD = 100 × sqrt(V2² + V3² + … + Vn²) / Vfund

where V2, V3, …, Vn are the voltages of the individual harmonics and Vfund is the voltage of the fundamental.
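These relationships can be sketched in Python (an illustrative sketch only; the function names are not from any NI API, and voltage ratios are derived from dB power differences via 10^(dB/20)):

```python
import math

def harmonic_distortion_dbc(p_fund_dbm, p_harm_dbm):
    """Harmonic distortion of a single harmonic, in dBc (dB relative to the fundamental)."""
    return p_harm_dbm - p_fund_dbm

def harmonic_distortion_percent(phd_dbc):
    """Convert a dBc power ratio to a percentage voltage ratio."""
    return 100.0 * 10.0 ** (phd_dbc / 20.0)

def thd_percent(p_fund_dbm, p_harm_dbm_list):
    """THD as a percentage: root-sum-square of the harmonic voltages
    divided by the fundamental voltage."""
    v_fund = 10.0 ** (p_fund_dbm / 20.0)
    rss = math.sqrt(sum((10.0 ** (p / 20.0)) ** 2 for p in p_harm_dbm_list))
    return 100.0 * rss / v_fund
```

For example, a harmonic 40 dB below the fundamental corresponds to –40 dBc, or 1 percent distortion.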

A typical setup to perform a harmonic distortion measurement is shown in the figure below. A lowpass or bandpass filter passes the fundamental signal while suppressing its harmonics. This setup injects a very clean sinusoidal signal into the unit under test (UUT). Any harmonic content at the UUT output is assumed to be generated by the UUT instead of the source.

Typical Harmonic Distortion Measurement Setup

Understanding the RF Signal Analyzer Harmonic Distortion Limits

As with all analyzers, there are residual distortions inherent in the RF Signal Analyzer. It is important that these distortions do not corrupt your measurement.

The level of internal distortion is a function of the linearity of the system, which is primarily determined at the input mixer. Increasing input power at the mixer increases distortion, so if the input signal power is too high, the internally generated harmonics overwhelm the harmonics of the original signal.

The specifications for the second- and third-order harmonic intercept points provide sufficient information about the linearity of the system. For example, to measure a second-order harmonic at –70 dBc, the fundamental power at the mixer input has to satisfy the following condition:

Pfund ≤ IIP2 – 70 dB

where IIP2 is the second-order intercept point.

If the input signal power is greater than this value, the signal must be attenuated before the first mixer. There is an upper limit on the amount of attenuation you can switch in because the noise floor rises by the same amount as the attenuation. To lower the noise level, decrease the resolution bandwidth, but keep in mind that there is also a practical lower limit on the resolution bandwidth: decreasing the resolution bandwidth increases measurement time.
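This linearity check can be sketched as follows, assuming the standard intercept-point relation in which the second-harmonic level in dBc equals the mixer input power minus IIP2 (the function names are illustrative, and the +40 dBm IIP2 in the usage note is a hypothetical value, not a PXI-5660 specification):

```python
def max_mixer_level_dbm(iip2_dbm, target_dbc):
    """Highest fundamental power at the mixer input (dBm) that keeps the
    internally generated second harmonic at or below target_dbc."""
    return iip2_dbm + target_dbc

def required_attenuation_db(input_power_dbm, iip2_dbm, target_dbc):
    """Attenuation (dB) needed ahead of the first mixer, or 0 if the input
    is already at or below the allowed mixer level."""
    return max(0.0, input_power_dbm - max_mixer_level_dbm(iip2_dbm, target_dbc))
```

With a hypothetical IIP2 of +40 dBm and a –70 dBc target, for instance, the mixer input must stay at or below –30 dBm, so a –10 dBm input signal would need 20 dB of attenuation.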

The harmonic distortion dynamic range (HDDR) indicates the minimum distortion the RF Signal Analyzer can measure, which is approximately –96 dBc/Hz.
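If the –96 dBc/Hz figure scales with resolution bandwidth in the usual 10·log10(RBW) way, the measurement floor for a given bandwidth can be estimated as follows (a sketch under that assumption; the function name is illustrative):

```python
import math

def min_measurable_distortion_dbc(rbw_hz, hddr_dbc_per_hz=-96.0):
    """Approximate floor on measurable harmonic distortion (dBc) for a given
    resolution bandwidth, scaling the per-hertz figure by 10*log10(RBW)."""
    return hddr_dbc_per_hz + 10.0 * math.log10(rbw_hz)
```

Under this assumption, a 1 kHz resolution bandwidth would limit measurable distortion to roughly –66 dBc, which is why narrowing the resolution bandwidth is necessary for low-level harmonics.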

Choosing an Optimal Setting for the RF Signal Analyzer

Because the level of harmonic distortion is often unknown, the optimal attenuation level can be difficult to determine. Complete the following steps to find the proper attenuation setting for the RF Signal Analyzer:

  1. Set the attenuation so that the input power at the mixer is about –30 dBm. When using the RF Signal Analyzer Demo Panel,
      mixer level = reference level – attenuation.
  2. Tune to the harmonic frequency of interest and then decrease the resolution bandwidth until the harmonic spur appears.
  3. Increase the attenuation level. If the harmonic spur decreases, attenuate more.
  4. Repeat step 3 until the harmonic level does not decrease any further. Attenuation does not lower the harmonics of the original signal; it only lowers the internally generated ones.
  5. Decrease the resolution bandwidth to lower the noise floor.

This setting is the optimal attenuation setting.
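The steps above can be sketched as follows. The attenuation helper uses the demo-panel relation from step 1, and `measure_harmonic_dbm` is a hypothetical callback that applies an attenuation setting and returns the measured harmonic level:

```python
def attenuation_for_mixer_level(reference_level_dbm, mixer_level_dbm=-30.0):
    """Attenuation (dB) that places the mixer input at mixer_level_dbm,
    using mixer level = reference level - attenuation (step 1)."""
    return reference_level_dbm - mixer_level_dbm

def find_optimal_attenuation(measure_harmonic_dbm, start_atten_db,
                             step_db=1.0, max_atten_db=30.0, tol_db=0.1):
    """Increase attenuation (steps 3-4) until the measured harmonic level
    stops decreasing, i.e. until only the UUT's own harmonics remain."""
    atten = start_atten_db
    level = measure_harmonic_dbm(atten)
    while atten + step_db <= max_atten_db:
        new_level = measure_harmonic_dbm(atten + step_db)
        if level - new_level < tol_db:  # no further drop: distortion is external
            return atten
        atten += step_db
        level = new_level
    return atten
```

The callback and numeric limits here are stand-ins for the real instrument interaction; the loop merely encodes the stopping rule of step 4.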