
The last big improvement in moving from OFDM to HT to VHT is the increase in modulation/coding-scheme complexity. For example, the HT specification introduced the 5/6 coding rate when using a 64QAM modulation scheme, resulting in an 11% increase in data rate over the 3/4 coding rate of OFDM, all other things being equal. The VHT specification also introduced the 256QAM modulation scheme, resulting in a 33% increase in the number of bits per symbol due to the higher-order modulation employed. The 64QAM format carries 6 b/symbol [log2(64)], while 256QAM pushes this to 8 b/symbol [log2(256)].
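Both percentages follow directly from the underlying arithmetic. The short Python sketch below (a minimal check, assuming all other link parameters are held equal) reproduces the two figures:

import math

# Bits per symbol for a square M-QAM constellation is log2(M)
bits_64qam = math.log2(64)     # 6 b/symbol
bits_256qam = math.log2(256)   # 8 b/symbol

# HT: rate-5/6 coding vs. OFDM's rate-3/4 coding at the same modulation
coding_gain = (5 / 6) / (3 / 4) - 1
print(f"5/6 vs. 3/4 coding: {coding_gain:.0%} more throughput")      # ~11%

# VHT: 256QAM vs. 64QAM at the same coding rate
modulation_gain = bits_256qam / bits_64qam - 1
print(f"256QAM vs. 64QAM: {modulation_gain:.0%} more bits/symbol")   # ~33%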

However, the dramatic improvement in throughput in the VHT specification comes at the expense of higher performance requirements and the need for more complex test equipment. For example, the modulation accuracy required for devices to use more complex schemes such as 256QAM requires RF test systems to have greater linearity and better phase-noise performance than previous iterations of the standard demanded. More specifically, key transmitter measurements, such as error vector magnitude (EVM), require careful attention to various settings on the RF signal analyzer used to measure the transmitter.

Measuring the modulation accuracy of a transmitter is one of the most important steps in ensuring its intended performance, and the key figure of merit for modulation accuracy is EVM. To understand an EVM measurement, it is first necessary to realize that signals can be represented on the Cartesian (I/Q) plane or, equivalently, in polar coordinates of magnitude and phase (Fig. 1). Each transmitted “state” of the carrier is called a “symbol” and is used to communicate a unique digital bit pattern. For modulation schemes such as the 16QAM scheme shown in Fig. 1, each symbol is assigned a unique pattern of four logical bits. To transmit a digital message, the bit stream is divided into groups of 4 b, and the transmitter generates one symbol at a time at the prescribed “symbol rate.” At the other end, a receiver decodes each symbol and, applying the appropriate symbol mapping, reconstructs the original bit stream.

1. This is a constellation diagram representing 16QAM modulation.
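To make the mapping concrete, the following Python sketch maps a bit stream onto a 16QAM constellation. (The Gray-coded mapping used here is a common illustrative choice; the exact bit-to-symbol mapping is defined by the standard.)

import numpy as np

# Gray-coded 2-bit-to-amplitude map for one axis of a 16QAM constellation
GRAY_PAM4 = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}

def map_16qam(bits):
    """Group bits into 4-b symbols: first 2 b set the I level, last 2 b the Q level."""
    symbols = []
    for i in range(0, len(bits), 4):
        nibble = tuple(bits[i:i + 4])
        symbols.append(complex(GRAY_PAM4[nibble[0:2]], GRAY_PAM4[nibble[2:4]]))
    return np.array(symbols)

message = [1, 0, 1, 1, 0, 0, 0, 1]   # 8 b of message -> two 16QAM symbols
print(map_16qam(message))            # [ 3.+1.j -3.-1.j]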

As modulation schemes increase in complexity (i.e., from 16QAM to 256QAM), the relative phase and magnitude differences between adjacent symbols get smaller. The EVM measurement is the primary metric of modulation quality. Graphically, the error vector is the vector difference between a symbol’s measured position (its phase and magnitude) and its ideal location in the constellation. EVM is defined as the ratio of the magnitude of this error vector to the magnitude of the ideal reference vector, typically expressed as an RMS average over many symbols. Measuring EVM requires a vector signal analyzer (VSA), which can measure both the amplitude and phase of received signals.
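In code, the RMS EVM computation reduces to a few lines. The sketch below assumes the measured symbols have already been demodulated, and normalizes the error power to the RMS power of the ideal constellation, which is one common convention:

import numpy as np

def evm_db(measured, ideal):
    """RMS EVM in dB: error-vector power relative to reference-vector power."""
    error = measured - ideal   # error vector for each symbol
    evm_rms = np.sqrt(np.mean(np.abs(error)**2) / np.mean(np.abs(ideal)**2))
    return 20 * np.log10(evm_rms)

# Example: ideal 16QAM symbols corrupted by a small amount of complex noise
rng = np.random.default_rng(seed=0)
ideal = np.array([3 + 1j, -3 - 1j, 1 - 3j, -1 + 3j])
noise = 0.05 * (rng.standard_normal(4) + 1j * rng.standard_normal(4))
print(f"EVM = {evm_db(ideal + noise, ideal):.1f} dB")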

In practical use, accurately measuring the transmit EVM of an IEEE 802.11ac device requires paying careful attention to the signal-to-noise ratio (SNR) of the RF signal analyzer. The IEEE 802.11ac specification requires a transmitter using the highest-order MCS to achieve a transmit EVM of -32 dB, inherently requiring that test instruments be able to measure EVM as much as 10 dB better (-42 dB). Moreover, the high peak-to-average power ratio (PAPR) of the OFDM signals used in IEEE 802.11ac (often up to 13 dB) requires that a signal analyzer have an SNR upwards of 60 dB. As a result, characterizing the transmitter requires maximizing the SNR of the signal analyzer.
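The 10-dB margin follows from how the analyzer’s own error adds to that of the device under test. Because the two contributions are uncorrelated, they combine on a root-sum-square basis, as this short sketch (assuming uncorrelated error sources) illustrates:

import math

def combined_evm_db(evm_dut_db, evm_analyzer_db):
    """Uncorrelated error contributions add in power (root-sum-square)."""
    e_dut = 10 ** (evm_dut_db / 20)
    e_analyzer = 10 ** (evm_analyzer_db / 20)
    return 20 * math.log10(math.hypot(e_dut, e_analyzer))

# A -32-dB transmitter measured on a -42-dB analyzer reads about -31.6 dB,
# so the instrument adds only ~0.4 dB of apparent error:
print(combined_evm_db(-32.0, -42.0))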

Appropriately setting the reference level of a signal analyzer is one of the easiest ways to maximize the SNR of the measuring instrument. Many instruments offer an auto-level setting that dynamically determines the peak amplitude of the IEEE 802.11 signal and sets the reference level accordingly. However, this option comes at the expense of measurement time when automating measurements. Given the inherently longer measurement times of VHT signals (owing to the denser modulation options), it is often desirable to set the reference level manually, reclaiming the time lost to auto-leveling.

To maximize EVM performance, understanding the PAPR of the signal itself is a useful start to setting the reference level. Most OFDM signals have a PAPR in the range of 10 to 12 dB. In addition, most RF signal analyzers are designed so that their maximum clipping level occurs 6 to 8 dB above the reference level. As a result, the ideal reference level for a typical signal analyzer is usually 4 to 8 dB above the average power of the signal. Setting the reference level too low will cause the captured signal to clip, degrading the measured EVM. Setting it too high will let the noise floor of the signal analyzer influence the measurement result, also degrading the EVM measurement.
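These rules of thumb translate into a simple calculation. The following sketch (with the PAPR and clipping-margin figures taken from the guidance above; the exact clipping margin varies by instrument) derives a manual reference level from the signal’s average power:

def reference_level_dbm(avg_power_dbm, papr_db=12.0, clip_margin_db=6.0):
    """Place the signal's peaks just below the analyzer's clipping level.

    The analyzer is assumed to clip clip_margin_db above the reference
    level (typically 6 to 8 dB), so the reference level is chosen so that
    average power + PAPR lands at the clipping level.
    """
    return avg_power_dbm + papr_db - clip_margin_db

# An 802.11ac burst at -10 dBm average power with ~12 dB of PAPR:
print(reference_level_dbm(-10.0))   # -4.0 dBm, i.e., 6 dB above average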

To illustrate this point, a model PXIe-5644R vector signal transceiver (VST) from National Instruments was used to generate and analyze an IEEE 802.11ac signal at various modulation-and-coding-scheme (MCS) rates and channel bandwidths. The PXIe-5644R combines a signal generator and a signal analyzer in a compact PXI module. It covers 65 MHz to 6 GHz with 80 MHz of instantaneous bandwidth and features an impressive noise floor of -161 dBm/Hz. The output of the transceiver’s signal generator was directly connected to the input of the onboard signal analyzer. With the signal generator producing the IEEE 802.11ac signal at a constant power level, a series of modulation-accuracy measurements was made while varying the reference level (Fig. 2).
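A sweep of this kind is straightforward to automate. The structure is sketched below in Python against a hypothetical driver API (the session object and its configure_reference_level and measure_evm methods are placeholders, not actual NI driver calls):

import numpy as np

def sweep_reference_level(session, avg_power_dbm, offsets_db):
    """Measure EVM at a series of reference levels around the average power."""
    results = []
    for offset in offsets_db:
        session.configure_reference_level(avg_power_dbm + offset)  # placeholder call
        results.append(session.measure_evm())                      # placeholder call
    return results

# Sweep the reference level from 10 dB below to 20 dB above the
# average packet power, in 0.5-dB steps:
offsets = np.arange(-10.0, 20.5, 0.5)
# evm_results = sweep_reference_level(vst_session, -10.0, offsets)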

2. Error vector magnitude (EVM) is shown as a function of reference level.

The x-axis is the difference, in dB, between the reference level and the average packet power. The 0-dB mark corresponds to a reference level equal to the average packet power. Negative values correspond to a reference level set below the average power level, while positive values correspond to a reference level set above it.

The EVM measurement is dramatically degraded when the reference level is set too low, as clipping inside the signal analyzer significantly affects modulation quality. As the reference level is raised beyond its optimum, the noise floor of the signal analyzer itself increasingly becomes the largest contributor to measurement uncertainty. For this particular signal, the optimal reference level is about 6 dB higher than the average power of the signal (Fig. 2).
