Fourth-generation (4G) wireless communications systems promise to support customers' demands for increasing network capacity for voice, data, and streaming video. Systems such as Long Term Evolution (LTE) networks incorporate novel communications techniques, such as multiple-input, multiple-output (MIMO) antenna configurations that use two or more transmit/receive chains and take advantage of the differences in radio transmission paths between them. But such approaches also challenge the capabilities of traditional and emerging measurement equipment and test software. Fortunately, flexible test solutions are currently available with the capability of emulating LTE MIMO operation during receiver (Rx) and transmitter (Tx) device testing.
LTE requires fundamental changes in base station and handset design and test due to departures from previous cellular communications networks, including its requirement to handle six different channel bandwidths from 1.4 to 20 MHz in both frequency-division-duplex (FDD) and time-division-duplex (TDD) modes. LTE also operates with multiple antenna techniques to improve link performance. So far, five multiple-antenna techniques have been defined for LTE networks: receive diversity at the mobile unit, transmit diversity at the evolved Node B (eNB) base station using space-frequency block coding (SFBC), MIMO spatial multiplexing at the eNB for one or two users, cyclic delay diversity (CDD) at the eNB used in conjunction with spatial multiplexing, and user-specific beam-steering techniques.
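The SFBC transmit-diversity technique mentioned above can be illustrated with a short sketch. The Alamouti-style pairwise mapping below is only a simplified illustration of the principle (the exact LTE mapping, including antenna-port conventions, is defined in 3GPP TS 36.211), and `sfbc_encode` is a hypothetical helper name, not a standard API:

```python
import numpy as np

def sfbc_encode(symbols):
    """Map pairs of modulation symbols onto two antennas across two
    adjacent subcarriers, Alamouti-style: one antenna carries the
    symbols directly, the other carries conjugated, swapped copies."""
    s = np.asarray(symbols, dtype=complex)
    assert s.size % 2 == 0, "SFBC operates on symbol pairs"
    s0, s1 = s[0::2], s[1::2]
    ant0 = np.empty_like(s)
    ant1 = np.empty_like(s)
    ant0[0::2], ant0[1::2] = s0, s1
    ant1[0::2], ant1[1::2] = -np.conj(s1), np.conj(s0)
    return ant0, ant1

# One symbol pair mapped to two antennas on two subcarriers
a0, a1 = sfbc_encode([1 + 0j, 0 + 1j])
```

Because each symbol is sent on both antennas (in different forms), the receiver can combine the two paths and recover the pair even if one path fades.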
Receiver testing should provide a full evaluation of the receiver as a whole. Receiver sub-blocks and components must also be tested in order to understand their contributions to overall receiver performance. Where multiple receivers are used, each receive chain must be tested separately before attempting to verify the performance of the receivers with multiple antennas. The same principles apply to both FDD and TDD access modes. Figure 1 shows a simplified block diagram of a typical LTE handset radio.
Because the receivers in modern user equipment (UE) are highly integrated, incorporating many integrated circuits (ICs) that perform multiple functions, testing can be difficult: there are only a limited number of locations at which to inject and detect the various signals that emulate an LTE system.
For evaluating individual components in an LTE receiver, open-loop testing, where the receiver under test does not send feedback information to the source, is usually adequate to characterize the performance of these components and serve as a first step in validating the demodulation algorithms in the baseband section. But full verification of the overall receiver performance under real-world conditions requires closed-loop testing through a channel with fading characteristics. As part of closed-loop testing, lost packets are retransmitted using incremental redundancy based on real-time packet acknowledgment. Feedback is also a part of the modulation and coding schemes used in LTE receivers, and the testing must reflect the dynamic nature of these signal characteristics. The feedback for modulation and coding may be optimized for sub-bands within the overall channel bandwidth to enable frequency-selective scheduling.
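The closed-loop retransmission behavior described above can be sketched abstractly. In the toy model below, every retransmission adds a new redundancy version that raises the cumulative probability of successful decoding; all names and probability values are illustrative, not LTE-specified:

```python
import random

random.seed(0)

def harq_transmit(decode_prob_per_rv=(0.5, 0.75, 0.9, 0.99), max_tx=4):
    """Return the number of transmissions a packet needed. Each
    retransmission sends a new redundancy version, so the receiver's
    decode probability rises with each attempt (incremental redundancy)."""
    for tx in range(max_tx):
        if random.random() < decode_prob_per_rv[tx]:
            return tx + 1          # receiver sends an ACK
        # receiver sends a NACK; next redundancy version is transmitted
    return max_tx

counts = [harq_transmit() for _ in range(10_000)]
print(f"mean transmissions per packet: {sum(counts) / len(counts):.2f}")
```

A receiver test that exercises this loop must track the acknowledgments in real time, which is why open-loop stimulus alone cannot fully verify the receiver.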
In many modern communications systems, including LTE systems, the downconverted intermediate-frequency (IF) signal is usually digitized by an analog-to-digital converter (ADC) and then fed to the baseband section for demodulation and decoding. Analyzing the output of the ADC poses a challenge because that output is in the digital domain. One solution is to capture the digital data directly using a logic analyzer. While most logic analyzer applications are not focused on generating RF metrics, the Agilent 89601A Vector Signal Analysis software from Agilent Technologies, which can be run on a number of the firm's spectrum analyzers, logic analyzers, and oscilloscopes, offers a unique way to analyze an LTE system's ADC performance. It does so by performing traditional RF measurements on the digital data, giving a designer the ability to quantify the ADC's contributions to the performance of an LTE or other wireless communications system.
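As a rough illustration of applying an RF-style measurement to captured digital data, the sketch below quantizes a downconverted IF tone into hypothetical 12-bit ADC codes and locates the tone with a windowed FFT. All parameters (sample rate, IF frequency, resolution) are illustrative, and this is not a description of the 89601A's internal algorithm:

```python
import numpy as np

# Hypothetical ADC capture: 12-bit samples of a downconverted IF tone
fs, f_if, n = 30.72e6, 5.0e6, 4096
t = np.arange(n) / fs
analog = 0.9 * np.sin(2 * np.pi * f_if * t)
codes = np.round(np.clip(analog, -1, 1) * 2047).astype(int)  # 12-bit quantizer

# RF-style measurement on digital data: windowed FFT power spectrum
win = np.hanning(n)
spec = np.fft.rfft(codes * win)
pwr_db = 20 * np.log10(np.abs(spec) / np.abs(spec).max() + 1e-12)

peak_bin = int(np.argmax(pwr_db))
print(f"tone detected at {peak_bin * fs / n / 1e6:.2f} MHz")
```

From the same capture one could also estimate spurs and noise floor, quantifying the ADC's contribution to system-level metrics.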
An LTE base station receiver faces many of the same MIMO challenges, but in addition has to receive data simultaneously from multiple users. From the point of view of multi-user MIMO (MU-MIMO), each signal comes from a separate UE unit, and therefore has a completely independent channel, with different power levels and different timing.
Figure 2 shows the demodulated signals from a single frame of an LTE signal. The channel was flat-faded (no frequency selectivity). It is clear from the two layers of the MIMO signal shown at the top of Fig. 2 that the constellation on the left is tighter, which would result in a lower bit error rate (BER). In a closed-loop system, if the channel characteristics are known (i.e., the UE sends channel state information to the eNB), the mismatch in performance can be dealt with either by loading the better-performing layer with a higher-order modulation or by applying precoding to equalize the performance of the two layers, as seen in the lower plots.
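The precoding idea can be sketched numerically. The example below uses SVD-based precoding to diagonalize a hypothetical 2x2 flat-fading channel; LTE actually selects a precoder from a standardized codebook based on UE feedback, so this is only an illustration of the principle:

```python
import numpy as np

# Hypothetical 2x2 flat-fading channel with one strong and one weak layer
H = np.array([[1.0, 0.2],
              [0.15, 0.4]], dtype=complex)

# SVD-based precoding (illustrative; LTE chooses W from a codebook)
U, s, Vh = np.linalg.svd(H)
W = Vh.conj().T            # precoding matrix applied at the eNB

# With precoding, the effective channel is diagonalized: H @ W = U @ diag(s)
H_eff = H @ W
layer_gains_db = 20 * np.log10(s)
print("per-layer gains (dB):", layer_gains_db)
```

The per-layer gains expose the mismatch directly; the scheduler can then load the stronger layer with higher-order modulation, or the precoder/power allocation can be chosen to balance the layers.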
LTE TRANSMITTER TESTING
Another requirement for many modern wireless communications systems, including LTE, is the measurement of both broadband and narrowband power. Due to the nature of the downlink and uplink signal characteristics, LTE power measurements typically involve detecting power levels down to the resource element (RE) level, which is one orthogonal frequency division multiple access (OFDMA) or single-carrier frequency-division multiple access (SC-FDMA) symbol lasting 66.7 µs on one subcarrier. For such measurements, a spectrum or signal analyzer or vector signal analyzer (VSA) is essential. Power measurements associated with specific portions of the signal often require the digital demodulation capabilities of VSAs.
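Once the signal is demodulated into a resource grid, a per-RE power measurement reduces to squaring the magnitude of each grid entry. A minimal sketch, assuming a hypothetical demodulated grid of 14 symbols by 72 subcarriers (roughly one 1.4-MHz LTE subframe's worth of REs):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical demodulated OFDMA resource grid: 14 symbols x 72 subcarriers
grid = (rng.standard_normal((14, 72)) +
        1j * rng.standard_normal((14, 72))) / np.sqrt(2)

# Power of each resource element (one subcarrier for one 66.7-us symbol)
re_power = np.abs(grid) ** 2

# Narrowband vs broadband views: peak per-RE power and total channel power
total_power_db = 10 * np.log10(re_power.sum())
print(f"peak RE power: {re_power.max():.3f}, total: {total_power_db:.2f} dB")
```

Summing REs over different subsets of the grid yields the intermediate quantities (per-symbol power, per-subcarrier power, channel power) that a VSA reports.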
Careful design approaches are needed to control an LTE transmitter's edge-of-band characteristics, trading off out-of-band attenuation against in-channel performance. This trade-off must balance costs (component cost, power efficiency, physical space, etc.) with optimization of the in-channel and out-of-band performance. Requirements for LTE out-of-channel emissions are covered by the adjacent-channel-leakage-ratio (ACLR) and spectral-emission-mask (SEM) measurements, as is the case for testing Universal Mobile Telecommunications System (UMTS) equipment. These measurements are generally made with spectrum or signal analyzers using built-in test routines.
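ACLR itself is conceptually simple: the ratio of mean power in the wanted channel to the power in an adjacent channel. A toy numerical sketch with an idealized power spectral density (the leakage floor, channel spacing, and bandwidths are illustrative, not taken from the LTE specifications):

```python
import numpy as np

fs, bw, n = 61.44e6, 18e6, 1 << 16
# Hypothetical transmit spectrum: flat in-channel power plus a small
# out-of-channel leakage floor (a stand-in for PA distortion products)
f = np.fft.fftshift(np.fft.fftfreq(n, 1 / fs))
psd = np.where(np.abs(f) < bw / 2, 1.0, 1e-4)   # linear power density

def band_power(psd, f, f_center, bw):
    """Integrate the PSD over one channel bandwidth."""
    sel = np.abs(f - f_center) < bw / 2
    return psd[sel].sum()

main = band_power(psd, f, 0.0, bw)
adj = band_power(psd, f, 20e6, bw)              # adjacent channel at +20 MHz
aclr_db = 10 * np.log10(main / adj)
print(f"ACLR: {aclr_db:.1f} dB")
```

A signal analyzer's built-in ACLR routine performs the same integration on the measured spectrum, for both adjacent channels and at the standardized offsets.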
OFDMA signals can exhibit a high peak-to-average power ratio (PAPR), requiring power amplifiers for eNB units to have a high degree of linearity to avoid producing out-of-channel distortion products. Power amplifiers with high linearity, such as Class A designs, tend to be expensive and lacking in power efficiency. Two complementary methods exist to counteract this challenge: crest factor reduction (CFR), which attempts to limit the signal peaks before they reach the amplifier, and predistortion, which attempts to match the signal to the nonlinear characteristics of the amplifier. OFDM signals without CFR have RF power characteristics similar to those of additive white Gaussian noise (AWGN), with peak power excursions more than 10 dB above the average power level. Careful use of CFR can substantially reduce peak power requirements while maintaining acceptable signal quality.
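The effect of simple hard-clipping CFR on PAPR can be demonstrated on a noise-like complex baseband signal. The 6-dB clip target below is arbitrary, and practical CFR also filters the clipped signal to control the spectral regrowth that clipping introduces:

```python
import numpy as np

rng = np.random.default_rng(1)

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# OFDM-like signal: the sum of many subcarriers approaches complex Gaussian
x = (rng.standard_normal(65536) + 1j * rng.standard_normal(65536)) / np.sqrt(2)

# Simple hard-clipping CFR: limit peaks to 6 dB above the rms level
clip_level = np.sqrt(np.mean(np.abs(x) ** 2)) * 10 ** (6.0 / 20)
y = np.where(np.abs(x) > clip_level, clip_level * x / np.abs(x), x)

print(f"PAPR before CFR: {papr_db(x):.1f} dB, after: {papr_db(y):.1f} dB")
```

The AWGN-like signal shows peaks more than 10 dB above average, while the clipped version is held near the 6-dB target; the cost is in-channel distortion (EVM) and spectral regrowth, which sets the practical limit on how hard CFR can be applied.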
Predistortion enables the use of amplifier technologies that are both more power-efficient and less costly, although it also adds design and operational complexity. Predistortion maintains in-channel performance while operating in an amplifier's nonlinear region for improved efficiency, minimizing signal compression so that out-of-channel performance does not degrade at the higher operating level. A number of techniques are available, from analog predistortion to feedforward techniques and the full adaptive digital predistortion used in the latest generations of power-efficient base stations. Amplifiers and transmitters using adaptive digital predistortion require test capability for both digital input signals and RF output signals.
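A memoryless polynomial model shows the principle. Below, a hypothetical PA with mild third-order compression is linearized by a first-order inverse of that nonlinearity; real adaptive digital predistortion estimates the PA model from a feedback path and handles memory effects rather than assuming a fixed cubic model:

```python
import numpy as np

def amplifier(x, a3=-0.1):
    """Hypothetical memoryless PA with mild third-order compression."""
    return x + a3 * np.abs(x) ** 2 * x

def predistort(x, a3=-0.1):
    """First-order inverse of the cubic nonlinearity above: expand the
    signal so the PA's compression is (approximately) cancelled."""
    return x - a3 * np.abs(x) ** 2 * x

x = np.linspace(0, 1, 101) * np.exp(1j * 0.3)   # ramp of complex samples
y_raw = amplifier(x)
y_dpd = amplifier(predistort(x))

# Residual error with predistortion is much smaller than the raw compression
err_raw = np.max(np.abs(y_raw - x))
err_dpd = np.max(np.abs(y_dpd - x))
print(f"max error raw: {err_raw:.4f}, with DPD: {err_dpd:.4f}")
```

Even this crude first-order inverse cuts the peak compression error substantially, which is why the PA can be driven harder (more efficiently) without degrading out-of-channel performance.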
Analysis of MIMO signals must be a multiple-step procedure to ensure that the root cause of any LTE transmitter problem can be found. When complex digitally modulated signals are to be verified and optimized, it is tempting to go directly to advanced digital demodulation measurements using vector signal analysis. However, it is usually more productive and sometimes necessary to follow a verification sequence that begins with basic spectrum measurements and continues with vector measurements (combined frequency and time) before switching to digital demodulation and modulation analysis, as shown in Fig. 3.
While many transmitter measurements are a straightforward matter of connecting the transmitter RF output directly to the input port of an RF signal analyzer and measuring signal characteristics and content, some measurements will require connecting, probing, and measuring at early or intermediate points in the transmitter signal chain. Figure 4 shows a typical transmitter block diagram and the possible ways in which signals can be injected or probed at different points. Analysis begins by performing a series of RF spectrum measurements (Fig. 5, top) followed by vector signal measurements with an Agilent 89600 VSA (Fig. 5, bottom).
The vector signal measurements (Fig. 6) show (left to right) complementary cumulative distribution function (CCDF) for quadrature phase-shift keying (QPSK), 16-state quadrature amplitude modulation (16QAM), 64-state QAM (64QAM), and the AWGN reference curve. The next step in the analysis is to perform digital demodulation. In Fig. 6, trace B shows power versus frequency from a single Fast Fourier Transform (FFT), trace D shows the error summary, trace A shows the in-phase/quadrature (I/Q) constellation, (which shows that the analyzer has successfully locked to and demodulated the signal), trace C shows the error vector spectrum, and trace E shows error vector magnitude (EVM) in the time domain as a function of the symbol rate.
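The EVM metric in trace E is computed as the rms error vector relative to the rms reference. A minimal sketch using a hard-decision QPSK reference (the impairment model and its level are arbitrary stand-ins for real transmitter impairments):

```python
import numpy as np

rng = np.random.default_rng(2)

# Ideal QPSK reference constellation points
ref = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

# Hypothetical received symbols: ideal points plus impairment noise
tx = ref[rng.integers(0, 4, 1000)]
rx = tx + 0.05 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))

# Hard-decision reference: nearest ideal point to each received symbol
decided = ref[np.argmin(np.abs(rx[:, None] - ref[None, :]), axis=1)]

# EVM: rms error vector magnitude relative to rms reference power
evm_rms = np.sqrt(np.mean(np.abs(rx - decided) ** 2) /
                  np.mean(np.abs(decided) ** 2))
print(f"EVM: {100 * evm_rms:.2f}%")
```

Plotting `np.abs(rx - decided)` against the symbol index gives the EVM-versus-time view, and an FFT of the error vector gives the error vector spectrum of trace C.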
In the case of MIMO transmissions, it may be possible to isolate and measure each transmitter separately, while it may also be that the only access point is to the coupled, precoded signals. In either case, the test configuration should be chosen to give the most accurate results possible. The table shows which measurements require single or dual analyzer inputs.
Because of the complexity of modern wireless communications systems, software simulation tools help avoid costly hardware design iterations and speed the design process. To streamline the design process, it is critical that the software simulation tools can work seamlessly with commercial test equipment and test software. Many modern computer-aided-engineering (CAE) software design suites work with the essential test instruments needed for LTE transmitter and receiver testing, including most commercial VSAs and VSGs. As an example, Fig. 7 shows an LTE transmitter and receiver with a faded MIMO channel as modeled by the Agilent SystemVue system-level simulation software.
Combining simulation with test offers a number of benefits. An example is to create a MIMO dual-transmitter source and perform coded BER measurements on a complete MIMO dual-receiver and baseband combination. The transmitter payload can be digital or analog I/Q data, combined with control and precoding, with real-time error analysis provided by comparing the receiver data output with the sent data. Stress-testing the receivers by applying known fading and channel coupling scenarios while measuring real-time BER builds confidence that the design will work under real-world conditions. Figure 8 shows the block diagram of a system designed to perform receiver stress testing using signals with known fading and channel-to-channel coupling characteristics.
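The real-time error analysis amounts to comparing the known transmitted bits against the receiver's output. A minimal sketch (the function name, payload size, and 1% error level are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

def measure_ber(tx_bits, rx_bits):
    """Compare the bits recovered by the receiver under test against the
    known transmitted payload and return the bit error rate."""
    tx_bits = np.asarray(tx_bits)
    rx_bits = np.asarray(rx_bits)
    return np.count_nonzero(tx_bits != rx_bits) / tx_bits.size

# Hypothetical stress test: known payload, receiver output with ~1% errors
payload = rng.integers(0, 2, 100_000)
received = payload ^ (rng.random(100_000) < 0.01)

print(f"measured BER: {measure_ber(payload, received):.4f}")
```

In a real stress test the payload is sent through the faded, coupled MIMO channel model and the receiver's decoded output is streamed back for this comparison, so BER can be tracked as the channel scenario changes.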