Today, several instrument vendors are offering high-performance PXI RF test equipment. In a world that was once dominated by traditional rack-and-stack instruments, this industry shift raises the question: “Why?” More specifically, we should ask why engineers are using PXI for their RF test needs, and why this change occurred. I would like to take a stab at answering these questions while providing the context for why PXI has become a de facto standard for RF instrumentation.

Fundamentally, a PXI system is just a personal computer (PC), but with a few notable distinctions. If you think back to your old desktop tower with PCI cards for various functions, you’re not too far off. In a PXI system, “instruments” or “modules” effectively have a PCI or PCIe interface to a PC. Thus, whenever you want to make a measurement with a PXI instrument, you’ll use either a graphical user interface (GUI) or an application programming interface (API) from the PC environment to control your instrument. For example, if you’re making vector-based measurements like Long-Term Evolution (LTE) demodulation, the PCI bus will pass a large array of in-phase/quadrature (I/Q) data to the PC (called the PXI controller). That PC, in turn, will apply the appropriate signal-processing algorithms to return a measurement result.
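To make the programming model concrete, here is a minimal sketch of that division of labor, with invented names throughout: `fetch_iq` stands in for a driver call that would stream I/Q samples over PCIe (here it just synthesizes a noisy tone), while the measurement itself runs on the controller’s CPU.

```python
import numpy as np

def fetch_iq(n_samples, fs=10e6, f_tone=1e6):
    """Stand-in for a PXI driver call that DMAs I/Q samples to the host.
    Here we simply synthesize a complex tone plus a little noise."""
    t = np.arange(n_samples) / fs
    rng = np.random.default_rng(0)
    noise = 0.01 * (rng.standard_normal(n_samples)
                    + 1j * rng.standard_normal(n_samples))
    return np.exp(2j * np.pi * f_tone * t) + noise

# The "measurement" is ordinary host-side math, not firmware in the module:
iq = fetch_iq(4096)
power_db = 10 * np.log10(np.mean(np.abs(iq) ** 2))
print(f"mean power: {power_db:.2f} dB")
```

The point of the sketch is the architecture, not the math: the module only moves raw samples; everything after the fetch is software on the controller.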

One of the interesting facets of the RF instrumentation world is that nearly all instruments—including rack and stack—have transitioned to using a PC environment. In fact, many off-the-shelf spectrum analyzers are merely PCs running a version of Windows Embedded software. Yet it should be noted that instrument control actually looks a little different when you compare the PXI and rack-and-stack approaches. In the latter, an instrument control system actually uses multiple PCs. One master PC sends SCPI commands over the General Purpose Interface Bus (GPIB), Universal Serial Bus (USB), or local area network (LAN) to the other PCs, which are instruments. The paradigm is slightly different in the PXI environment, where a single PC is used to sequence commands and compute measurements.
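For readers who haven’t worked with SCPI, a typical command sequence looks like the sketch below. The command tree shown is generic spectrum-analyzer syntax rather than any specific vendor’s, and the VISA lines are commented out because they require real hardware.

```python
# Typical SCPI sequence a master PC might send to a rack-and-stack
# analyzer over GPIB, USB, or LAN (command tree is illustrative).
commands = [
    "*RST",                    # reset the instrument to a known state
    ":SENS:FREQ:CENT 1.95E9",  # center frequency: 1.95 GHz
    ":SENS:BAND:RES 30E3",     # resolution bandwidth: 30 kHz
    ":INIT:IMM",               # trigger a sweep
]

# With real hardware, these would go through a VISA session, e.g.:
#   import pyvisa
#   inst = pyvisa.ResourceManager().open_resource("GPIB0::18::INSTR")
#   for cmd in commands:
#       inst.write(cmd)

for cmd in commands:
    print(cmd)
```

Note that in this model the instrument computes its own results; the master PC only sequences commands and reads answers back, which is exactly the split that PXI collapses into a single machine.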

Think of it as if you’re deploying your “test executive software” on the instrument itself. While the incarnation of the PC is a little different between the two approaches, one fundamental fact remains: Modern instruments are using PC technology. Given this fact, it’s worth explaining why.

The Need For Speed

Over the past decade, changes in the ways that engineers are using RF test equipment have largely influenced the move to PC-based technology for RF instruments. In the world of wireless standards, we’ve observed an increasingly rapid onslaught in the adoption of new standards. As wireless standards continue to use more complex modulation techniques and wider bandwidths, the signal-processing algorithms required to test these devices also are increasing in complexity. If you look at the number of mathematical operations required to demodulate and measure an LTE-Advanced signal, it is several orders of magnitude more intensive than a second-generation (2G) GSM signal. As a result, measurement times of wireless signals are primarily driven by how capable an instrument is at executing a signal-processing algorithm.

To illustrate this point, consider the timeline of a basic Wideband Code Division Multiple Access (WCDMA) error vector magnitude (EVM) measurement. This measurement, along with Adjacent Channel Leakage Ratio (ACLR), is one of the two fundamental measurements that engineers might use when testing a third-generation (3G) cellular PA. When measuring EVM, an RF vector signal analyzer is required to acquire a sufficient burst of data. It then executes a demodulation and measurement algorithm.

If one were to configure an instrument to measure EVM, this process would require three basic steps: instrument setup, signal acquisition, and actual measurement. For setup, the instrument might be required to change attenuation or frequency settings. Today, some of the fastest-settling RF front ends can accomplish both in slightly less than 1 ms. In signal acquisition, the instrument must acquire a record of I/Q data that is sufficient to enable demodulation of the signal. For WCDMA, this acquisition time is approximately 667 μs—again, less than 1 ms.

Finally, the signal analyzer must demodulate and measure the signal. Today, the fastest PC-based algorithms can accomplish this task in about 18 ms (assuming an EVM-QPSK measurement). When you add the totals, it’s worth noting that the “signal-processing” portion of the measurement consumes approximately 90% of the measurement time. That number would be even higher for a more complex measurement, such as LTE EVM.
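The time budget above is worth checking with the numbers in hand. Using the figures from the text (1 ms setup, 667 μs acquisition, 18 ms of signal processing):

```python
# Illustrative time budget for a WCDMA EVM measurement,
# using the figures quoted in the text.
setup_ms = 1.0      # attenuation/frequency settling
acquire_ms = 0.667  # one WCDMA slot of I/Q data
dsp_ms = 18.0       # host-side demodulation and EVM computation

total_ms = setup_ms + acquire_ms + dsp_ms
dsp_share = dsp_ms / total_ms
print(f"signal processing: {dsp_share:.0%} of {total_ms:.1f} ms total")
```

The signal-processing step dominates at a bit over 90% of the total, which is why a faster CPU, rather than a faster RF front end, is the lever that most shortens this measurement.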

The growing need to increase measurement speed has been one of the primary motivations for vendors to offer instruments in a PXI form factor. In PXI, the instrument front end can be coupled with the highest-performance PC technology. For example, data transfer from an analog-to-digital converter (ADC) to a processing unit is conducted over PCI Express. In addition, the signal-processing algorithm itself can execute on a multi-core CPU. The benefit of this architecture is twofold. First, the PC is already capable of high-performance signal processing, and at a reasonably low cost. Second, the modularity of the PXI form factor also offers upgradability. Because the “PC portion” of the PXI system is interchangeable, engineers can upgrade their instruments over time. As CPUs continue to improve in performance and capability, the engineer has access to increasingly faster measurement speeds simply by swapping out his or her PXI controller.

Emphasis On Software

While the ability to harness Moore’s Law is one of the biggest benefits of PXI for RF measurements, it’s certainly not the only one. Over the years, we’ve observed an increasing emphasis on software in the instrumentation world. In the old days, a spectrum analyzer produced a display from entirely analog hardware. Today, however, engineers are doing more with their RF signal analyzers. For that reason, the majority of today’s RF spectrum analyzers are actually “vector-based” instruments with I/Q sampling capabilities.

In fact, RF signal analyzers (basically I/Q digitizers) apply a broad range of signal-processing capabilities to an equally broad range of applications. If an engineer wants to update an RF signal analyzer to test the latest wireless standard, for example, installing a new software package is a simple option. On a similar note, say an engineer wants to measure the linearity of a radar pulse. Acquiring the I/Q data and demodulating the phase response is a simple mathematical operation that can be accomplished in software. Even if an engineer wants to solve a traditional measurement problem (like a spur sweep), a vector-based RF signal analyzer can produce the necessary display. Internally, the instrument simply applies a complex Fast Fourier Transform (FFT) to sampled I/Q data and then displays the spectrum result.
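That FFT-based spectrum display can be sketched in a few lines. The sample rate, record length, and tone offset below are illustrative choices, and a real analyzer would add calibration, averaging, and detector modes on top of this core operation:

```python
import numpy as np

fs = 10e6        # complex I/Q sample rate, 10 MS/s (illustrative)
n = 4096         # FFT record length
f_tone = 1.25e6  # tone offset from the analyzer's center frequency

t = np.arange(n) / fs
iq = np.exp(2j * np.pi * f_tone * t)  # ideal complex tone

# Window, FFT, and shift so DC (the RF center frequency) sits mid-spectrum
win = np.hanning(n)
spectrum = np.fft.fftshift(np.fft.fft(iq * win))
mag = np.abs(spectrum)
power_db = 20 * np.log10(mag / mag.max())
freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1 / fs))

peak_offset = freqs[np.argmax(power_db)]
print(f"peak at {peak_offset / 1e6:.2f} MHz from center")
```

Because the display is just math on sampled I/Q data, swapping the spur sweep for LTE demodulation or radar pulse analysis is a software change, not a hardware one.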

Engineers can solve an increasingly broad range of applications through software manipulation of I/Q data. The flexibility of software-defined PXI RF instruments is therefore an attractive option. In fact, the movement toward “synthetic instrumentation” in the aerospace/defense industry several years ago was entirely based on the idea that a general-purpose RF front end could be redefined to solve various measurement problems through software.

Miniaturization of RF Components

As we’ve talked through several motivations that might cause engineers to want PC-based instrumentation, you may be wondering what has motivated the movement to PXI specifically. Over the past few decades, the communications industry has driven the development of higher-performance ADCs, DACs, and even RF front ends. Each of these “ingredients” of an RF instrument has continued to improve in RF performance, cost, and size over the past decade. Starting several years ago, a best-in-class RF signal analyzer could be designed with high-performance, off-the-shelf components—and in a smaller footprint than ever before. Thus, as RF front-end technology for best-in-class instruments continued to shrink in size, the ability to produce these products in PXI became increasingly attractive.

(As an aside, I think it’s important to note that the existence of off-the-shelf RF and microwave components has leveled the playing field for test vendors to design increasingly high-performance RF instrumentation. A decade ago, test vendors were highly dependent on proprietary technology, such as custom application-specific integrated circuits (ASICs) and microcircuits, to deliver increasingly high-performance instrumentation. Today, however, many of these technologies are available off-the-shelf. As a result, an increasingly broad range of vendors can build high-performance instruments. The new “instrumentation race” has become more about software than RF front-end performance.)

Parting Thoughts

As we look forward to the next decade, I suspect that we’ll continue to see an ever-increasing number of PXI RF instruments available on the market. As I mentioned, this trend is the result of two factors: vendor motivation to provide lower-cost instruments in an increasingly competitive environment and the need to solve emerging measurement challenges, such as test time and software flexibility. I believe that we’ll continue to see the signal-processing capabilities of instrumentation grow over time. Long term, technologies like user-programmable field-programmable gate arrays (FPGAs) and digital signal processors (DSPs) will increasingly be integrated into RF signal analyzers to address the expanding measurement needs of engineers.

So next time you walk into the lab and see your colleague using a PXI RF signal analyzer instead of the old box, you’ll know why. After all, doesn’t everyone use PCs anyway?