Traditional methods of performing high-speed digital communications transmitter compliance testing can yield misleading results; newer approaches avoid many of their pitfalls.
Verifying the performance of a digital communications system was once easy. When entire networks were installed and owned by a single company, and the system worked, extensive testing of the subcomponents was unnecessary. But in this age of more complex networks, with components hailing from many different sources, proper compliance testing must ensure that system-level specifications are met when all components are connected. In particular, some new approaches can help evaluate transmitter performance in high-speed optical communications networks.
Basically, a communications system consists of a transmitter, a channel, and a receiver. The channel could be a metallic or fiber cable, free space (in a wireless system), or a large backplane printed-circuit board (PCB). Setting the specifications for each component starts at the system level. Once the overall system performance is known, the burden of performance is then allocated to the subcomponents. In many cases, the transmitter and receiver are contained within a single module, although each must be evaluated separately.
To set the transmitter specifications, it is important to know how good the receiver and channel will be, and vice versa for the receiver specifications. A starting point for this iterative process might use performance levels from an earlier-generation system. A new standard under development, however, will likely employ components that have yet to be fully developed, since performance levels usually make a significant jump in the next-generation system. This can present some difficulty: component developers may be reluctant to make large investments in new components until the standard, and the subsequent component specifications, become stable. The result can be a slow and tedious process as the standard and the components evolve together.
Nevertheless, a transmitter must be capable of operation with the lowest performing receivers and channels deemed acceptable. If a receiver needs a minimum level of power to achieve the system bit-error-ratio (BER) target, this will be used to determine the minimum output power of the transmitter. If the receiver can only tolerate a certain level of jitter, this will be used to define the maximum acceptable jitter from the transmitter. In a digital communications system, the receiver makes logic decisions on each incoming bit, so the shape of the transmitter waveform must also be specified.
Optical communications systems provide a good example for a transmitter test strategy. Key requirements for the transmitter stem from the receiver's need for a wide separation between logic 1's and logic 0's. Consistency in the time location of the transitions between logic 1's and 0's (low jitter) is also necessary. Together these help ensure that decisions are made where there is little chance for a mistake. An eye diagram is the common method to view the transmitter waveform. In the eye diagram, all the various combinations of data patterns are superimposed on one another on a common time axis, usually less than two bit periods in width. Good amplitude separation and low jitter are seen as an open eye diagram (Fig. 1). To verify the openness of the eye diagram, an "eye-mask" test is performed. Polygons are placed in and around the eye diagram, indicating where the waveform may not exist. Inadequate wave shapes are detected when the waveform crosses, or violates, the mask.
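At its core, an eye-mask test is a point-in-polygon check on the sampled waveform. A minimal sketch in Python follows; the hexagonal central-mask coordinates are illustrative placeholders, not taken from any actual standard:

```python
def point_in_polygon(pt, poly):
    """Ray-casting test: is point pt inside the polygon poly?"""
    x, y = pt
    inside = False
    for i in range(len(poly)):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % len(poly)]
        # Count edges crossed by a horizontal ray extending right from pt
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Illustrative central keep-out mask for a normalized eye
# (time and amplitude both mapped to the 0..1 range).
CENTER_MASK = [(0.25, 0.5), (0.4, 0.75), (0.6, 0.75),
               (0.75, 0.5), (0.6, 0.25), (0.4, 0.25)]

def mask_violations(samples, mask=CENTER_MASK):
    """Return the (time, amplitude) samples that fall inside the mask."""
    return [s for s in samples if point_in_polygon(s, mask)]
```

A waveform passes when `mask_violations` comes back empty for every acquired sample; real testers apply the same logic to the corner polygons above and below the rails as well.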
For many years, the concept of a reference receiver has been the foundation of standards-based optical transmitter test. A reference receiver is an optical-to-electrical converter with a fourth-order Bessel-Thomson frequency response. The –3-dB bandwidth is set to 75 percent of the transmission data rate. For example, if the transmitter is to operate at 10 Gb/s, the reference receiver bandwidth will be 7.5 GHz.
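The magnitude response of that filter can be sketched directly from the normalized fourth-order Bessel polynomial, H(s) = 105/(s⁴ + 10s³ + 45s² + 105s + 105), whose –3-dB point falls near 2.114 rad/s; rescaling that point to 75 percent of the data rate gives a simple software model of the reference receiver (a sketch, not a calibrated instrument response):

```python
def ref_receiver_mag(f_hz, data_rate):
    """|H(j*w)| of a 4th-order Bessel-Thomson low-pass whose -3-dB
    bandwidth is 75 percent of the data rate (the reference receiver)."""
    f3db = 0.75 * data_rate                  # e.g. 7.5 GHz for 10 Gb/s
    # Normalized Bessel polynomial theta_4(s) has its -3-dB point
    # at approximately 2.1139 rad/s, so scale frequency accordingly.
    s = 1j * 2.1139 * (f_hz / f3db)
    denom = s**4 + 10 * s**3 + 45 * s**2 + 105 * s + 105
    return abs(105 / denom)
```

Evaluating the model at 7.5 GHz for a 10-Gb/s rate returns roughly 0.707 (the –3-dB point), and the response rolls off smoothly above it, which is what makes the Bessel-Thomson shape attractive: minimal overshoot in the time domain.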
A reference receiver is used to provide a consistent technique for the analysis of the transmitter waveform. If the bandwidth of an oscilloscope system attenuates the frequency content of a signal, it will not provide a completely accurate representation of the signal waveform. In almost all cases, a bandwidth less than the data rate will significantly alter the shape of a digital communications waveform. In Fig. 1, a laser transmitter is measured with a 10-GHz optical oscilloscope and with an oscilloscope with a 1.25-Gb/s optical reference receiver (938-MHz bandwidth). When the signal is viewed in the eye-diagram format with the 10-GHz-bandwidth oscilloscope, significant overshoot and ringing are observed (a common phenomenon with high-speed laser transmitters). When the oscilloscope is configured as a reference receiver (938-MHz bandwidth), the high-frequency content of the signal is suppressed and the signal appears to be very well behaved.
The use of a reference receiver seems counterintuitive. It appears to clean up the true behavior of the laser and possibly make a bad device look good. This is where it becomes important to take a step back and remember what the intent of the testing is. It is not to precisely characterize the behavior of the laser. Rather, the test is intended to determine how well the laser will interoperate with a receiver in a real communications system. System receivers will not have infinite bandwidth. They will have just enough bandwidth to correctly differentiate a logic 1 from a logic 0. The ideal bandwidth for this is approximately 75 percent of the data rate! Thus, the reference receiver provides a good representation of the signal from the perspective of the system receiver. A reference receiver also provides consistency in test. If all such tests are performed with a specific measurement-system bandwidth, results should not vary from test system to test system.
In addition to the shape of the waveform, there are other important parameters that define a good transmitter. Signal strength is one of them. Every communications system will have noise. The signal must be strong enough that, when this noise is added to the transmitted bits, there is still a large separation between logic levels at the receiver decision circuit.
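That separation is commonly quantified with the standard Gaussian-noise Q-factor, Q = (μ₁ − μ₀)/(σ₁ + σ₀), from which the bit-error ratio follows as BER ≈ ½·erfc(Q/√2). A quick sketch:

```python
import math

def q_factor(mu1, mu0, sigma1, sigma0):
    """Separation between the mean logic levels, weighted by the
    noise standard deviation on each rail."""
    return (mu1 - mu0) / (sigma1 + sigma0)

def ber_from_q(q):
    """Gaussian-noise approximation of bit-error ratio from Q."""
    return 0.5 * math.erfc(q / math.sqrt(2))
```

A Q of about 7 corresponds to a BER in the neighborhood of 10⁻¹², a common system target; the relation makes plain why even a modest loss of level separation costs orders of magnitude in error rate.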
Early optically based high-speed communications systems often spanned large distances. Repeaters were required to overcome channel attenuation, so it was important to maximize the available power from laser transmitters. Consider a transmitter that sends logic 1's at a level of 1 mW and logic 0's at 0.1 mW, and another that sends 1's at 1.5 mW and 0's at 0.6 mW. The separation between logic levels is the same in both cases at 0.9 mW. However, the latter transmitter consumes significantly more power. One measure of communications efficiency is the ratio of the 1 level to the 0 level, called the extinction ratio (ER). The first laser has an ER of 10, while the second has an ER of 2.5. Thus, extinction ratio is used as an indicator of how well available laser power is converted to modulation power. High extinction ratios are usually achieved by forcing the 0 level close to a no-power state. Historically, it has been difficult to measure high extinction ratios with high accuracy due to the small signal levels involved and the imperfections of instrumentation. New calibration techniques have been developed to allow more accurate testing of high-ER transmitters.
As laser transmitter technology improved, lower-cost transmitters became available and high-speed optically based local-area networks (LANs) became practical. Due to the relatively short spans of such systems, laser efficiency is no longer critical. However, it is still important to maintain a good separation between 1's and 0's. This can be measured directly as the "optical modulation amplitude," or OMA. In the examples above, both lasers have OMA values of 0.9 mW, and for the LAN, both might be considered equally good.
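Both figures of merit fall straight out of the two rail powers. A minimal sketch, using the example values from the text:

```python
import math

def extinction_ratio(p1_mw, p0_mw):
    """Ratio of logic-1 optical power to logic-0 optical power (linear)."""
    return p1_mw / p0_mw

def extinction_ratio_db(p1_mw, p0_mw):
    """Extinction ratio expressed in decibels."""
    return 10 * math.log10(p1_mw / p0_mw)

def oma_mw(p1_mw, p0_mw):
    """Optical modulation amplitude: the separation between the rails."""
    return p1_mw - p0_mw
```

The two example lasers (1.0/0.1 mW and 1.5/0.6 mW) come out with ERs of 10 and 2.5 respectively but identical OMAs of 0.9 mW, which is exactly why a long-haul specification and a LAN specification can rank the same pair of transmitters differently.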
As speeds increased beyond 1 Gb/s, the timing stability of signals became more difficult to manage. Timing jitter, observed when signal edges are not consistently located, leads to decision circuits not making their decisions in the center of the bit. As edges drift toward what should be the ideal decision time, BER is degraded. Transmitter jitter must be controlled to manageable levels. Once again, the capability of the receiver dictates what this level should be.
Receivers require a clock to time the decision process. Many receivers derive this clock directly from the incoming data stream through some form of clock extraction circuit. The clock extraction process also provides some tolerance to transmitter jitter. The receiver clock extraction circuitry can track and follow jitter in the incoming data stream as long as the jitter is not too fast. Typically, if the rate of the jitter is within the loop bandwidth of the clock extraction circuit, the receiver will be able to tolerate the jitter.
If the receiver is tolerant of lower-rate jitter, it would not make sense to reject transmitters that have jitter at these low rates. In recent years, communications standards have been designed to account for this. The oscilloscope used to measure jitter is specified to have a high-pass jitter function so that test results are not impacted by the presence of low-frequency jitter. The easiest way to produce a jitter high-pass function is to derive the oscilloscope triggering from the observed waveform, similar to the way the receiver derives its decision-circuit clock. This is often referred to as "Golden PLL" testing; a clock-recovery circuit is built into the oscilloscope.
Figure 2 shows that when the test system employs the correct Golden PLL bandwidth, the test system mimics the system-level receiver and eliminates low-frequency jitter. Both measurements are of the same signal, but the lower waveform, measured using the clock-extraction circuit with the correct loop bandwidth, provides a better assessment of the waveform from the perspective of the receiver with which it will be paired.
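The effect of the Golden PLL can be mimicked with a first-order tracking loop: the recovered clock follows slow phase wander but cannot follow fast jitter, so only the fast component remains in the measurement. A sketch, where the loop gain `alpha` is an arbitrary illustrative value rather than any standardized loop bandwidth:

```python
def golden_pll_residual(phase_error, alpha=0.2):
    """First-order clock-recovery loop. Returns the jitter left over
    after the recovered clock tracks the input: slow phase wander is
    removed (a high-pass action), while fast jitter passes through
    to the measurement."""
    recovered = 0.0
    residual = []
    for p in phase_error:
        recovered += alpha * (p - recovered)  # loop tracks the input phase
        residual.append(p - recovered)        # what the decision circuit sees
    return residual
```

Feeding the loop a constant (effectively zero-frequency) phase offset drives the residual toward zero, while jitter that alternates every bit passes through nearly unattenuated, which is the high-pass behavior the standards intend.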
Efforts are currently under way to transmit signals as fast as 10 Gb/s over channels where dispersion dominates system performance. The modal dispersion for installed bases of multimode fiber with distances in the 200-to-300-m range can completely close the received eye. Intersymbol interference due to low bandwidth channels produces a similar result when transmitting a 10-Gb/s signal across a large backplane made of FR-4 type PCB material.
Advanced communications techniques will be required to overcome the large signal-dispersion problem (Fig. 3). Receivers will likely have equalization schemes to compensate for the impairments caused by the channel. However, this complicates the definition of what an acceptable transmitter might be. Eye-mask testing may prove meaningless if the eye at the output of the channel is closed regardless of the quality of the signal entering the channel.
One approach being used in IEEE 802.3aq (10-Gb/s transmission over FDDI-grade fiber) is to measure how much equalization is needed to correct a transmitter's response. The transmitter waveform is captured and then run through a virtual channel model that simulates actual fibers. This signal is then passed through a virtual finite-length equalizer. The quality of the equalized signal is compared with what the signal would have achieved through an ideal equalizer. The general theme for transmitter test is maintained: the transmitter is tested from the perspective of the receiver. In this case, however, the receiver is highly sophisticated, using both linear and decision-feedback equalizers with several signal taps. An example of a dispersed waveform and a virtual equalizer correction is shown in Fig. 3.
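The flow can be sketched as a short FIR "virtual channel" smearing each bit into its neighbors, followed by an adaptive linear equalizer trained by LMS. This is a simplified stand-in for the linear-plus-decision-feedback structure used in 802.3aq-style analysis, and the channel tap values are illustrative, not a measured fiber response:

```python
import random

def channel(bits, taps=(0.3, 1.0, 0.3)):
    """Virtual dispersive channel: FIR convolution spreading each
    symbol into adjacent bit periods (illustrative tap values)."""
    out = []
    for n in range(len(bits)):
        acc = 0.0
        for k, t in enumerate(taps):
            if n - k >= 0:
                acc += t * bits[n - k]
        out.append(acc)
    return out

def lms_equalize(rx, tx, ntaps=7, mu=0.02, delay=3):
    """Train a linear feed-forward equalizer with LMS against the
    known transmitted symbols; return the squared-error history."""
    taps = [0.0] * ntaps
    buf = [0.0] * ntaps
    err2 = []
    for n, x in enumerate(rx):
        buf = [x] + buf[:-1]                      # shift in the new sample
        y = sum(t * b for t, b in zip(taps, buf)) # equalizer output
        d = tx[n - delay] if n >= delay else 0.0  # delayed training symbol
        e = d - y
        taps = [t + mu * e * b for t, b in zip(taps, buf)]
        err2.append(e * e)
    return err2
```

Running a random ±1 pattern through the channel and equalizer shows the squared error collapsing as the taps adapt; the 802.3aq-style metric then compares that achievable quality against an ideal equalizer's.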
In summary, testing transmitters has evolved to accommodate the changes in system architecture and performance. The general approach of viewing signals from the perspective of the receiver has stood the test of time, but is becoming more complex as speeds increase and channels are pushed to operate well beyond their original expected performance.