Calibration Can Make All the Difference


During lunch recently with Ulrich Rohde of Synergy Microwave Corp., the topic of testing different components came up—in particular low-noise oscillators. Customers and visitors to Synergy know that the company is as well equipped with test gear as any small company in the RF/microwave industry, even to the extent of installing a Faraday cage for measurements isolated from the plethora of electromagnetic interference (EMI) and radio-frequency interference (RFI) around us. But Ulrich pointed out that having all the test equipment was no advantage unless the equipment was properly calibrated.

Instruments that are out of calibration will provide inaccurate test results. In production testing, this can result in poorly performing components seemingly meeting their performance requirements, then being shipped to a soon-to-be-unhappy customer. It can also result in perfectly good products apparently failing to meet their required specifications and being subjected to unnecessary rework.

Calibration is a means of ensuring that a test instrument is performing within its required limits, typically by comparing it to a reference standard or a similar instrument that is at least 10 times more accurate than the instrument under calibration. Calibrations of certain types of analyzers, like microwave vector network analyzers (VNAs), can also be performed with calibration kits available from a number of suppliers. These kits typically provide the passive components to perform a short-open-load-through (SOLT) calibration of a VNA prior to making measurements in the laboratory or on the production floor, but they do not equate to the instrument calibration performed by qualified test labs.

There is no single acceptable calibration cycle, such as an annual calibration. It depends on the instrument and the schedule that the instrument’s manufacturer has recommended. Newer test instruments include self-calibration functions that provide a certain amount of correction for components within a test instrument, such as oscillators, that will drift over time. Such functions can extend the required calibration cycle of an instrument, within the limits of the self-calibration circuitry’s correction range. One excellent resource is “Setting and Adjusting Instrument Calibration Intervals,” an application note from Keysight Technologies.
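One way to reason about an interval, under a deliberately simplified assumption of linear drift, is to estimate how long an internal component such as an oscillator can age before it alone consumes the instrument's accuracy budget. The numbers below are illustrative, not vendor data, and a real interval would also include guard bands and the manufacturer's recommendations:

```python
# Hedged sketch: days until linear drift alone would exhaust a tolerance,
# as one input to choosing a calibration interval. Illustrative numbers only.

def max_interval_days(tolerance_ppm: float, drift_ppm_per_day: float) -> float:
    """Days until steady drift at drift_ppm_per_day uses up tolerance_ppm."""
    return tolerance_ppm / drift_ppm_per_day

# A hypothetical 1-ppm spec with 0.002 ppm/day aging allows roughly
# 500 days before drift alone would exhaust the budget.
print(max_interval_days(1.0, 0.002))  # 500.0
```

Self-calibration routines effectively reset part of that drift budget between full calibrations, which is why they can stretch, but not eliminate, the calibration cycle.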

Some test equipment owners may feel that calibration is a “necessary evil” and that the test gear should have been better designed and built in the first place to eliminate the need for calibration cycles. Companies testing instruments used to qualify products for military customers may require calibration to ANSI/NCSL Z540-1/MIL-STD-45662A requirements and may bemoan the time that their instruments are away due to the need for calibration. Such calibration procedures are quite detailed and must be traceable to standards set by the National Institute of Standards and Technology (NIST) for a particular type of instrument. For most companies, this is best left in the hands of firms specializing in such calibration services.

Admittedly, the need for calibration may remove an invaluable instrument like a VNA or digital storage oscilloscope (DSO) from a “well-oiled” production line. But the time spent in calibration is time well spent compared to time lost to unnecessary rework on perfectly good products, and the possibility of shipping poorly performing products to a customer, with the resulting loss of company credibility. Consider it time invested in peace of mind.



What's Measuring Progress?

Blogs focusing on test & measurement.
