Optimizing Calibration Substrates for Next-Generation On-Wafer Probe Measurements and Practical Comparison of Modal vs SOLR Calibration
The rapid adoption of cloud computing and AI has significantly increased the demand for high data rates, pushing communication technologies into the mm-wave and sub-THz bands. At these frequencies, accurate calibration becomes critical because on-wafer measurements are increasingly sensitive to probe coupling, radiation, environmental effects, and electromagnetic parasitics. To ensure accurate characterization of on-wafer devices, optimized calibration standards are essential to reduce losses and probe-to-wafer crosstalk during the calibration routine. The conventional 12-term error model effectively corrects single-ended measurements but is insufficient for differential systems, as it neglects mode conversion and the asymmetry between common-mode and differential-mode signals. In contrast, the 16-term calibration algorithm accounts for crosstalk terms and for mode conversion, both differential-to-common and common-to-differential, thereby capturing the complete error space of balanced measurements.
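For reference, the 16-term model treats the two error boxes and all leakage paths as a single four-port error adapter embedding the DUT; a standard way to write that relation (the symbols and 2x2 partitioning below are generic, not specific to this work) is

\[
S_{\mathrm{m}} \;=\; E_{11} + E_{12}\, S_{\mathrm{a}} \left( I - E_{22}\, S_{\mathrm{a}} \right)^{-1} E_{21},
\]

where \(S_{\mathrm{m}}\) is the measured S-matrix, \(S_{\mathrm{a}}\) the actual DUT S-matrix, and \(E_{11}, E_{12}, E_{21}, E_{22}\) the 2x2 partitions of the 4x4 error matrix; their off-diagonal entries carry the probe-to-probe crosstalk that the 12-term model omits. The differential- and common-mode quantities are then obtained from the single-ended S-parameters with the usual mixed-mode transformation

\[
S_{\mathrm{mm}} \;=\; M\, S\, M^{-1},
\qquad
M \;=\; \frac{1}{\sqrt{2}}
\begin{bmatrix}
1 & -1 & 0 & 0\\
0 & 0 & 1 & -1\\
1 & 1 & 0 & 0\\
0 & 0 & 1 & 1
\end{bmatrix},
\]

whose off-diagonal blocks \(S_{\mathrm{dc}}\) and \(S_{\mathrm{cd}}\) are the common-to-differential and differential-to-common conversion terms discussed above (a port pairing of (1,2)/(3,4) is assumed here).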