A Timing Mismatch Background Calibration Algorithm With Improved Accuracy

2021 
This brief presents a novel timing mismatch background calibration algorithm for time-interleaved (TI) analog-to-digital converters (ADCs). It can calibrate an arbitrary number of channels with an arbitrary input frequency, and it improves calibration accuracy by applying autocorrelation functions with an expanded interval. In addition, the proposed algorithm effectively prevents small derivative values in the correlation difference from degrading the skew estimation accuracy. Compared to prior calibration works, this work achieves at least five times better detection accuracy when the input frequency is close to the Nyquist frequency, without the need to compute high-order statistics. Finally, we simulate a four-channel 12-bit TI ADC with non-ideal effects added. Simulation results show that the proposed algorithm increases the signal-to-noise-and-distortion ratio (SNDR) and spurious-free dynamic range (SFDR) from 35.5 and 40.0 dB to 63.3 and 84.6 dB, respectively, when the input frequency is close to the Nyquist frequency.
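To illustrate the class of technique the abstract describes, the sketch below shows a generic correlation-based timing-skew detector for a two-channel TI ADC. This is not the paper's algorithm (the expanded-interval autocorrelation and derivative-protection steps are not reproduced); it is a minimal illustration, assuming a single sinusoidal input of known frequency `f_in` and a known channel count, of how a difference between per-channel lag-1 correlations reveals the skew.

```python
import numpy as np

fs = 1.0          # normalized sampling rate of the interleaved output
N = 1 << 16       # number of interleaved samples
f_in = 0.23       # input tone frequency (cycles per sample), below Nyquist
skew = 0.002      # hypothetical timing skew of channel 1, in sample periods

# Sample times: channel 1 (odd indices) is offset by the skew.
t = np.arange(N, dtype=float)
t[1::2] += skew
x = np.sin(2 * np.pi * f_in * t)

# Lag-1 products, split by which channel supplies the later sample.
r = x[1:] * x[:-1]
r_ch1 = r[0::2].mean()   # products ending on a channel-1 (skewed) sample
r_ch0 = r[1::2].mean()   # products ending on a channel-0 sample

# For a sine, E[x(t)x(t-tau)] = 0.5*cos(2*pi*f_in*tau), so the
# correlation difference is sin(w)*sin(w*skew) ~ w*sin(w)*skew
# to first order, where w = 2*pi*f_in.
w = 2 * np.pi * f_in
skew_est = (r_ch0 - r_ch1) / (w * np.sin(w))
```

In a background calibration loop this estimate would drive an adaptive correction (e.g., a fractional-delay filter) rather than being computed in one shot; the single-tone, known-frequency assumption here is only for demonstration.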