Broadband Mismatch Calibration for Time-Interleaved ADC Based on Linear Frequency Modulated Signal

2021 
Time-interleaved analog-to-digital converters can significantly increase the system sampling rate, but channel mismatches, such as gain mismatch, offset mismatch, and phase mismatch, introduce significant errors at the system output. This paper addresses the problem of broadband mismatch calibration for multichannel time-interleaved analog-to-digital converter systems using a foreground calibration technique. The calibration consists of two steps. First, all mismatches between the analog input and the digital output are measured using a linear frequency modulated signal as the reference. Next, a set of finite impulse response filters is computed from the measured mismatches to generate frequency-dependent compensation, so that the signal-to-noise-and-distortion ratio and the spurious-free dynamic range are no longer limited by the channel mismatches. Compared with previous methods, the proposed method has two advantages: the complexity of error estimation is reduced dramatically, and high-precision error estimation is achieved. To evaluate the performance, a series of experiments was conducted on a two-channel time-interleaved analog-to-digital converter system. The results show that the spurious-free dynamic range is improved by more than 10 dB.
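
To make the two-step procedure concrete, the following is a minimal Python sketch of a chirp-based foreground calibration for a two-channel time-interleaved ADC. It is not the authors' algorithm: the first-order skew model, the least-squares estimator, the filter length, and the restriction of the chirp to below the per-channel Nyquist rate are all illustrative assumptions; the paper's frequency-dependent FIR compensation targets the full broadband case.

```python
# Minimal sketch (assumptions, not the paper's exact algorithm): chirp-based
# foreground calibration of a two-channel time-interleaved ADC.
import numpy as np

fs = 1.0                      # aggregate sampling rate (normalized)
N = 4096                      # interleaved record length
t = np.arange(N) / fs

# Known LFM reference, restricted here to below the per-channel Nyquist rate
# (fs/4) so that a simple per-channel fractional-delay FIR is valid.
f0, f1 = 0.02 * fs, 0.20 * fs
k = (f1 - f0) / t[-1]

def chirp(tt):
    return np.cos(2 * np.pi * (f0 * tt + 0.5 * k * tt ** 2))

def chirp_deriv(tt):          # analytic derivative of the known reference
    return -2 * np.pi * (f0 + k * tt) * np.sin(2 * np.pi * (f0 * tt + 0.5 * k * tt ** 2))

# Simulated sub-ADC outputs: channel 0 is ideal, channel 1 carries gain,
# offset and timing-skew mismatch (values are arbitrary assumptions).
gain, offset, skew = 1.03, 0.01, 0.08 / fs
y0 = chirp(t[0::2])
y1 = gain * chirp(t[1::2] + skew) + offset

# --- Step 1: measure the mismatches against the known reference ------------
# First-order model y1 ~ g*x + (g*skew)*x' + o, solved by linear least squares.
x1, dx1 = chirp(t[1::2]), chirp_deriv(t[1::2])
A = np.column_stack([x1, dx1, np.ones_like(x1)])
(g_hat, gskew_hat, o_hat), *_ = np.linalg.lstsq(A, y1, rcond=None)
skew_hat = gskew_hat / g_hat

# --- Step 2: frequency-dependent compensation via an FIR filter ------------
# Gain and offset are memoryless corrections; the skew is undone with a
# Hamming-windowed-sinc fractional-delay FIR on the channel-rate sequence.
L = 31                                    # filter length (assumption)
n = np.arange(L) - (L - 1) / 2
D = skew_hat * fs / 2                     # skew expressed in channel-rate samples
h = np.sinc(n - D) * np.hamming(L)
h /= h.sum()
y1_cal = np.convolve((y1 - o_hat) / g_hat, h, mode="same")

# Re-interleave and compare against the ideal full-rate reference.
raw, out = np.empty(N), np.empty(N)
raw[0::2], raw[1::2] = y0, y1
out[0::2], out[1::2] = y0, y1_cal
ref = chirp(t)
edge = 2 * L                              # discard filter edge effects
print(f"gain={g_hat:.4f}  offset={o_hat:.4f}  skew={skew_hat * fs:.4f}/fs")
print(f"RMS error before calibration: {np.std((raw - ref)[edge:-edge]):.3e}")
print(f"RMS error after  calibration: {np.std((out - ref)[edge:-edge]):.3e}")
```

In this sketch the residual error after compensation should drop well below the uncalibrated error, which is the time-domain analogue of the SFDR improvement reported in the abstract; a production design would replace the single fractional-delay filter with the paper's measured, frequency-dependent compensation filters.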