Machine Learning Based Image Calibration for a Twofold Time-Interleaved High Speed DAC

2019 
In this paper, we propose a novel image calibration algorithm for a twofold time-interleaved DAC (TIDAC). The algorithm is based on simulated annealing, which is often used in the field of machine learning to solve derivative-free optimization (DFO) problems. The DAC under consideration is part of a digital transceiver core that also contains a high-speed ADC, a microcontroller, and digital control via a serial peripheral interface (SPI). These on-chip resources are used to design a calibration algorithm that suppresses the interleave image down to the noise floor. The algorithm is supported with experimental results in silicon on a 10-bit twofold TIDAC operating at a sample rate of 50 GS/s in 14 nm CMOS technology.
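The following is a minimal sketch of how a simulated-annealing calibration loop of this kind can be organized; it is not the paper's implementation. The function and parameter names (measure_image_power, the gain/skew correction codes, the cooling schedule constants) are assumptions for illustration, and the measurement is stubbed out where the silicon would apply candidate correction codes over SPI and read back the image spur power through the on-chip ADC.

import math
import random

def measure_image_power(params):
    # Placeholder for the on-chip measurement path: apply the candidate
    # correction codes over SPI, capture the DAC output with the loopback
    # ADC, and return the interleave-image spur power. Stubbed here with a
    # simple quadratic bowl so the sketch runs standalone.
    gain, skew = params
    return (gain - 3.0) ** 2 + (skew + 1.5) ** 2

def anneal(x0, steps=500, t0=10.0, cooling=0.99, step_size=1.0):
    # Derivative-free search: perturb the correction codes, always accept
    # improvements, and accept regressions with probability exp(-delta/T)
    # so the search can escape local minima; T decays geometrically.
    x = list(x0)
    cost = measure_image_power(x)
    best, best_cost = list(x), cost
    t = t0
    for _ in range(steps):
        cand = [v + random.uniform(-step_size, step_size) for v in x]
        cand_cost = measure_image_power(cand)
        delta = cand_cost - cost
        if delta < 0 or random.random() < math.exp(-delta / t):
            x, cost = cand, cand_cost
            if cost < best_cost:
                best, best_cost = list(x), cost
        t *= cooling
    return best, best_cost

if __name__ == "__main__":
    codes, residual = anneal([0.0, 0.0])
    print("best correction codes:", codes, "residual image power:", residual)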