Compensating Frequency-Response Mismatches in Time-Interleaved Analog-to-Digital Converters
Time interleaving of multiple analog-to-digital converters, in which the outputs of (for example) a pair of converters are multiplexed at twice the per-channel sampling rate, is by now a mature concept, first introduced by Black and Hodges in 1980. Time interleaving offers a conceptually simple method for multiplying the sample rate of existing high-performance ADCs. However, Time-Interleaved Analog-to-Digital Converters (TIADCs) require mismatch calibration to achieve high signal-to-noise ratios. In this paper, the authors present a new blind calibration technique for two-channel time-interleaved analog-to-digital converters that compensates frequency-response mismatches, including time-offset and bandwidth mismatches. The proposed method overcomes the limitations of the traditional gain/timing mismatch model.
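The effect of channel mismatch in a two-channel TIADC can be illustrated with a minimal simulation sketch (not the authors' method; all values here are illustrative assumptions): even-indexed samples come from an ideal channel, odd-indexed samples from a channel with a small timing skew, and the skew produces an image spur at fs/2 - fin in the interleaved spectrum.

```python
import numpy as np

# Illustrative, normalized parameters (assumptions, not from the paper).
fs = 1.0              # interleaved sample rate (normalized)
N = 4096              # number of interleaved samples
fin = 0.1234 * fs     # input tone frequency
dt = 0.02 / fs        # timing skew of channel 1 relative to the ideal grid

n = np.arange(N)
t = n / fs
# Channel 0 takes the even-indexed samples on the ideal time grid;
# channel 1 takes the odd-indexed samples with a timing offset dt
# (the mismatch a calibration scheme must estimate and correct).
x = np.sin(2 * np.pi * fin * t)
x[1::2] = np.sin(2 * np.pi * fin * (t[1::2] + dt))

# Windowed spectrum of the interleaved output, normalized to the tone.
X = np.abs(np.fft.rfft(x * np.hanning(N)))
X /= X.max()

# The timing mismatch creates an image spur near fs/2 - fin.
spur_bin = int(round((fs / 2 - fin) * N / fs))
spur_level = X[spur_bin - 3:spur_bin + 4].max()
```

With a skew of a few percent of a sample period, the spur sits tens of dB below the tone; a gain-only mismatch model cannot remove frequency-dependent errors of this kind, which motivates the frequency-response mismatch model used in the paper.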