International Journal on Electronics & Communication Technology (IJECT)
The Time-Interleaved Analog-to-Digital Converter (TI-ADC) is an efficient approach for systems requiring very high sampling rates with medium-to-high resolution. However, inter-channel mismatches arising from process variations are a major bottleneck, leading to substantial degradation of overall TI-ADC performance. In this paper, an adaptive compensation technique is proposed to improve overall performance in the presence of offset mismatch, independently of both the origin of the non-ideality and the input signal. The proposed method is based on an adaptive filter and is verified through simulation of a two-channel TI-ADC architecture.
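As an illustration of the kind of adaptive offset compensation described above, the following is a minimal sketch in which a simple LMS-style per-channel offset estimator stands in for the paper's adaptive filter; the two-channel model, step size, signal frequency, and offset values are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

# --- Illustrative parameters (assumed, not from the paper) ---
fs = 1e6           # aggregate TI-ADC sampling rate
n_samples = 8000   # total interleaved samples
offsets = np.array([0.05, -0.03])  # per-channel DC offset mismatch (assumed)
mu = 0.005         # LMS step size (assumed)

# Input signal as seen by the interleaved converter
t = np.arange(n_samples) / fs
x = np.sin(2 * np.pi * 97e3 * t)

# Two-channel TI-ADC model: even-index samples come from channel 0,
# odd-index samples from channel 1, each corrupted by its own offset.
channel = np.arange(n_samples) % 2
y = x + offsets[channel]

# Adaptive compensation: one LMS-updated offset estimate per channel.
# Because the input signal averages to zero over time, each estimate
# converges toward that channel's DC offset, regardless of the signal.
offset_est = np.zeros(2)
y_corr = np.empty_like(y)
for n in range(n_samples):
    c = channel[n]
    y_corr[n] = y[n] - offset_est[c]   # subtract current offset estimate
    offset_est[c] += mu * y_corr[n]    # LMS-style update on the residual

print("estimated offsets:", offset_est)  # approaches [0.05, -0.03]
```

In this sketch the correction loop needs no knowledge of where the offset originates, mirroring the input-independent, origin-independent compensation claimed for the proposed technique.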