Time-interleaved ADC

Time-interleaved (TI) ADCs are analog-to-digital converters (ADCs) built from M converters operating in parallel.[1] Each of the M converters is referred to in the literature as a sub-ADC, channel, or slice. The time-interleaving technique, loosely analogous to multithreading in computing, has the parallel converters sample the input signal at staggered intervals: each channel runs at 1/M of the aggregate rate, so the overall sampling rate increases M-fold without overburdening any individual sub-ADC.
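The staggered-sampling idea can be illustrated with a minimal simulation. The sketch below (an illustrative model, not a description of any particular device) quantizes a sine wave with M = 4 ideal sub-ADCs, each taking every M-th sample at a different offset, and then interleaves the channel outputs back into one full-rate stream. The `quantize` helper and all parameter values are assumptions chosen for the example.

```python
import numpy as np

def quantize(x, bits=8, full_scale=1.0):
    # Ideal uniform quantizer standing in for one sub-ADC (no mismatch modeled).
    step = 2 * full_scale / (2 ** bits)
    return np.clip(np.round(x / step) * step, -full_scale, full_scale - step)

M = 4          # number of interleaved sub-ADCs (channels)
fs = 1e6       # aggregate sample rate of the TI ADC, in Hz (assumed value)
n = 64         # total number of output samples
t = np.arange(n) / fs
signal = 0.5 * np.sin(2 * np.pi * 50e3 * t)  # 50 kHz test tone

# Channel m samples at rate fs/M, offset by m periods of the aggregate clock:
# it sees samples m, m + M, m + 2M, ...
sub_samples = [quantize(signal[m::M]) for m in range(M)]

# Interleaving the M channel outputs reconstructs the full-rate stream.
output = np.empty(n)
for m in range(M):
    output[m::M] = sub_samples[m]

# With ideal, identical channels, the result matches a single full-rate ADC.
assert np.allclose(output, quantize(signal))
```

In this idealized model the interleaved output is indistinguishable from a single converter running at the full rate; in practice, gain, offset, and timing mismatches between the channels introduce spurious tones that TI ADC designs must calibrate out.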

  1. ^ Manganaro, Gabriele. Advanced Data Converters. Cambridge University Press. pp. 108–113.