ADC16DV160 Input Interface

Other Parts Discussed in Thread: ADC16DV160, LMH6521, LMH6517

We’re looking at the ADC16DV160 input interface and have a few questions.

An ideal setup for us would be to use a relatively low-power, high-linearity ADC driver with the ADC16DV160. Our present driver has low output impedance, but needs to see approximately a 200 ohm load in order to maintain its high dynamic range. Most of the application circuits for the ADC16DV160 show approximately 50 ohms in parallel with 42pF of compensation capacitance at the ADC input, which loads the driver significantly.

We would appreciate it if National / TI could help us out with the following questions:

a) How much does the ADC16DV160 performance suffer with a circuit optimized for 200 ohm impedance?

b) Is there more information available for optimizing the capacitance value at the ADC input? Currently all the circuits show 42pF; what is the lowest capacitance value that would still let us achieve 80+dB SFDR?

  • There is a tradeoff between linearity performance (specifically H3) and the components at the ADC input. Better H3 performance is generally achieved with lower impedance at the ADC input, though the exact relationship varies with input frequency and input network type, because the performance is heavily influenced by switching noise injected back out of the ADC input. Using lower termination resistors and higher capacitance at the ADC input reduces the influence of these switching glitches, thereby improving distortion performance.
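
    To put a number on that bandwidth-limiting effect, here is a minimal sketch (in Python) of the first-order RC corner formed at the ADC input. The ~50 ohm / 42pF values are just the application-circuit numbers quoted above, not a recommendation:

      import math

      r_term = 50.0    # differential termination at the ADC input, ohms (illustrative)
      c_comp = 42e-12  # compensation capacitance, farads (illustrative)

      # First-order -3 dB corner of the RC network at the ADC input
      f_3db = 1.0 / (2.0 * math.pi * r_term * c_comp)
      print(f"-3 dB bandwidth: {f_3db / 1e6:.1f} MHz")  # ~75.8 MHz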

    Still, I have personally designed a reference board (SP16160CH1RB) with the ADC16DV160 that had a 200 ohm termination at the ADC input and a bandpass filter, and it achieved ~85dBFS H3 for a -1dBFS input at 192MHz. That design used a filter that performed an impedance transform: the source resistance was 50 ohms, but the ADC still saw the equivalent 200 ohms at the filter output, which resembles your application. We often run into the challenge that a high-frequency bandpass filter at 200 ohms is hard to design (the discrete LC values become unrealizable), so using a low output impedance ADC driver (such as the LMH6521) greatly simplifies the design: it allows a lower-impedance filter while still providing excellent linearity performance at high frequencies. The filter was also intended to minimize voltage loss through the interface.

    http://www.national.com/rd/RDhtml/RD-179.html
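
    For intuition on how the impedance transform avoids voltage loss, here is a small sketch (Python) of the ideal lossless case; the 50/200 ohm values follow the SP16160CH1RB description above, and a real filter gives some of this back as inductor loss:

      import math

      rs = 50.0   # driver-side source impedance, ohms
      rl = 200.0  # ADC-side load impedance, ohms

      # A lossless, matched 50-to-200 transform delivers sqrt(rl/rs) times
      # the voltage that a plain matched 50-in/50-out network would.
      gain_db = 20.0 * math.log10(math.sqrt(rl / rs))
      print(f"voltage recovered vs. a 50 ohm match: +{gain_db:.1f} dB")  # ~+6.0 dB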

    Your question is closely tied to the required bandwidth of your application. Beyond simply using lower impedance, the rules of thumb here are that you will achieve better, more consistent (across frequency) distortion performance by:

    - Limiting the bandwidth as much as your application will allow, using capacitance at the ADC input

    - Splitting the capacitance at the ADC input between common-mode and differential placements (e.g. for 20pF total differential, place 10pF differentially and 20pF from each input to GND)

    - Minimizing the trace lengths (transmission lines) between the driver and the ADC

    - Building in an impedance-matched pi-network attenuator at the ADC input, such as 3-6dB, if your application will allow. This is particularly effective in applications with long transmission lines. (See the sketch after this list.)
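
    To illustrate the last two points, here is a sketch (Python) of the standard matched pi-pad equations and of the split-capacitance equivalence. The pi_pad helper and all component values are illustrative assumptions, not values from the reference design:

      import math

      def pi_pad(z0, atten_db):
          """Shunt/series resistors for a matched pi attenuator at impedance z0."""
          k = 10.0 ** (atten_db / 20.0)              # voltage ratio of the pad
          r_shunt = z0 * (k + 1.0) / (k - 1.0)       # each shunt leg
          r_series = z0 * (k * k - 1.0) / (2.0 * k)  # middle series element
          return r_shunt, r_series

      r_sh, r_se = pi_pad(200.0, 3.0)
      print(f"3 dB pad at 200 ohm: shunt {r_sh:.0f} ohm, series {r_se:.0f} ohm")
      # -> shunt ~1170 ohm, series ~70 ohm

      # Split capacitance: the two common-mode caps (each input to GND) sit in
      # series across the differential path, so they contribute c_cm / 2.
      c_diff, c_cm = 10e-12, 20e-12
      print(f"effective differential C: {(c_diff + c_cm / 2.0) * 1e12:.0f} pF")  # 20 pF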

  • Hi Joshua,

    Thank you so much for your detailed reply, all the information was very helpful and much appreciated!

    It looks like 200 ohm impedance is possible at the ADC input with good performance, which is very reassuring.
    We also need the driver to see a 200 ohm load, so that the driver performance can match the ADC.

    The driver is pretty much limited to 2Vpp for compression / linearity reasons, so we need to avoid voltage loss across the interface as much as possible to get the maximum dynamic range. We will definitely split the capacitance between differential and common modes.

    The matching / filter network in RD-179 seems to present ~50 ohms to the LMH6517 driver.
    Did you find that the lower impedance affected driver linearity?

    Thank you!
    Pavel

  • The filter in the SP16160CH1RB design was made specifically for the low output impedance driver, because that driver requires series resistors at its output to maintain stability. With 50 ohm source resistance, 200 ohm load resistance, and a filter that performs the impedance conversion, one can achieve 0 dB voltage loss (or potentially even gain) through the whole driver-to-ADC interface network. In this particular design, the LMH6517 drives ~100 ohms total looking into the series resistors, which is something of a sweet spot for performance.

    If you use a driver with 200 ohm output impedance, then your filter will be 200 ohms in, 200 ohms out. In that case you do not require the series resistance (which causes loss), so the only loss you should expect is 0.5-2 dB from the inductors, depending on the filter order and topology. If you can only drive 2Vpp-diff and there is 1dB of loss through the filter, then you should only expect to drive the ADC to ~1.8Vpp-diff, which is ~2.5dB below the 2.4Vpp full-scale range of the ADC16DV160. You do have the option of reducing the ADC full scale to 2.0V, with the benefit of improved distortion (H3 ~1dB better) but at the cost of reduced obtainable SNR (~1.6dB worse).
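
    If it helps, the level arithmetic works out like this (Python sketch; the 1 dB loss is just the example figure assumed above):

      import math

      v_drive = 2.0  # driver swing, Vpp-diff
      loss_db = 1.0  # assumed filter insertion loss
      v_adc = v_drive * 10.0 ** (-loss_db / 20.0)
      print(f"at the ADC: {v_adc:.2f} Vpp-diff")  # ~1.78 Vpp

      fs = 2.4  # ADC16DV160 full-scale range, Vpp-diff
      print(f"relative to full scale: {20.0 * math.log10(v_adc / fs):.1f} dBFS")
      # -> ~-2.6 dBFS (~-2.5 dB if you round to 1.8 Vpp first)

      # Reducing the full scale to 2.0V trades SNR for distortion; the SNR
      # penalty tracks the full-scale reduction:
      print(f"SNR delta: {20.0 * math.log10(2.0 / 2.4):.1f} dB")  # ~-1.6 dB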

    Regards, Josh