Hi,
Is there any reason I shouldn't use a source impedance of several hundred ohms with a TI ADC?
The TI datasheets typically show ADCs being driven from a 50 Ω or 100 Ω source, but the actual input impedance of the ADC input itself (without external resistors) is a few kΩ (e.g., the ADS41B49 input is 10 kΩ at DC and 3 kΩ at 200 MHz).
The few pF of input capacitance will become more significant with a higher source impedance, since together they form a low-pass filter, so I'll have to be careful about that. Using a buffered device should eliminate input switching noise, so that should not be an issue.
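For rough numbers, the pole formed by the source impedance and the ADC input capacitance can be estimated as f = 1/(2πRC). A minimal sketch, with assumed illustrative values (a few hundred ohms and a few pF; the real figures would come from the datasheet):

```python
from math import pi

# Assumed values for illustration only
r_source = 300.0   # source impedance in ohms (assumption)
c_in = 4e-12       # ADC input capacitance in farads (assumed 4 pF)

# Single-pole RC low-pass formed by source impedance and input capacitance
f_3db = 1.0 / (2 * pi * r_source * c_in)
print(f"-3 dB bandwidth: {f_3db / 1e6:.1f} MHz")  # roughly 133 MHz
```

Even a few hundred ohms still leaves a bandwidth well above 100 MHz with these values, so whether this matters depends on the signal frequency.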
My motivation is that my input signal is fairly weak. I could boost the signal voltage with a step-up input transformer, but this increases the impedance seen on the ADC side of the transformer by the square of the turns ratio, hence my question about impedance.
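To make the trade-off concrete, a 1:n voltage step-up reflects the source impedance to the secondary multiplied by n². A quick sketch with an assumed 1:2 ratio and 50 Ω source:

```python
# Assumed values: 1:2 step-up transformer, 50 ohm source
n = 2.0            # secondary:primary turns ratio (assumption)
r_primary = 50.0   # source impedance on the primary side, ohms

v_gain = n                       # voltage step-up factor
r_secondary = r_primary * n**2   # impedance reflected to the ADC side
print(v_gain, r_secondary)       # 2.0 200.0
```

So a modest 6 dB of voltage gain already quadruples the impedance the ADC sees, which is what drives the question above.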
Or would it be more sensible to just use a separate amplifier?
Cheers,
R.