Hello,
I am using the ADC of the TMS320F2812 digital signal processor. Because the sampled voltages are much higher than the 3 V upper limit of the ADC inputs, I plan to use voltage dividers to step them down to the permissible level. The divider should draw as little current from the source as possible, while still supplying whatever current the ADC inputs actually need.
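For concreteness, this is the kind of rough sizing I have in mind; the 15 V maximum source voltage and the 400 kOhm top resistor are only placeholders, not my actual signal levels:

```python
# Rough divider sizing sketch (placeholder numbers, not my actual signal levels):
# scale a hypothetical 15 V maximum down to the 3 V ADC full-scale input.

V_IN_MAX = 15.0   # V, assumed worst-case source voltage (placeholder)
V_ADC_MAX = 3.0   # V, ADC full-scale input

ratio = V_ADC_MAX / V_IN_MAX          # required division ratio, here 0.2

# Example resistor pair giving that ratio while keeping the divider current low
R1 = 400e3                            # ohms, top resistor (placeholder)
R2 = R1 * ratio / (1.0 - ratio)       # ohms, bottom resistor -> 100 kOhm here

I_divider = V_IN_MAX / (R1 + R2)      # quiescent current drawn from the source
R_thevenin = (R1 * R2) / (R1 + R2)    # source resistance seen by the ADC pin

print(f"R2 = {R2/1e3:.1f} kOhm, divider current = {I_divider*1e6:.1f} uA, "
      f"Thevenin resistance = {R_thevenin/1e3:.1f} kOhm")
```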
Regarding the behaviour of the ADC inputs, the data manual (SPRS174N) contains two pieces of information: the typical and maximum input leakage currents, given as 5 µA and 8 µA respectively, and the analog input impedance model in Fig. 6-38. From the latter, however, it looks to me as if current only flows into the ADC to charge the capacitors Cp and Ch. The input current would therefore be zero if the sampled voltage is constant or changes sufficiently slowly. I intend to sample at 800 kHz, and according to the Hardware Design Guidelines (SPRAAS1B), the voltage across Ch should then always stay close to the input voltage. Is that the case? If so, can the voltage divider be treated as essentially unloaded in steady state, and can it therefore be made quite high-resistance?
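To get a feeling for whether Ch can really track the input through such a divider, I tried a back-of-envelope settling estimate. The switch resistance and capacitor values below are only placeholders and would have to be replaced with the actual figures from Fig. 6-38; I have also simply assumed that half of each sampling period is available for acquisition:

```python
import math

# Back-of-envelope settling check for the sample-and-hold stage.
# R_ON, C_H, C_P are placeholders; the real values come from the analog
# input impedance model in Fig. 6-38 of SPRS174N.

R_SOURCE = 80e3     # ohms, Thevenin resistance of my divider (from the sketch above)
R_ON     = 3.4e3    # ohms, placeholder for the internal sampling switch resistance
C_H      = 1.64e-12 # F, placeholder for the sampling capacitor Ch
C_P      = 5e-12    # F, placeholder for the pin parasitic capacitance Cp

F_SAMPLE = 800e3                 # Hz, intended sampling rate
T_ACQ    = 0.5 / F_SAMPLE        # s, assumed acquisition window (half the period)

tau = (R_SOURCE + R_ON) * (C_H + C_P)   # worst-case charging time constant
n_tau = T_ACQ / tau                     # time constants available per sample
settling_error = math.exp(-n_tau)       # relative error remaining on Ch

print(f"tau = {tau*1e9:.1f} ns, {n_tau:.1f} time constants, "
      f"residual error = {settling_error:.2e}")
```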
Where does the specified input leakage current come into play? Do I need to take it into account when choosing the resistors for the voltage divider?
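My naive estimate would be that the worst-case leakage flows through the divider's Thevenin resistance and shows up as an offset at the pin, for example:

```python
# Rough estimate of the offset error caused by input leakage, assuming the
# worst-case leakage simply flows through the divider's Thevenin resistance.

I_LEAK_MAX = 8e-6   # A, maximum input leakage current from SPRS174N
R_THEVENIN = 80e3   # ohms, divider Thevenin resistance (placeholder from above)
V_ADC_MAX  = 3.0    # V, ADC full-scale input
ADC_BITS   = 12     # F2812 ADC resolution

v_error = I_LEAK_MAX * R_THEVENIN   # worst-case offset voltage at the pin
lsb = V_ADC_MAX / (2 ** ADC_BITS)   # size of one LSB
print(f"error = {v_error*1e3:.1f} mV = {v_error/lsb:.0f} LSB")
```

Is that the right way to look at it, or is the leakage specification meant differently?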
Regards,
Adrian