
AM69: ADC Input Impedance of MCU_ADC0

Part Number: AM69
Other Parts Discussed in Thread: J784S4XEVM


In our design we have to voltage-divide an input to the ADC because it is above the reference voltage. We are trying to determine the resistor values to use based on the input impedance of the ADC. The AM69 datasheet states the input impedance is 1 / ((65.97 × 10^-12) × fSMPL_CLK) and that the fSMPL_CLK frequency is 60 MHz. Evaluating this gives an impedance of about 252 Ω. But we also measured on a J784S4XEVM EVM board, and the calculated impedance was around 1.3 MΩ. What is the actual input impedance of the MCU_ADC pins?
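
For reference, here is how we evaluated the datasheet formula, as a quick Python check (the constants are straight from the datasheet equation quoted above):

```python
# Datasheet formula: Z_in = 1 / (65.97e-12 * f_SMPL_CLK)
C_EFF = 65.97e-12        # effective sampling capacitance term, farads
F_SMPL_CLK = 60e6        # sample clock frequency, Hz

z_in = 1.0 / (C_EFF * F_SMPL_CLK)
print(f"dynamic input impedance ~= {z_in:.1f} Ohms")   # ~252.6 Ohms
```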

  • Sorry, the measured impedance on the EVM board was 3.52 MΩ, not 1.3 MΩ.

  • Hi Chris,

    Thank you for posting your question! Was the 3.52 MΩ measurement taken while the ADC was actively sampling at 60 MHz, or was the ADC idle or powered down?

    I look forward to your reply.

    Regards,

    Jeff

  • This was calculated from the value the ADC measured and the values of the resistive divider used, so it is based on what the ADC actually read when it sampled the input.

  • Hi Chris,

    Thanks for the response! Since you are trying to select the right values for your resistor divider, I think you should focus on the input leakage current rather than the dynamic sampling impedance. Designing the divider around the ADC's leakage current is generally more reliable for ensuring accuracy over a wide range of conditions. The leakage represents an always-present load, while the sampling capacitor's load is conditional and momentary.

    First, there are a few things you will want to consider: temperature, input voltage, and acceptable error. Input leakage increases dramatically as temperature increases, and it also increases as your input voltage approaches Vref (1.8 V). The acceptable error depends on your application's needs.

    For example, let's say your input voltage is 5 V, you are dividing that down to 1.8 V, and your acceptable output error margin is 5 mV. The datasheet specifies a worst-case input leakage current of 24 µA (at 125 °C with a 1.8 V input). Using Ohm's law, let's find the maximum source resistance the ADC input can see before leakage causes more than 5 mV of error:

    R(th) = V(error) / I(leakage) = 5 mV / 24 µA ≈ 208.3 Ω

    Since you are stepping down from 5 V to 1.8 V, the voltage divider equation gives R1 = ((5 − 1.8) / 1.8) × R2 ≈ 1.78 × R2.

    R(th) is the parallel combination of R1 and R2, so substituting R1 ≈ 1.78 × R2 gives R(th) ≈ 0.64 × R2. With the max R(th) = 208.3 Ω, this gives a maximum R2 value of approximately 325 Ω and an R1 value of approximately 579 Ω.

    Now if you want to leave some margin, you can choose standard values of 309 Ω for R2 and 549 Ω for R1, which gives a V(error) of approximately 4.75 mV.
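
    If it helps, here is the whole calculation above as a short Python sketch. This is a sanity check only; substitute your own leakage spec and error budget:

    ```python
    # Divider sizing from the example above (numbers from this thread).
    V_IN, V_OUT = 5.0, 1.8      # source voltage and divided-down target, V
    I_LEAK = 24e-6              # worst-case ADC input leakage, A (125 C, 1.8 V in)
    V_ERR_MAX = 5e-3            # acceptable error from leakage, V

    r_th_max = V_ERR_MAX / I_LEAK            # max source (Thevenin) resistance, ~208.3 Ohms
    k = V_IN / V_OUT - 1.0                   # R1/R2 ratio, ~1.78
    r2_max = r_th_max * (1.0 + k) / k        # R(th) = R1||R2 = k/(1+k) * R2  ->  ~325 Ohms
    r1_max = k * r2_max                      # ~579 Ohms

    # With the nearest standard (E96) values chosen below those limits:
    r1, r2 = 549.0, 309.0
    r_th = r1 * r2 / (r1 + r2)
    print(f"leakage error ~= {r_th * I_LEAK * 1e3:.2f} mV")   # ~4.75 mV
    ```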

    With all this being said, another thing to consider is the tolerance of the resistors you are using. For example, using a 0.1% tolerance resistor instead of a 10% one greatly changes the error being introduced, as the rough bound below shows.
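
    As a rough illustration of how much tolerance matters, the worst-case divider output shift for the example values above can be bounded like this (illustrative only; it ignores leakage and temperature drift):

    ```python
    # Worst-case divider output shift at a given resistor tolerance,
    # for the example values above.
    V_IN = 5.0
    R1, R2 = 549.0, 309.0

    def worst_case_shift_mv(tol: float) -> float:
        nominal = V_IN * R2 / (R1 + R2)
        # Output is lowest when R1 is high and R2 is low by the full tolerance.
        worst = V_IN * R2 * (1 - tol) / (R1 * (1 + tol) + R2 * (1 - tol))
        return (nominal - worst) * 1e3

    for tol in (0.001, 0.10):
        print(f"{tol:.1%} parts: up to {worst_case_shift_mv(tol):.2f} mV of divider error")
    # 0.1% parts: ~2.3 mV; 10.0% parts: ~224 mV
    ```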

    I hope this helps, feel free to ask any questions that you may have!

    Regards,

    Jeff

  • Jeff, thank you for the detailed explanation. Unfortunately, the voltage we are trying to read comes from a battery with a maximum allowed current draw of 250 µA. Therefore we will have to use rather high-value resistors and accept a high error unless we buffer with an op-amp. But this was helpful in determining that we will most likely need an op-amp.
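
    For the record, here is the back-of-the-envelope check behind that conclusion, using the worst-case 24 µA leakage figure from above (illustrative numbers only):

    ```python
    # Feasibility check for the battery-fed case: with at most 250 uA of
    # divider current from a 5 V source, the divider must total >= 20 kOhms,
    # and the resulting leakage error is large without a buffer.
    V_IN, V_OUT = 5.0, 1.8
    I_DIV_MAX = 250e-6          # max current the battery can supply, A
    I_LEAK = 24e-6              # worst-case ADC input leakage, A

    r_total_min = V_IN / I_DIV_MAX          # >= 20 kOhms total
    r2 = r_total_min * V_OUT / V_IN         # ~7.2 kOhms
    r1 = r_total_min - r2                   # ~12.8 kOhms
    r_th = r1 * r2 / (r1 + r2)              # ~4.6 kOhms
    print(f"leakage error without a buffer ~= {r_th * I_LEAK * 1e3:.0f} mV")  # ~111 mV
    ```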