TMS320F28377S: Why does the internal ADC's sampling frequency influence the stability of the input signal?

Part Number: TMS320F28377S


Hi,

The customer has connected a CD4052B analog multiplexer to the F28377S's ADC. When the sampling frequency is 30 kHz or 40 kHz, the output of the CD4052B drops by about 0.1 V, which significantly degrades accuracy. Do you know what could cause this issue?

  • Hi Yuan,

    What is the total impedance from the source to the ADC?  This would include the MUX's on-resistance and the 10 ohm series resistor, but also the impedance between the source and the MUX.

    Is this at the beginning of each sample, or does the voltage droop over time in proportion to the sample rate?

    In the first case, you should look at the "Choosing an Acquisition Window Duration" section of the device TRM and the "ADC Input Model" section of the device datasheet, and use the equations there to determine the necessary S+H time.  If the signal can settle to within 1/2 LSB during the S+H window, there should not be any settling error.
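    As a concrete illustration, the settling requirement can be estimated with the simplified single-pole RC model below.  This is only a sketch: Ron and Ch should be taken from the "ADC Input Model" section of your device's datasheet, and the Rs and Cp values here are assumed placeholders for the external source, mux, and pin parasitics.

    ```c
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        /* ADC input model parameters -- example values only; take the */
        /* actual numbers from the device datasheet.                   */
        const double Ron = 425.0;    /* ADC sampling switch on-resistance, ohms (assumed) */
        const double Ch  = 14.5e-12; /* ADC sampling capacitor, F (assumed)               */
        const double Rs  = 500.0;    /* source + mux + series resistance, ohms (assumed)  */
        const double Cp  = 10.0e-12; /* pin parasitic capacitance, F (assumed)            */
        const int    N   = 12;       /* ADC resolution, bits                              */

        /* Settling to within 1/2 LSB takes roughly ln(2^(N+1)) time   */
        /* constants of the total RC seen by the sampling capacitor.   */
        const double k        = (N + 1) * log(2.0);     /* ~9.0 for 12 bits   */
        const double tau      = (Rs + Ron) * (Ch + Cp); /* one time constant  */
        const double t_settle = k * tau;

        printf("Required S+H window: %.1f ns (%.1f time constants)\n",
               t_settle * 1e9, k);
        return 0;
    }
    ```

    With these example numbers the required window is a few hundred nanoseconds; a higher source impedance scales it directly.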

    In the second case, usually you are trying to use "charge sharing" to make up for a high input impedance. You'd usually want a capacitor at the ADC pin of at least (2^(N+1))*Ch, which for a 12-bit ADC is in the ballpark of 2^13 * 14.5pF ≈ 120nF.  However, charge sharing brings a sample-rate limitation, because the external capacitor needs to be recharged between samples.  I don't think charge sharing will work very well with an external mux.
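    To make that sample-rate limitation concrete, here is a rough sizing calculation.  Again, only a sketch: the 10 kΩ source impedance is an assumed example value.

    ```c
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        const int    N  = 12;        /* ADC resolution, bits                       */
        const double Ch = 14.5e-12;  /* sampling capacitor, F (assumed, datasheet) */
        const double Rs = 10e3;      /* high-impedance source, ohms (assumed)      */

        /* Charge-sharing cap: large enough that transferring Ch's     */
        /* worth of charge moves the pin by less than 1/2 LSB.         */
        const double Cext = pow(2.0, N + 1) * Ch;   /* ~119 nF for 12 bits */

        /* Between samples the external cap must recharge through the  */
        /* source impedance; this bounds the usable sample rate.       */
        const double k    = (N + 1) * log(2.0);     /* settle to 1/2 LSB   */
        const double t_rc = k * Rs * Cext;

        printf("Cext >= %.0f nF, recharge time ~%.2f ms -> max ~%.0f Hz\n",
               Cext * 1e9, t_rc * 1e3, 1.0 / t_rc);
        return 0;
    }
    ```

    Note how quickly the usable rate collapses: with these numbers the ~120nF capacitor needs on the order of 10ms to recharge through a 10 kΩ source, nowhere near 30-40kHz.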

    Overall, things are further complicated by the external mux.  Are you changing the external mux select before each sample?  If so, 10nF seems like a very large capacitance to have on the mux output, since it must be charged to the new channel's voltage before each conversion (unless the sources driving the mux have very low impedance).
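    If the select lines are changed per sample, the safest sequencing is: switch the mux, wait for the output to settle, then trigger the conversion.  Below is a minimal sketch using the C2000Ware driverlib; the GPIO pin numbers and the 50us delay are placeholders, and it assumes ADCA SOC0 has already been configured with its end-of-conversion routed to ADCINT1.

    ```c
    #include "driverlib.h"
    #include "device.h"

    /* Hypothetical GPIO assignments for the CD4052B A/B select lines; */
    /* adjust to match the actual board wiring.                        */
    #define MUX_SEL_A_PIN   10U
    #define MUX_SEL_B_PIN   11U

    /* Select a CD4052B channel, wait for its output to settle, then   */
    /* run one software-triggered conversion on ADCA SOC0.             */
    static uint16_t sampleMuxChannel(uint16_t channel)
    {
        /* Drive the mux select lines to the new channel. */
        GPIO_writePin(MUX_SEL_A_PIN, channel & 1U);
        GPIO_writePin(MUX_SEL_B_PIN, (channel >> 1) & 1U);

        /* With ~10 nF on the mux output, settling is dominated by the */
        /* source impedance; 50 us is a placeholder -- size it as      */
        /* k * R_source * C_out for the actual components.             */
        DEVICE_DELAY_US(50U);

        /* Trigger the conversion and wait for end-of-conversion.      */
        ADC_forceSOC(ADCA_BASE, ADC_SOC_NUMBER0);
        while(ADC_getInterruptStatus(ADCA_BASE, ADC_INT_NUMBER1) == false)
        {
        }
        ADC_clearInterruptStatus(ADCA_BASE, ADC_INT_NUMBER1);

        return ADC_readResult(ADCARESULT_BASE, ADC_SOC_NUMBER0);
    }
    ```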