Hi -
We have a design with several INA228 devices measuring the power supply voltages feeding our SOC (5 supply voltages total). We've noticed that as we sweep the power supply voltage to our SOC from -10% to +10%, there is a periodic, non-linear variation in the INA228 bus voltage measurement. This error has been repeatable across multiple boards and SOCs.
However, the periodic error is eliminated by increasing the conversion time.
I have included a chart showing the differences among a few different conversion times (CT) along with measurements from a Keithley DMM. The error appears when CT = 50 µs. There appears to be very little difference compared to the DMM when CT = 540 µs or CT = 1052 µs. In all cases, we are averaging 512 samples.
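For context, the total acquisition time per averaged reading scales directly with CT. A quick sketch of the arithmetic, assuming the 512 averaged conversions run back-to-back in continuous mode (an assumption on my part, not confirmed in the datasheet excerpt above):

```python
# Total acquisition window per averaged reading = averages x conversion time,
# assuming back-to-back conversions with no dead time between them.
N_AVG = 512  # averaging count used in all our measurements

for ct_us in (50, 540, 1052):  # conversion times from the chart, in microseconds
    window_ms = N_AVG * ct_us / 1000.0
    print(f"CT = {ct_us} us -> total window ~ {window_ms:.1f} ms")
```

So the 50 µs setting averages over roughly a 25.6 ms window, while the 1052 µs setting averages over roughly 539 ms, despite the identical averaging count.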
We chose a 50 µs conversion time because we were trying to capture short dynamic changes in the power supply load.
Could you please explain why increasing the conversion time removes this variation?
If this were just noise in the power supply, I would expect averaging to eliminate the noise. On an oscilloscope, I am not able to clearly see this periodicity in the voltage.
Is it possible the variation is real, i.e., it occurs as we adjust the power supply potentiometer and the 50 µs conversion time accurately captures it? Or is it more likely that this is some sort of aliasing or measurement error due to the short conversion time?
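One way I've tried to reason about the aliasing possibility: if the 512 averaged conversions run back-to-back (again, my assumption), the average acts roughly like a boxcar filter of length N·CT, whose gain at a ripple frequency f is |sinc(f·N·CT)|. A short sketch comparing the three conversion times against a purely hypothetical 20 Hz supply ripple (the 20 Hz value is illustrative, not measured):

```python
import numpy as np

N = 512  # averaging count from our configuration
cts = {"50 us": 50e-6, "540 us": 540e-6, "1052 us": 1052e-6}

def ripple_gain(f_hz, ct_s, n=N):
    """Magnitude response of an n*ct boxcar average at ripple frequency f_hz.

    np.sinc(x) is the normalized sinc, sin(pi*x)/(pi*x), so the boxcar
    response of length T = n*ct at frequency f is |sinc(f*T)|.
    """
    return abs(np.sinc(f_hz * n * ct_s))

for label, ct in cts.items():
    window_ms = N * ct * 1e3
    g = ripple_gain(20.0, ct)  # hypothetical 20 Hz ripple
    print(f"CT = {label}: window = {window_ms:.1f} ms, 20 Hz ripple gain = {g:.3f}")
```

Under this model, a slow ripple passes through the short 50 µs window nearly unattenuated but is strongly suppressed by the longer windows, which would match what we see. I'd appreciate confirmation of whether this is the right way to think about it.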
Mostly, I'm trying to understand whether I should simply increase the conversion time and disregard the variation, or whether there is actually a periodic error in the voltage.
Thanks,
Tom