Hi,
This is a follow-up to my previous post about some strange LM5066I telemetry behaviour.
It has been quite a while since then, but in the meantime I have received the LM5066I EVK. To make it comparable to our own design, I replaced the sense resistor with a 47 mΩ part and increased Rpwr and Ctimer accordingly.
With that setup I see basically the same behaviour as with our own design. When I increase the load current in constant steps, there are ranges where the raw Iin ADC value does not change even though the voltage across Rsense changes by 2 or 3 mV.
In the low load range, the measured Iin value has a very large variance. For example, in this picture I have a current of 150 mA, which corresponds to a voltage of about 7 mV across the sense resistor. CL is set to 26 mV, so at 7 mV we are at about a quarter of the measurement range, and what I see is something like this:
[screenshot: Iin telemetry readings at 150 mA]
Here I am far outside the ±10% accuracy that would be acceptable for us.
The variance decreases at higher currents. This is for 300 mA, corresponding to about 14 mV across Rsense:
[screenshot: Iin telemetry readings at 300 mA]
Here we are within roughly ±10% of the expected value, which would be acceptable, though still not great.
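For reference, what I see would be consistent with a roughly constant absolute error term (offset, noise, quantization) that turns into a large relative error at small sense voltages. Here is a minimal back-of-the-envelope sketch in Python; the ~0.7 mV absolute error and the full scale of ~26 mV (the CL setting) are purely my assumptions chosen to match my observations, not datasheet figures:

```python
# Back-of-the-envelope check (assumptions, not TI specs): if the current
# telemetry error is dominated by a roughly constant absolute term, the
# *relative* error grows as the sense voltage shrinks.

R_SENSE = 0.047        # ohms (47 mOhm, as on my modified EVK)
V_FULL_SCALE = 0.026   # assume full scale ~ CL setting of 26 mV (assumption)
ABS_ERR_V = 0.7e-3     # assumed constant absolute error, ~0.7 mV (illustrative)

for i_load in (0.150, 0.300, 0.500):
    v_sense = i_load * R_SENSE           # voltage across Rsense
    frac = v_sense / V_FULL_SCALE        # position within the measurement range
    rel_err = ABS_ERR_V / v_sense        # fixed absolute error as relative error
    print(f"{i_load*1e3:5.0f} mA -> {v_sense*1e3:5.2f} mV "
          f"({frac:4.0%} of range), relative error ~ {rel_err:5.1%}")
```

With those assumed numbers this lands close to what I measure: roughly ±10% at 7 mV and roughly ±5% at 14 mV.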
So my questions are: Is it expected that the accuracy of the telemetry values depends so strongly on where in the ADC measurement range we are?
And is the recommendation that, for decent telemetry, we need to stay in the upper half of the ADC measurement range?
We have low-current requirements, but we would like to cover a wider range of currents with the measurement. If the lower part of the ADC measurement range is basically unusable, it becomes difficult to find one Rsense value that covers everything, meaning we would have to use different resistor values for different products, as the sketch below illustrates.
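To illustrate the Rsense dilemma: here is a small hypothetical helper that checks whether a single Rsense can keep the minimum load current in the upper half of the measurement range while the maximum load current (plus some headroom) stays below the CL threshold. Again, full scale ≈ V_CL is my assumption:

```python
# Hypothetical helper illustrating the Rsense trade-off: keep I_min in the
# upper half of the measurement range while I_max stays safely below the
# current limit threshold. Full scale ~ V_CL is my assumption, not a TI spec.

V_CL = 0.026  # current limit threshold, 26 mV setting

def rsense_window(i_min, i_max, headroom=1.2):
    """Return (r_low, r_high) in ohms, or None if no single value works.

    r_low  : smallest Rsense keeping i_min above half of full scale
    r_high : largest Rsense keeping i_max * headroom below V_CL
    """
    r_low = (V_CL / 2) / i_min          # i_min * R >= V_CL / 2
    r_high = V_CL / (i_max * headroom)  # i_max * headroom * R <= V_CL
    return (r_low, r_high) if r_low <= r_high else None

print(rsense_window(0.150, 0.200))  # -> (~0.087, ~0.108): a window exists
print(rsense_window(0.150, 0.500))  # -> None: no single Rsense fits
```

With 20% headroom, a single value only exists when Imax/Imin ≤ 2/1.2 ≈ 1.7, which is exactly why we would end up with different resistor values per product.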
