A customer would like to know how to measure the supply-voltage rejection ratio.
Could you offer a guideline and a test setup for this measurement?
You probably already know this, but the best way to repeat the datasheet results is to match the test conditions given to the best of one's ability.
Generally speaking, what you'll want to do is vary the supply voltage and then observe the resulting change in offset voltage. Given that the test specifications call for an output of 0V, I would guess that the offset voltage is measured by applying a small voltage across the inputs of the op amp; the input voltage at which the output reaches 0V is taken to be the offset voltage. The supply voltage is then changed by some step, say 500mV or 1V, and the offset voltage is measured again.
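To make the arithmetic concrete, here is a small sketch of how the two offset measurements combine into a KSVR figure. The function names and the numeric values are made up for illustration; they are not datasheet data.

```python
import math

def ksvr_uv_per_v(vos1, vos2, vs1, vs2):
    """|dVos/dVs| in uV/V: offset shift (V) divided by supply step (V)."""
    return abs(vos2 - vos1) / abs(vs2 - vs1) * 1e6

def psrr_db(vos1, vos2, vs1, vs2):
    """Same ratio expressed in dB: 20*log10(dVs/dVos)."""
    return 20 * math.log10(abs(vs2 - vs1) / abs(vos2 - vos1))

# Illustrative example: offset moves from 1.0 uV to 3.0 uV
# when the supply steps from 5 V to 6 V.
k = ksvr_uv_per_v(1.0e-6, 3.0e-6, 5.0, 6.0)   # 2.0 uV/V
db = psrr_db(1.0e-6, 3.0e-6, 5.0, 6.0)        # ~114 dB
```

Note that datasheets quote this parameter either as uV/V or in dB; the two functions above are just the same measurement reported both ways.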
Note (1) at the bottom seems to suggest that the test was run open-loop, especially given that there is no gain specification. However, I would think that the best way to run this would be closed-loop, in a unity-gain buffer configuration. I'm putting a link below for your reference. The link will direct you to a presentation from TI Precision Labs; slide 3 covers measuring "DC PSRR," which has the same definition as "KSVR."
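As a quick sanity check on why the unity-gain buffer works for this: with the output fed back to the inverting input, Vout = Aol*(Vin + Vos - Vout), which solves to Vout = Aol*(Vin + Vos)/(1 + Aol) ≈ Vin + Vos for large open-loop gain. So with the input grounded, a supply-induced offset shift appears almost one-for-one at the output. The sketch below uses an assumed open-loop gain and illustrative offset values, not measured data.

```python
def buffer_vout(vin, vos, aol=1e6):
    """Closed-loop output of a unity-gain buffer with finite open-loop gain.

    Vout = Aol * (Vin + Vos - Vout)  =>  Vout = Aol*(Vin + Vos) / (1 + Aol)
    """
    return aol * (vin + vos) / (1 + aol)

# Input held at 0 V; offset shifts from 1.0 uV to 3.0 uV after a supply step.
# The output change tracks the offset change almost exactly.
dvout = buffer_vout(0.0, 3.0e-6) - buffer_vout(0.0, 1.0e-6)  # ~2.0 uV
```

This is why measuring the output of the buffer directly, rather than hunting for the 0V crossing open-loop, gives a practical reading of the offset shift.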
Please let me know if I can be of any further assistance.