
Op Amp output voltage linearity versus supply voltage

I have a non-inverting op amp circuit that measures the current through a current sense resistor; the op amp amplifies the sense voltage before it is read by an ADC. The quad op amp package is powered by a single supply that can range from 7 V to 26 V. With no current flowing through the current sense resistor, I measured the op amp output voltage while varying the supply across the 7 V to 26 V range. The output voltage varies from 123 mV to 145 mV, respectively, even though the input voltage is constant.
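As a back-of-envelope check, that 22 mV output shift over a 19 V supply swing can be referred to the op amp input to estimate an effective supply rejection. Below is a minimal sketch of that arithmetic in Python; the closed-loop gain used here is a placeholder assumption, since it is not stated above:

```python
import math

# Measured at the op amp output with zero sense current (from the post above)
v_out_lo = 0.123   # V, output at 7 V supply
v_out_hi = 0.145   # V, output at 26 V supply
v_sup_lo = 7.0     # V
v_sup_hi = 26.0    # V

# ASSUMPTION: closed-loop gain of the non-inverting stage (not given in the post)
gain = 50.0

# Refer the output shift back to the op amp input (offset-voltage change)
delta_vout = v_out_hi - v_out_lo    # 22 mV at the output
delta_vos = delta_vout / gain       # input-referred offset shift

# Effective supply rejection: supply change per input-referred offset change
delta_vsup = v_sup_hi - v_sup_lo    # 19 V
psrr_db = 20 * math.log10(delta_vsup / delta_vos)

print(f"Input-referred offset shift: {delta_vos * 1e6:.1f} uV")
print(f"Implied supply rejection: {psrr_db:.1f} dB")
```

With the assumed gain of 50, the 22 mV output shift corresponds to roughly 0.44 mV of input-referred offset change, or about 93 dB of effective rejection; substituting the actual closed-loop gain gives the figure to compare against a datasheet PSRR spec.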

What mechanism causes this voltage to change as the power supply voltage changes?

What specs do I need to look at to improve this linearity?

My circuit is similar to this ..., with the current sense resistor tied between Vin and ground. Vin is left open for this test.