The datasheet states that the maximum differential input voltage for linear measurement is 500 mV, corresponding to an output of 100 µA. If I want to measure a larger differential voltage, it seems to me that adding 20 k precision resistors in series with In+ and In- would let me measure up to 2.5 V over the same 100 µA linear output-current range. Apart from the obvious issue of degraded common-mode rejection from mismatched resistors, are there any problems with doing this?
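As a sanity check on my arithmetic: assuming each input behaves as a voltage-to-current conversion through some effective internal resistance per lead (my assumption, not something I've confirmed from the datasheet), the 20 k / 5× figures imply a particular internal resistance, which I've solved for below:

```python
# Quick consistency check of the range-extension claim.
# Assumption (mine, not from the datasheet): each input lead converts
# voltage to current through an internal resistance R_INT, so adding
# R_EXT in series with each lead scales the linear input range by
# (R_INT + R_EXT) / R_INT.

V_LIN = 0.5       # V, datasheet linear differential input range
R_EXT = 20e3      # ohm, proposed precision resistor per input lead
V_TARGET = 2.5    # V, desired extended linear range

scale = V_TARGET / V_LIN             # 5x extension wanted
R_INT = R_EXT / (scale - 1)          # implied internal resistance: 5 kohm
V_LIN_EXT = V_LIN * (R_INT + R_EXT) / R_INT  # back-check: 2.5 V

print(scale, R_INT, V_LIN_EXT)
```

So the scheme hangs on the part's effective input resistance actually being 5 kΩ per lead; if the datasheet gives a different figure, the external resistors would need to change accordingly.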
To head off the X-Y problem: I'm designing a variable high-voltage power supply using a linear pass element, with a tracking switching preregulator to minimize dissipation in the pass element. The basic design uses the linear element to control the output voltage directly, and the voltage across the pass element as feedback for the switching preregulator. The preregulator output therefore tracks the output of the linear element plus a constant dropout voltage programmed by the components in the feedback loop.
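The point of the tracking arrangement can be shown in a steady-state sketch: because the preregulator holds a fixed voltage across the pass element, pass-element dissipation depends only on the programmed dropout and the load current, not on the output setting. All numbers below are illustrative, not from the actual design:

```python
# Steady-state model of the tracking preregulator scheme described
# above. The 3 V dropout and the load currents are made-up numbers.

def pass_element_dissipation(v_out, i_load, v_dropout=3.0):
    """Preregulator output tracks v_out + v_dropout, so the linear
    pass element always drops ~v_dropout regardless of v_out."""
    v_prereg = v_out + v_dropout
    return (v_prereg - v_out) * i_load   # = v_dropout * i_load

# Dissipation is the same at low and high output settings:
for v_out in (10.0, 100.0, 400.0):
    print(v_out, pass_element_dissipation(v_out, 0.05))
```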
The power supply's output voltage is high enough that I'm looking at the technique outlined in your TIDU833 application note to pass the measured dropout voltage from the high-side pass element down to the low side, where the switching regulator lives. I realize I can adapt the technique to a conventional voltage-output instrumentation amplifier, but the Vbe of the common-base BJT adds an output offset that makes things more of a pain, so I'd prefer to avoid that if I can.