This is a somewhat unconventional application. We have a sensor that puts out an analog signal ranging from 0.8 V to 2.8 V. We need to digitize the actual DC voltage, so we can't AC-couple with capacitors or transformers, and using the differential input "as intended" doesn't really seem to work here.
To use the full range of the ADC, it would be best to level-shift the sensor output by subtracting off 0.8 V, giving a 0-to-2 V range. Ideally, given our space and power restrictions, we'd like to eliminate an op-amp by exploiting the ADC's differential input.
Would it be a good idea to simply feed the full-range signal into the positive input, apply +0.8 V DC to the negative input, and get the subtraction that way? The two inputs would have nothing in common, so what is the best way to treat the Vcm I/O on the ADC? All channels would have the same 0.8 V offset, so would it maybe be better to use Vcm in some way to get the offset?
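For what it's worth, here's a quick sanity check of the arithmetic behind that scheme. The sensor range and the 0.8 V offset are from the description above; everything else (function name, printout) is just illustration, not a claim about any particular ADC:

```python
# Sanity-check the "subtract via the differential input" idea:
# positive input carries the sensor, negative input sits at a fixed 0.8 V.

SENSOR_MIN, SENSOR_MAX = 0.8, 2.8   # sensor output range, volts (from the post)
V_NEG = 0.8                          # fixed DC applied to the negative input, volts

def diff_and_cm(v_pos, v_neg=V_NEG):
    """Differential and common-mode voltages the ADC actually sees."""
    return v_pos - v_neg, (v_pos + v_neg) / 2.0

d_lo, cm_lo = diff_and_cm(SENSOR_MIN)   # bottom of the sensor range
d_hi, cm_hi = diff_and_cm(SENSOR_MAX)   # top of the sensor range

print(f"differential: {d_lo:.1f} V to {d_hi:.1f} V")   # 0.0 V to 2.0 V
print(f"common mode:  {cm_lo:.1f} V to {cm_hi:.1f} V") # 0.8 V to 1.8 V
```

So the differential span comes out to the desired 0-to-2 V, but the common-mode voltage seen by the converter would swing a full volt (0.8 V to 1.8 V) instead of sitting at a fixed Vcm, which is presumably where the Vcm-pin question bites.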
We have an EVM set up and don't mind hacking it for experiments, but there's no point in doing something foolish and destroying the DUT.