The datasheet for this part shows that it has protection diodes on the signal inputs, with a forward voltage (VF) of 0.35V. The absolute maximum ratings say "Input current to any pin except supply: -20mA to 20mA". It also says "Analog input voltage: -0.3V to VDD +0.3V".
I have seen similar absolute maximum ratings before, where both a maximum voltage and a maximum current were given along with a note that some overvoltage is fine as long as the current stays low enough, but that isn't stated explicitly here.
My input signal is in the range of 0V to 0.8V at the times I want to measure it, but between those times it regularly goes down to -0.8V. Currently I plan to protect the ADC from this negative voltage with an additional OPA2376, which I understand is tolerant of over-/undervoltage (its ratings are given in the way I mentioned above), running from a 0V/5V supply so that it can't produce a negative voltage at its output.
However, this has the drawback that I can't measure all the way down to 0V, even though the op amp is rail-to-rail. I don't actually need to measure that far down for my application, but it would be great for easy calibration of the system offset. The extra amplifier also adds its own share of errors and signal delay (not really a big issue, but less of that is always better :))
Do you think it's OK to apply negative voltage to the analog inputs as part of normal operation, as long as I don't exceed the current limit? With the RC network I already plan to place at the input, the current should be at most 1mA under normal circumstances, and only for 100µs out of each millisecond.
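For reference, here is the back-of-envelope arithmetic behind that 1mA figure. The diode forward voltage (0.35V) and the -0.8V excursion are from the datasheet and my signal; the series resistance is my assumption, chosen so the peak current comes out at the stated 1mA worst case:

```python
# Rough check of the current injected through the input clamp diode
# during the -0.8 V excursion. R_SERIES is an assumed value, picked so
# that the peak matches the 1 mA worst case stated above.
V_F = 0.35        # protection-diode forward voltage, V (datasheet)
V_NEG = 0.8       # magnitude of the negative excursion, V
R_SERIES = 450.0  # ohms (assumption; part of the planned RC network)

i_peak = (V_NEG - V_F) / R_SERIES  # current forced through the diode
duty = 100e-6 / 1e-3               # excursion lasts 100 us per 1 ms
i_avg = i_peak * duty              # duty-cycle-averaged injected current

print(f"peak current    = {i_peak * 1e3:.2f} mA")  # 1.00 mA
print(f"average current = {i_avg * 1e6:.0f} uA")   # 100 uA
```

So the peak is a factor of 20 below the 20mA absolute maximum, and the duty-cycle-averaged current is only around 100µA.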