Can someone tell me how much the input bias current changes when the input voltage exceeds a valid range and saturates the gain stage?
I am using the PGA280 in a multi-channel system (8 input channels; 4 PGA280s, each with two input channels, driving four ADC inputs) with input ranges from +/-10V down to +/-50mV, using the PGA280 gain to normalize the input voltages for the subsequent ADC. With this sort of system it is easy to saturate an input; for instance, configure an input for +/-50mV and connect it to a 9V signal. In this condition, the PGA280 appears to draw considerable current on the INP2 pin, much more than on INN2, INP1, or INN1 (I have overdriven both inputs, one at a time, and channel 2 seems much more affected than channel 1). I have 5K resistors in series with the inputs, so there is no chance of exceeding the PGA280 input current limit, but in some fault conditions I measure more than 2 Volts DC across the resistor on the INP2 pin when driving with an isolated signal source (a 9V battery). The INP2 input pulls the isolated signal source toward the positive supply rail.
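For scale, here is a quick back-of-the-envelope estimate (a sketch in Python, using the 2 V drop and 5 kOhm series resistor values from the description above) of the input current that drop implies:

```python
# Estimate the input current implied by the observed voltage drop
# across the series protection resistor on INP2.
V_DROP = 2.0      # volts measured across the series resistor (observed)
R_SERIES = 5e3    # ohms, series protection resistor

i_input = V_DROP / R_SERIES  # Ohm's law: I = V / R
print(f"Implied input current: {i_input * 1e6:.0f} uA")  # 400 uA
```

That is roughly 400 uA, which is orders of magnitude above the normal input bias current of an instrumentation-amplifier input, consistent with the "considerable current" observed on INP2.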
This may not be a problem at the application level, but it shows up when all inputs are tied together and driven with 8 to 10 Volts DC, with one input configured for relatively high gain and the others for a gain of 1/8. In this case the amplitude measured on all of the channels is reduced, apparently because they are all pulled toward the positive rail (this is an isolated system, so excessive input bias current can pull the signal source away from ground).
I am just trying to better understand the behavior and mechanism here so I can determine whether this is a problem the system needs to address, or whether we can ignore it, allowing an overdriven channel to saturate while the other channel still produces valid data.