I am interested in using the ISO224B but would like to know how the input bias current changes with input voltage (not supply voltage). Do you have any charts or numbers you could provide that indicate how the input bias current changes as the input goes from -12V to 12V? I presume the input bias current also increases dramatically as the input voltage approaches "Vclipping".
My application involves measuring a high-voltage (+/- 1.2kV max) signal which needs to be divided down, so the input bias current will impact the measurement due to the divider impedance. I'd prefer not to use an input buffer amplifier, as that would require an additional (negative) supply rail since my input signal is an AC voltage.
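To put rough numbers on that concern, here is a minimal sketch of how the divider's source impedance converts bias current into measurement error. The resistor values, divider ratio, and the 100 nA bias current are illustrative assumptions, not ISO224 datasheet figures:

```python
# Hedged sketch: error caused by input bias current flowing through a
# high-voltage divider. All values below are assumptions for illustration.
R_TOP = 9.9e6    # ohms, top divider resistor (1.2 kV -> 12 V, 100:1 ratio)
R_BOT = 100e3    # ohms, bottom divider resistor
I_BIAS = 100e-9  # amps, assumed input bias current

# Thevenin source impedance seen by the amplifier input pin
r_th = (R_TOP * R_BOT) / (R_TOP + R_BOT)

# Error voltage at the IN pin, and the same error referred back to
# the high-voltage side of the divider
v_err_at_pin = I_BIAS * r_th
v_err_referred = v_err_at_pin * (R_TOP + R_BOT) / R_BOT

print(f"Rth = {r_th:.0f} ohms")                       # ~99 kOhm
print(f"Error at pin = {v_err_at_pin*1e3:.2f} mV")    # ~9.9 mV
print(f"Error at HV input = {v_err_referred:.2f} V")  # ~0.99 V
```

With these assumed values, a 100 nA bias current already produces about 1 V of error referred to the 1.2 kV input, which is why the actual bias-current behavior matters.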
Although input bias current vs. input voltage is not a common graph supplied for op-amps, the nature of the ISO224 makes it particularly important for this part. In a traditional op-amp circuit with feedback and a fixed reference level, the actual voltage at the +/- inputs of the op-amp remains constant, even as the input signal changes. Because this voltage remains constant, the input bias current would also remain fairly constant and could be canceled out by a variety of methods.
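As an aside, the classic cancellation method for that traditional case can be sketched as follows. Placing a resistor equal to R1 || Rf in series with the non-inverting input makes both inputs see the same source impedance, so matched bias currents cancel; the component and current values here are illustrative assumptions, not values from any datasheet:

```python
# Sketch of bias-current cancellation in a conventional inverting op-amp
# stage. Resistor and bias-current values are illustrative assumptions.
R1 = 10e3    # ohms, input resistor
RF = 100e3   # ohms, feedback resistor
IB = 100e-9  # amps, bias current (assumed equal at both inputs)

# Output offset with the + input grounded directly:
v_off_uncomp = IB * RF  # 10 mV

# Compensation resistor equal to R1 || RF in the + input path.
# Its drop appears at the output multiplied by the noise gain and
# cancels the offset from the feedback network.
r_comp = (R1 * RF) / (R1 + RF)
noise_gain = 1 + RF / R1
v_off_comp = IB * RF - IB * r_comp * noise_gain  # ~0 V when currents match
```

This trick only works because the op-amp inputs sit at a fixed voltage and the two bias currents flow in the same direction, which is exactly what the ISO224 topology lacks.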
However, the ISO224 has only a positive input with no external feedback network, so when the input signal changes, the actual voltage level on the input to the ISO224 also changes. Further complicating matters, when the input signal is negative, input bias current can only flow *out* of the IN pin, and when the input signal is greater than the VCAP voltage, input bias current can only flow *into* the IN pin. This makes the input bias current difficult to cancel, so understanding how it changes with input voltage is important.
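To make the direction argument concrete, here is a purely hypothetical sign-only model of the bias-current direction described above (the 4.0 V VCAP value is an assumed placeholder, not a datasheet number). Because the sign of the error tracks the input voltage, no single fixed correction can cancel it at both polarities:

```python
# Hypothetical illustration only, NOT datasheet behavior: the direction
# the input bias current is constrained to, as a function of input voltage.
def ib_direction(v_in, v_cap=4.0):
    """Return the only possible bias-current direction at the IN pin."""
    if v_in < 0:
        return -1   # current can only flow out of the IN pin
    if v_in > v_cap:
        return +1   # current can only flow into the IN pin
    return 0        # between 0 V and VCAP, the direction is unconstrained

# A fixed offset subtraction cannot cancel an error whose sign
# flips with the input polarity:
assert ib_direction(-12.0) != ib_direction(+12.0)
```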
Thanks