LMH6515 Input Offset

Here is my design: the differential input pin V+ is connected to the previous stage's output, and V- is connected to VCM = 1.4 V, which is an AC ground. In my application, the input of the VGA must be DC coupled. This creates a problem: the previous stages may generate a DC offset, so the DC voltage at V+ may differ from VCM. I found that with an input offset of 20 mV the circuit still functions correctly, but when the input offset increases to 70 mV, the circuit stops functioning properly.

So, my question is: how much input DC offset is acceptable to ensure the circuit functions properly?

  • The input to the amplifier is the voltage difference between the inputs, so as the input offset increases, the effective input signal increases. At some point the offset alone is large enough to drive the output to full scale, leaving no headroom for the desired signal; a rough headroom budget is sketched below.
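
    A minimal sketch of this headroom budget, assuming illustrative numbers for the gain setting and usable output swing (GAIN_DB and V_OUT_PP_MAX below are placeholders, not datasheet values; substitute your actual operating conditions):

    ```python
    # Rough check of how much of the output swing the amplified DC offset consumes.
    GAIN_DB = 24.0        # assumed VGA gain setting (illustrative)
    V_OUT_PP_MAX = 2.0    # assumed usable differential output swing, Vpp (illustrative)

    gain_lin = 10 ** (GAIN_DB / 20)

    for v_offset in (0.020, 0.070):      # the 20 mV and 70 mV cases from the question
        v_out_dc = v_offset * gain_lin   # DC offset amplified to the output
        headroom = V_OUT_PP_MAX / 2 - v_out_dc
        print(f"offset {v_offset * 1e3:3.0f} mV -> output DC {v_out_dc:5.2f} V, "
              f"headroom left {headroom:5.2f} V")
    ```

    With these placeholder numbers, the 70 mV case already drives the output past full scale, which matches the behavior described in the question.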

    The other thing to watch is keeping the input and output pins biased within the voltage ranges specified in the datasheet. This example has the output pins biased to a voltage that is not ideal: the output voltage can swing from approximately -3.5 V to 6.4 V. In the datasheet, all application circuits are shown with inductive bias circuits that put the output common mode at 5 V.

    One option to fix this circuit would be to use a higher V+ so that the resistive bias puts the output common mode closer to 5 V; a quick sanity check of that supply voltage is sketched below.
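
    A minimal sanity check of that option, assuming an illustrative per-output bias current and pull-up resistance (I_BIAS and R_PULLUP are placeholders, not datasheet values):

    ```python
    # With pull-up resistors from each output to V+, the output common mode
    # sits at V+ minus the DC drop across the pull-ups.
    V_OCM_TARGET = 5.0   # output common mode used in the datasheet application circuits
    I_BIAS = 0.010       # assumed per-output DC bias current, A (placeholder)
    R_PULLUP = 200.0     # assumed pull-up resistance, ohms (placeholder)

    v_supply_needed = V_OCM_TARGET + I_BIAS * R_PULLUP
    print(f"V+ needed to center the outputs at {V_OCM_TARGET:.1f} V: {v_supply_needed:.1f} V")
    ```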

    On the input side, you could set up an op amp in a servo loop to match the unused input to the DC voltage of the offset input. Set the frequency response of the servo loop to 10 Hz or lower and it will not interfere with the AC signal; the corner-frequency math is sketched below.
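
    A minimal sketch of the corner-frequency math for a single-pole servo loop; the R and C values are chosen only as an illustration:

    ```python
    import math

    # A single-pole RC (or op-amp integrator) servo rolls off at f_c = 1 / (2*pi*R*C).
    # Pick any R, C pair that lands at or below ~10 Hz.
    R = 160e3    # ohms (assumed)
    C = 100e-9   # farads (assumed)

    f_c = 1 / (2 * math.pi * R * C)
    print(f"servo corner frequency: {f_c:.1f} Hz")  # ~10 Hz, below the AC signal band
    ```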