
Why use bidirectional TVS on thermocouple interface?

Hello,

In Figure 1 of the TI reference design TIDU574 (http://www.ti.com/general/docs/lit/getliterature.tsp?baseLiteratureNumber=tidu574&fileType=pdf&keyMatch=TIDU574&tisearch=Search-EN-Everything?), bidirectional TVS diodes are used on the two thermocouple connector pins. What's unclear to me is why they need to be bidirectional. The wires are biased to 3.3 V / 2 and clamped between 0 and 3.3 V, and the ADC analog supply is unipolar.

I have an implementation similar to this where the whole ADC circuit is electrically isolated (inductively). Does that make a difference to this question?

Thanks, Ross

  • I should also have mentioned that the biasing resistors appear not to be drawn correctly in Figure 1 (in case that causes any confusion). Figure 57 (the actual schematic) is correct, though.
  • Ross,

    The design documentation states:

    When designing with thermocouples, it is important to understand that thermocouples are bipolar, which means thermocouples can produce a positive or negative voltage depending on whether or not the measured temperature is higher or lower than the system temperature, respectively.

    So a bidirectional TVS makes sense. Regarding your isolated circuit, what matters is the voltage between AIN0/1 and the TVS ground pin.  I have looped you into the Precision Data Converters Forum for more help related to the ADS1220IRVA.

    Regards,

  • Guy, thanks for the response.

    I appreciate that thermocouples are bipolar (a few tens of millivolts), which is why the sensor is biased to roughly the mid-point of the unipolar 0 to 3.3 V analog supply, about 1.65 V. Since the TVS diodes are referenced to ground, the voltage on both sides of the sensor should remain positive with respect to the TVS diodes (and is in fact held in this range by the clamp diodes); I've put a quick numeric check of this at the end of this post. So it seems to me that there must be some other reason.


    A colleague of mine has suggested the following possible explanation...

    The main reason for using bidirectional TVS parts is that most transient immunity tests in the relevant standards require five pulses of each polarity, and the same applies to insulation co-ordination impulse tests. In the case of the latter there is a further requirement limiting the permissible “leakage current”, and I assume that unidirectional protection, which clamps to one diode drop in the reverse direction, could exceed that leakage current limit. For impulse testing, circuit inductance plays a huge part in the immunity of the product, and ground connections may not look like “ground” to the generator.

    Is this a likely explanation?

    /Ross
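
    P.S. As a sanity check of the biasing argument above, here is a rough sketch in Python. The bias point, thermocouple span and the even split of the differential voltage across the two pins are my assumptions for illustration, not the exact TIDU574 values:

        # Rough check: with the TC pins biased to mid-supply, a thermocouple
        # output of a few tens of millivolts keeps both pins well above the
        # TVS ground reference in normal operation.
        AVDD = 3.3          # unipolar analog supply (V)
        V_BIAS = AVDD / 2   # assumed mid-supply bias from the resistor network (V)
        TC_SPAN = 0.055     # assumed worst-case K-type output, roughly +/-55 mV

        for v_tc in (-TC_SPAN, 0.0, TC_SPAN):
            ain_p = V_BIAS + v_tc / 2   # each pin moves by half the differential voltage
            ain_n = V_BIAS - v_tc / 2
            print(f"V_tc = {v_tc * 1e3:+6.1f} mV -> AINP = {ain_p:.3f} V, AINN = {ain_n:.3f} V")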

  • Hi Ross,

    Your colleague is referring to IEC tests for electromagnetic compatibility (EMC), which cover radiated as well as conducted testing.  Your system may or may not need all of the components shown in the TI Design.

    However, consider that TCs are usually on long lengths of wire that can cover a considerable distance through a harsh environment.  One example may be equipment monitoring in an industrial setting.  This use case usually needs considerable input protection because of the EMI/RFI that is conducted along the TC wiring.  This interference can be many volts and can be either positive or negative relative to the ADC ground.

    Another use case may be a food thermometer, where the wire length is short and fairly well insulated from the environment.  In this case the required input protection may be minimal.  The use case and IEC test requirements for the instrument being developed will determine the amount of protection required.

    You might think that it would always be best to add the maximum amount of protection to the devices being used.  However, this approach has costs beyond the price of the components.  Any additional components added to the analog inputs can drift or create leakage paths that will alter the accuracy of the measurement.  Also, any RC filters added to the inputs will need analog settling time after any change in input voltage (a rough settling-time estimate is sketched at the end of this post).  Usually this is not much of a problem with temperature, as temperature cannot change instantaneously.  However, large resistors and capacitors can add noise and distortion to the measurement.

    So what is the best input protection solution?  It is difficult to say as it depends on so many factors.  There are literally volumes of information on the topic.  One excellent source is Henry Ott.  I often refer to his book "Electromagnetic Compatibility Engineering".

    Best regards,

    Bob B
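
    P.S. Here is the rough settling-time estimate mentioned above, sketched in Python. The resistor and capacitor values are example assumptions only, not values recommended by the TI Design:

        import math

        # Settling to within 1/2 LSB of an N-bit converter takes about
        # ln(2^(N+1)) time constants of the input RC filter.
        R = 1_000        # assumed series resistance per input (ohms)
        C = 100e-9       # assumed filter capacitance (farads)
        N_BITS = 24      # ADS1220 resolution

        tau = R * C                           # one time constant (s)
        n_tau = math.log(2 ** (N_BITS + 1))   # time constants for 1/2-LSB settling
        t_settle = n_tau * tau

        print(f"tau = {tau * 1e6:.0f} us, ~{n_tau:.1f} tau -> settle in ~{t_settle * 1e3:.2f} ms")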