
Power-supply error, calibration, and the TMP112?

Other Parts Discussed in Thread: TMP112, PCA9306

I am currently planning to use the TMP112 on 3.3V in a dual-supply (2.2V and 3.3V) application (a thermostat, to be precise). This, however, requires a PCA9306 to translate the I2C bus levels. I noted the formula in the TMP112 datasheet for computing the error caused by running the TMP112 from a supply other than 3.3V; however, if the device is calibrated at 2.2V and 25°C, will that "calibrate out" the readout error caused by running the sensor at 2.2V? Also: what effect would the SOT563's rather high theta-JA of 260+ K/W have on temperature-sensor accuracy and response? Would it simply cause the sensor to be slower to react to changes in temperature? (I'm using it to read the ambient air temperature here, of course :).

  • Lucas,

    I am currently looking into this and will get back to you shortly.

  • Lucas,

    The calibration method in the datasheet is a baseline to start with. It can still be applied if you deviate from VSupply = 3.3V, but please be aware that additional error is introduced as the supply voltage moves away from the specified 3.3V.

    Please see the attached PowerPoint I have put together regarding calibration of the TMP112. The presentation assumes a supply of VSupply = 3.3V; if you deviate from VSupply = 3.3V there will be additional error. Let me show this with an example.

    Example:

    If you calibrate the TMP112 at room temperature using VSupply = 3.3V, the temperature error at that point will be zero. If you then turn the supply down to VSupply = 2.2V, that is a deviation of 1.1V from VSupply = 3.3V. Using the typical error-vs-supply value from the datasheet, the additional error is:

    additional error = (0.0625 °C/V) × (1.1 V) = 0.06875 °C

    Now, instead of zero error at room temperature, we expect an error of (0 + 0.06875) °C = 0.06875 °C. This typical error has to be added to every point on each slope across temperature. (A short firmware sketch of this calculation appears at the end of this reply.)

    To address your concern about theta-JA: yes, the response to ambient temperature changes is slowed by the packaging of the device. The TMP112 is designed to measure the temperature of a printed circuit board very accurately. The ground pin is the main thermal conduction path of any temperature-sensor IC, and it is most commonly connected to the ground plane of the printed circuit board so that the device measures the board temperature. If you use the TMP112 to measure the ambient temperature, it will appear to lose accuracy because it is really measuring the PCB temperature, which is usually warmer than the ambient air. How much warmer will, of course, depend on which components are on the PCB and how much they heat the board.

    TMP112 Calibration.ppt
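
    For anyone who wants to fold this into firmware, here is a minimal C sketch of the error estimate above. It is only an illustration: the constant and function names are placeholders, and the 0.0625 °C/V figure is the typical error-vs-supply value quoted in this thread, so check it against your datasheet revision.

        #include <stdio.h>

        /* Typical error vs. supply voltage quoted in this thread (check your datasheet). */
        #define SUPPLY_SENS_C_PER_V  0.0625f
        /* Supply voltage at which the one-point calibration was performed. */
        #define CAL_SUPPLY_V         3.3f

        /* Estimated additional temperature error (in degrees C) when the
         * TMP112 is operated at supply_v instead of the calibration supply. */
        static float tmp112_supply_error_c(float supply_v)
        {
            float delta_v = supply_v - CAL_SUPPLY_V;
            if (delta_v < 0.0f)
                delta_v = -delta_v;
            return SUPPLY_SENS_C_PER_V * delta_v;
        }

        int main(void)
        {
            /* Calibrated at 3.3 V, operated at 2.2 V: 0.0625 * 1.1 = 0.06875 C */
            printf("additional error at 2.2 V: %.5f C\n", tmp112_supply_error_c(2.2f));
            return 0;
        }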
  • I take it that if I calibrate the TMP112 at room temperature with VSupply = 2.2V, it will have zero error at that point, and then show the same 0.06875 °C (typical) error you mentioned when the supply is turned up to 3.3V, correct? Also: is there another part you would suggest for 0.5 °C accuracy (max over 0-50 °C, although achieving this after a one-point calibration at 25 °C would be acceptable) in ambient-temperature measurement? Or will the TMP112 suffice, provided that I lay out the board for maximum thermal isolation between the TMP112 and the other components? (I was planning to put the TMP112 in a lower corner of my board with its GND pin tied to an exposed-copper ground area that connects back to the main board ground fill via a narrow trace, and to use similarly narrow traces for VSupply, SCL, and SDA, as suggested by ADI appnote AN892.)

  • Lucas,

    The layout scheme you presented sounds very reasonable. The TMP112 will be the best option since it has very good PSRR. Calibrating at room temperature with a supply of V = 2.2V will result in 0 °C error at that point, assuming the PSRR term is accounted for in the calibration. In other words, if your room-temperature calibration at V = 2.2V includes the (error at V = 3.3V) ± PSRR × (delta_supply) term, and you later change the supply back to V = 3.3V, that extra ± PSRR × (delta_supply) term must be removed again so the correction remains valid at the new condition.
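
    Here is a minimal C sketch of that offset bookkeeping, under the same assumptions as the snippet above: placeholder names, the typical 0.0625 °C/V figure from this thread, and a sign for the supply term that you would need to characterize on your own units, since the datasheet specification is a ± value.

        /* Typical error vs. supply (PSRR) figure quoted in this thread. */
        #define PSRR_C_PER_V  0.0625f

        /* Apply a one-point calibration captured at cal_supply_v to a reading
         * taken at supply_v.  cal_offset_c is the error measured at room
         * temperature during calibration; the supply term is stacked on top of
         * it and drops out again whenever supply_v == cal_supply_v. */
        static float tmp112_corrected_c(float raw_temp_c, float cal_offset_c,
                                        float cal_supply_v, float supply_v)
        {
            /* Sign of the supply term is device-dependent (the spec is +/-);
             * characterize it on your hardware before relying on it. */
            float supply_term_c = PSRR_C_PER_V * (supply_v - cal_supply_v);
            return raw_temp_c - (cal_offset_c + supply_term_c);
        }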