MSP430F5638 temperature sensor accuracy

My customer is finding that the internal temperature sensor of an MSP430F5638 consistently reads about 5°C different from the actual measured temperature. When a number of consecutive readings are taken, there's around 1 degree of variation between them. The readings are compensated using the formula in section 1.13.5.3 of the User's Guide (slau208m) and the calibration values from the TLV structure. Is this sort of accuracy all that can be expected from the MSP430F5638, or is there something wrong here? I understand it's not unheard of for the calibration values to be incorrect. The device markings indicating the date and batch codes are:

1BARHRTG4
430F5638
REV D

A vaguely related observation is that there are no REF calibration values in the Device Descriptor (TLV) Table for the MSP430F563x family. Is the reference on these devices trimmed so that there's no need for calibration in software? Just curious!
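
For reference, the compensation applied is essentially the two-point interpolation from slau208 (a sketch; CAL_ADC_15T30/85 stand for the 1.5V-reference calibration readings from the TLV):

    // slau208 sect. 1.13.5.3: interpolate between the 30 and 85 degC cal points
    long temp_c = ((long)adc_raw - CAL_ADC_15T30) * (85 - 30)
                  / (CAL_ADC_15T85 - CAL_ADC_15T30) + 30;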

Thanks,
John

  • The temperature sensor result is the DIE temperature, not the ambient temperature. The difference between the two depends on the current power dissipation and the package's thermal resistance.
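
    To get a feel for the magnitude (all numbers hypothetical):

        // e.g. theta_JA ~ 50 degC/W and 20 mW of dissipation:
        float t_die = t_ambient + 0.020f * 50.0f;   // die runs ~1 degC above ambient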

    john5123 said:
    When a number of consecutive readings are taken, there's around 1 degree of variation between them.

    Does your customer obey the required minimum sampling time (30µs or so)?
    Temperature is an average value of atomic motion, so on a microscopic scale it has quite some noise (that's why water can evaporate at room temperature). For a stable reading, some averaging is required, which is taken care of (mostly) by the long sampling time.

    The TLV should contain calibration values, unless you have pre-production sample parts (XMS430x).

  • Yes, we've looked at the temperature sensor sampling time. In fact, the specification for the MSP430F563x family is a minimum of 100µs, and I believe they tried as much as double that without it improving the accuracy. There's nothing I can see in the specifications to indicate whether or not 1 degree of uncertainty is reasonable for this device, though the settling time is supposed to be adequate to give <1 LSB of conversion error.

    The MSP430F5638 Device Descriptor Table does contain 30°C and 85°C temperature sensor calibration values for each reference voltage value (1V5, 2V, 2V5), but unlike some other MSP430F5xxx devices, the documented TLV structure for MSP430F563x doesn't have anywhere for REF calibration parameters. I see that the device descriptor table for some MSP430F5xxx devices has Reference Calibration factors defined as 0xFF (in the datasheet), which I presume must mean the reference voltage has been factory trimmed for accuracy. The MSP430F51x2 datasheet (SLAS619I) is an example of this.

    It appears to be fairly standard for the nominal 30°C and 85°C calibration temperatures to be allowed ±3°C of tolerance, which looks like the primary limitation on accuracy when the internal temperature sensor is used. Worst case, if each calibration temperature is off by 3°C and in opposite directions (+ & -), the accuracy could be significantly out at the temperature extremes.
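
    As a back-of-envelope check on that worst case (numbers are hypothetical):

        // If the 30/85 degC points were really 27/88 degC, the computed value
        // becomes T' = 30 + (T - 27) * 55/61, so at the cold extreme:
        float t   = -40.0f;
        float err = 30.0f + (t - 27.0f) * (55.0f / 61.0f) - t;   // ~ +9.6 degC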

  • john5123 said:
    we've looked at the temperature sensor sampling time. In fact, the specification for the MSP430F563x family is a minimum of 100µs,

    Sure? My datasheet (slas650B) lists a minimum of 30µs. However, more is better (or at least not worse).
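
    For what it's worth, a setup along the lines of TI's temperature-sensor examples, stretched to roughly 100µs of sampling (register names from the device header; the ~4.8MHz figure is the typical ADC12OSC frequency):

        REFCTL0 &= ~REFMSTR;                              // hand reference control to ADC12_A
        ADC12CTL0 = ADC12SHT0_10 + ADC12REFON + ADC12ON;  // 512 cycles =~ 107us @ 4.8MHz
        ADC12CTL1 = ADC12SHP;                             // pulse sample mode
        ADC12MCTL0 = ADC12SREF_1 + ADC12INCH_10;          // VREF+ ref, temp sensor channel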

    john5123 said:
    There's nothing I can see in the specifications to indicate whether or not 1 degree of uncertainty is reasonable for this device,

    The test condition for these 30µs is "error of conversion result <= 1 LSB". Now, how many degrees 1 LSB is, is a different story (with the 1.5V reference, it would be about 0.163°C).
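
    To show where that figure comes from (assuming the ~2.25mV/°C sensor slope from the datasheet):

        // 1.5 V reference, 12-bit ADC:
        float mv_per_lsb   = 1500.0f / 4096.0f;    // ~0.366 mV per count
        float degc_per_lsb = mv_per_lsb / 2.25f;   // ~0.163 degC per count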

    john5123 said:
    the documented TLV structure for MSP430F563x doesn't have anywhere for REF calibration parameters.

    Indeed, the TLV structure description doesn't list calibration values for the reference. Now, the 563x has a base error of only ±1%, while e.g. the 5438 (which has calibration values) has ±1.5%, and its center voltage is not exactly 1.50/2.00V. So maybe the better base precision has made the additional calibration obsolete (especially since the calibration values have limited precision too).
    You may do a calibration on your own, e.g. by measuring an exactly known and heavily filtered VCC with the internal reference and calculating the error. Or alternatively, use a precision voltage on any analog input. Likely, this gives you a better calibration than you could ever expect from the factory (except for high-priced devices).
    If you have external circuitry, you'll need to calibrate it anyway.
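
    A minimal sketch of such a one-point gain calibration (adc_read() and the 1.200V source are placeholders for whatever you have):

        // measure a known precision voltage against the 1.5 V internal reference
        unsigned int raw      = adc_read(PRECISION_INPUT);  // hypothetical helper
        float ideal           = 1.200f * 4096.0f / 1.5f;    // expected count (~3277)
        float gain_correction = ideal / (float)raw;         // apply to later readings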

    john5123 said:
    I see that the device descriptor table for some MSP430F5xxx devices has Reference Calibration factors defined as 0xFF (in the datasheet), which I presume must mean the reference voltage has been factory trimmed for accuracy. The MSP430F51x2 datasheet (SLAS619I) is an example of this.

    Not necessarily. The datasheet lists the reference base precision as ±1.5%, like on most other devices with calibration values. It may mean that this device comes without calibration values but the TLV structure has them planned (so the default 0xFFFF is used), or maybe the TLV description should read 'per device' and it is a typo, or a leftover from an initial revision where the reference values were planned but not filled (e.g. in the preliminary version of the datasheet for the pre-production XMS430 devices).
    Unless you have a production device and these positions indeed contain 0xFF, I'm suspicious here.

    john5123 said:
    It appears to be fairly standard for the nominal 30°C and 85°C calibration temperatures to be allowed ± 3°C of tolerance,

    My guess is that this is a result of the calibration method. Basically, the die has to be put into a climate chamber and heated to the two temperatures, and the time required to bring it to the exact temperature is rather long. TI wouldn't produce enough devices (or would need to build some more clean-room halls) if they waited for each one to reach the exact temperature for sure. And time is money. Their time, your money. So I think the ±3°C isn't due to the temperature sensor itself but rather due to the ambient/die temperature error at calibration time.

    If you need higher accuracy and have the time and equipment, you can calibrate it yourself. Note that the temperature sensor measures die temperature, not case temperature. Applying a heat sink (or rather a heat source in this case) helps, but internal power dissipation and die/case/ambient thermal resistance need to be taken into account for the final tenth of a degree.

  • The F5438A has 0xFFFF for the ADC12 offset value in the TLV (at 0x1A18), but the other values seem valid (1248 for the offset). They do NOT, however, give correct results for the temperature.

    Using the offset in the TLV (1248) doesn't give readings that are even remotely close to the actual value. Through experimentation, an offset of 1907 does give better results.

    Does this calculation look correct for degrees F?

        // cast to long first: 16-bit int math would overflow in (reading - Voffset) * 1224
        long degF = ((long)(reading - Voffset) * 1224) / 4096 + 32;   // Voffset = 1907 (found empirically)

    What is the recommended way to calibrate the temperature sensor?

  • The offset and gain values in the TLV are for voltage conversion. Sure, a temperature reading is a voltage conversion too, but you don't know the offset and gain of your voltage source (the temperature sensor), so the offset and gain of the ADC won't help you at all.

    If you do a two-point calibration of the temperature sensor, you don't need the ADC offset and gain values at all, as you already have the results. This is why there are (normally) two values for the temperature sensor, with a set for each of the available reference voltages.

    If you know the ADC reading at 30°C and at 85°C, you know the number of ADC counts per °C, independent of any ADC offset or gain or reference precision.
    And once you know this, you can count down from the 30°C reading to get the offset for 0°C.
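
    A minimal sketch of that arithmetic (variable names are mine; the CAL values come from the TLV):

        // counts per degC from the two cal points, then extrapolate down to 0 degC
        float counts_per_degc = (CAL_ADC_15T85 - CAL_ADC_15T30) / 55.0f;
        float reading_at_0c   = CAL_ADC_15T30 - 30.0f * counts_per_degc;
        float temp_c          = (adc_raw - reading_at_0c) / counts_per_degc;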

    If you don't have the values, you can determine them yourself. Check the reading at room temperature (and know the room temperature), then put the device into an oven at ~80°C (and use a thermometer to get the exact temperature), and you're done. That's actually what TI does at production time.
