We are using TMP117 sensors in some instrumentation. Customer requirements demand that we verify the performance of these sensors.
Unfortunately, they seem to have significantly more error than the datasheet guarantees.
We are using the BGA package, and the TMP117 is the only part on the PCB: quite literally a board with wires extending from it for power and I2C.
We have four of these sensors buried in a copper slug, along with two calibrated Amphenol IRTD-400 SPRTs. The slug is then placed in a liquid bath calibrator. Probe #1 was calibrated roughly 9 months ago; Probe #4 was calibrated within the past 30 days. At calibration, its worst point was 0.008 degrees off, so Probe #4 is quite trustworthy. Amphenol's NIST-traceable uncertainty is in the single-digit millikelvin range. Probes #1 and #4 are also reading within 0.030 degrees of each other, so my confidence in the measurement is high.
The TMP117 sensors are all reading high, by between 0.0864 and 0.2193 degrees, with 3 of the 4 exceeding the datasheet's 0.1 °C accuracy limit.
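For reference, this is how we convert the raw temperature result register to degrees, in case the discrepancy could somehow originate there (a minimal sketch of the standard conversion described in the TMP117 datasheet: 16-bit two's complement at 7.8125 m°C per LSB; the function name is ours):

```python
def tmp117_raw_to_celsius(raw: int) -> float:
    """Convert the TMP117 temperature result register (0x00) to degrees C.

    The register holds a 16-bit two's-complement value with a resolution
    of 7.8125 m deg C per LSB, per the TMP117 datasheet.
    """
    if raw & 0x8000:        # sign bit set: negative temperature
        raw -= 0x10000
    return raw * 0.0078125

# Sanity checks against datasheet example codes:
# 0x0C80 -> 25.0 C, 0xFFF8 -> -0.0625 C
```

We apply no additional scaling or offset beyond this, so the errors above are straight register readings versus the SPRTs.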
We haven't even gotten to testing the extremes of the temperature range yet - this is purely a ~room temperature test.
Our test setup logs the stability of the IRTD-400 probes, and as the data show, the temperature of the liquid bath remained very stable for 30 minutes before these values were collected, giving everything plenty of time to thermally equalize.
Can TI shed any light on this? Are the sensors calibrated before or after packaging? Does the BGA package suffer from reflow-induced shift? Other than the comments in the datasheet about accuracy, is there anything we can do to maintain the datasheet performance of these sensors?