
TMS320F28069: ADC Accuracy over temperature

Part Number: TMS320F28069

I'm using ADCINA6 to measure an external temperature.  The converted value seems to shift considerably as the ambient temperature varies.

Setup

  • Internal BG reference (ADCREFSEL = 0)
  • VREFLO tied to GND
  • SOC6 CHSEL = 6
  • SOC6 ACQPS = 15
  • SOC6 TRIGSEL = 1 (CPU Timer 0)
  • CPU Timer 0 configured for a 100 us period and enabled
  • CPU Timer 0 ISR reads the ADC value
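For reference, a sketch of the setup above using the F2806x controlSUITE header-file register names (the 90 MHz SYSCLK figure and the ConfigCpuTimer() helper are assumptions about the rest of my project):

```c
// Sketch of the configuration described above (F2806x controlSUITE
// register names; 90 MHz SYSCLK and ConfigCpuTimer() helper assumed).
EALLOW;
AdcRegs.ADCCTL1.bit.ADCREFSEL  = 0;   // internal bandgap reference
AdcRegs.ADCSOC6CTL.bit.CHSEL   = 6;   // SOC6 converts ADCINA6
AdcRegs.ADCSOC6CTL.bit.ACQPS   = 15;  // 16-cycle acquisition window
AdcRegs.ADCSOC6CTL.bit.TRIGSEL = 1;   // trigger source: CPU Timer 0 (TINT0)
EDIS;

ConfigCpuTimer(&CpuTimer0, 90, 100);  // 90 MHz SYSCLK, 100 us period
CpuTimer0Regs.TCR.bit.TSS = 0;        // start the timer
// The Timer 0 ISR then reads AdcResult.ADCRESULT6.
```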

No periodic re-calibration is being performed (I tried running the AdcOffsetSelfCal() routine periodically, but that did not seem to help).

At ambient temperatures of ~15C and below, I'm seeing around a 50-count (0.042V) difference compared to when the ambient temperature goes above ~18C.  I'm trying to interpret Table 6-27 from SPRS698G to see if this is expected.

Here is the table from the document, along with my attempt to convert it to volts (3.3V scale) on the right for reference:

                               min    typ    max    unit       converted to V (3.3 V scale)
offset error                   -20            20    LSB        -0.01611 to +0.01611 V
overall gain error (int ref)   -60            60    LSB        -0.04834 to +0.04834 V
temp coeff (int ref)                  -50           ppm/C      -0.00017 V/C (typ)

Is this saying that I can expect +/- 60 counts (+/- 0.04834 V) of error due to the gain error, and that my ~50-count (~0.042V) difference is therefore within spec?  Or is there something else I should do to re-calibrate and get more consistent results across temperatures?  I don't have a reference voltage tied to any ADC input that I could use to check for an offset.

  • Anthony,

    Can you describe the signal conditioning for the temperature sensor?  Is it buffered?  Are there filtering components between the sensor and the ADC-A6 pin?  Larger ACQPS values may help if the sensor is not buffered.

    The datasheet specs apply over supported operating conditions.  Across temperature, you should expect the ADC Gain Error to be within +/- 60LSBs of ideal.  What you have observed (50LSB of shift) may fall within the +/- 60LSB error because it technically allows for up to 120LSBs of shift around ideal.

    However, if we consider the Typical Temp Coefficient of -50 ppm/C, something else may be contributing to your error.  The Typical parameter is not a guaranteed bound like the Min and Max parameters, but it does indicate how we expect an average device to behave.  With -50 ppm/C and an operating range from -40C to 125C, a typical device might see about (-50 / 1e6) x 165C x 4096 LSB = -34 LSB of full-scale shift over temperature.  You are reporting a greater shift over a smaller temperature range, so it is worth some additional debugging.

    Something I would recommend is enabling the ADC late interrupt and reading the results from the ADC ISR.  This will help to ensure that you are reading the results of the most recent trigger, rather than the result from the last trigger from 100us prior.

    After this is in place, I recommend configuring all 16 SOCs to convert the ADC-A6 temperature sensor using a single trigger so that you can see if the continuous values are wildly fluctuating or if there is some noticeable pattern.
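    A sketch of what that could look like with the controlSUITE header files (register names per F2806x_Adc.h; round-robin SOC priority and your existing Timer 0 setup are assumed):

```c
// Sketch: every SOC converts ADCINA6 from the same Timer 0 trigger, and a
// "late" (end-of-conversion) ADCINT1 fires after the last SOC completes.
EALLOW;
AdcRegs.ADCCTL1.bit.INTPULSEPOS = 1;   // interrupt pulse at end of conversion (late)
AdcRegs.INTSEL1N2.bit.INT1SEL   = 15;  // ADCINT1 on EOC15 (last of the 16)
AdcRegs.INTSEL1N2.bit.INT1E     = 1;   // enable ADCINT1

AdcRegs.ADCSOC0CTL.bit.CHSEL    = 6;   // ADCINA6
AdcRegs.ADCSOC0CTL.bit.ACQPS    = 15;
AdcRegs.ADCSOC0CTL.bit.TRIGSEL  = 1;   // CPU Timer 0
/* ...repeat CHSEL/ACQPS/TRIGSEL for ADCSOC1CTL through ADCSOC15CTL... */
EDIS;
```

    The ADCINT1 ISR can then read AdcResult.ADCRESULT0 through AdcResult.ADCRESULT15 and compare the 16 back-to-back samples for noise or a pattern.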

    -Tommy

  • Anthony,

    It has been a while since your last update.  I assume that you were able to resolve your issue.

    If not, please reply to this thread.  If the thread has locked due to timeout, please create a new thread describing the current status of your issue.

    -Tommy

  • Sorry I haven't updated the thread.  The signal is buffered -- there is an RC filter at the input to the processor (4.99 kOhm and 2.2 uF), with the capacitor at the processor pin.

    We ended up increasing the ACQPS value to 25, and the issue seems to have gone away (we are still testing across the full temperature range).  So it appears the buffering was not sufficient.