MSP430F2618: What's the calibrated ADC12 accuracy? Where is this documented?

Part Number: MSP430F2618
Other Parts Discussed in Thread: MSP430F5359

We're currently using the MSP430F2618 and transitioning to the MSP430F5359. I've read the respective data sheets for both of these products, but if I'm reading them correctly, their ADC12 sections specify the ADC accuracy in terms of its uncalibrated accuracy. But these MSP430s contain calibration data in Flash Information Segment A, right? And I believe we're using that data.

Is there documentation that talks about the calibrated accuracy of the ADC12? And which of the various ADC12 accuracy parameters are affected by the calibration factor(s)?
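
For reference, the correction I understand those Segment A factors to imply is a Q15 gain multiply followed by a signed offset add, roughly like the sketch below. The parameter names are placeholders rather than the actual TLV field names, so please read it as an illustration of my understanding, not our exact code:

    /* Sketch of the gain/offset correction I believe the Segment A factors imply.
     * gain_factor: Q15 value, 0x8000 == 1.0 (hence steps of 1/32768 ~ 30 ppm).
     * offset:      signed correction in ADC counts, added after the gain fix-up. */
    static unsigned int adc12_apply_cal(unsigned int raw,
                                        unsigned int gain_factor,
                                        int offset)
    {
        /* Gain first: multiply by the Q15 factor, then drop the 15 fraction bits. */
        long corrected = ((long)raw * gain_factor) >> 15;

        /* Then the offset, clamped back into the 12-bit result range. */
        corrected += offset;
        if (corrected < 0)      corrected = 0;
        if (corrected > 0x0FFF) corrected = 0x0FFF;

        return (unsigned int)corrected;
    }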

(And if I just missed an Application Note or some such, please don't be shy about telling me to go read that! ;-) )

  • Hi Atlant,
    I checked the data sheet, and it doesn't mention the values being uncalibrated; those values should be for a calibrated ADC. What's giving you the impression that it's uncalibrated?
  • Cameron:

    > I checked the data sheet, and it doesn't mention the values being uncalibrated, those
    > values should be for a calibrated ADC. What's giving you the impression that it's
    > uncalibrated?

    (I apologize for the delay in responding! I had to spend significant time working on our
    older MSP430F2618-based product but yesterday, I was finally back working on the
    ADC12 subsystem in both families.)

    I guess what gave me this impression is that the only part of the Data Sheet that mentions using
    the TLV Calibration Data is the temperature sensor associated with the ADC12.

    All of the other paragraphs dealing with either the ADC12 or VRef are silent on the topic of calibration.
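
    (As I understand that temperature-sensor section, it's a two-point interpolation between ADC
    readings taken at 30°C and 85°C during production test. A sketch, with placeholder names for
    the two stored points, just to show the flavor of the TLV usage the Data Sheet does describe:)

        /* Sketch of the two-point temperature correction as I read it
         * (1.5-V reference case). adc_at_30C / adc_at_85C stand in for the
         * two readings stored in the Segment A TLV structure. */
        static long adc12_to_degC(unsigned int raw,
                                  unsigned int adc_at_30C,
                                  unsigned int adc_at_85C)
        {
            /* Linear interpolation: 55 degC spanned between the two cal points. */
            return ((long)raw - (long)adc_at_30C) * (85L - 30L)
                   / ((long)adc_at_85C - (long)adc_at_30C) + 30;
        }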

    And when one looks at, say, the data for REF:

    One sees VRef+ specified as +/-1%. Given that the calibration factor is specified in terms
    of parts in 32768 (~30 ppm), I'd have expected the calibrated accuracy of the Voltage
    Reference to be better than +/-1% (+/- 10,000 ppm). Instead, it should be about whatever
    the accuracy of the instrumentation was in the chip tester. (I used to work for Teradyne, BTW,
    and I'm proud of the accuracies we could achieve; for stuff like DC voltages, they were 'way
    better than +/-1%, being calibrated back to an HP 5-1/2 digit systems voltmeter. ;-) )
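
    (Concretely, here's how I'd expect the stored reference factor to be applied when converting
    a code to millivolts. The parameter name is a placeholder for the TLV field, and the scaling
    reflects my reading of it, so treat this as a sketch only:)

        /* Sketch: raw 12-bit code -> millivolts using the internal 1.5-V reference.
         * vref_factor stands in for the stored reference factor (Q15, where
         * 0x8000 == 1.0, i.e. steps of 1/32768 ~ 30 ppm). */
        static unsigned int adc12_to_mV_1v5(unsigned int raw, unsigned int vref_factor)
        {
            /* Scale the code by the measured-reference factor ... */
            unsigned long code = ((unsigned long)raw * vref_factor) >> 15;

            /* ... then convert against the nominal 1.5-V full scale. */
            return (unsigned int)((code * 1500UL) / 4096UL);
        }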

    The same thing comes up when I look at the ADC12 specs:

    The TLV contains calibration factors for the ADC12's Gain and Offset and again, these are
    specified in terms of parts in 32768 (~30 ppm). But Offset error is specified as +/-5.6 LSBs
    (+/- 0.14%) max and Gain error is specified as +/- 2.5 LSBs (+/- 0.06%) max. I know we're
    starting to push up against instrumentation accuracy limits here, but even these might have
    been calibrated to tighter limits.
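
    (For completeness, the arithmetic behind that expectation, written out as constants:)

        /* Back-of-the-envelope numbers (12-bit converter, 4096 codes full scale). */
        const float gain_step_ppm  = 1.0e6f / 32768.0f;        /* one factor step ~ 30.5 ppm    */
        const float gain_step_lsb  = 4096.0f / 32768.0f;       /* ... ~ 0.125 LSB at full scale */
        const float gain_err_pct   = 100.0f * 2.5f / 4096.0f;  /* spec max ~ 0.06 %             */
        const float offset_err_pct = 100.0f * 5.6f / 4096.0f;  /* spec max ~ 0.14 %             */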

    It's okay if the Data Sheet specs really are the calibrated values, but I guess I'd like to see
    the Data Sheet state that for both the ADC12 and VREF peripherals rather than simply staying
    silent. And I'm asking, in part, because my Systems Engineer is pushing me on "Just how
    accurate are these ADC channels?" so I really need to be able to tell him, "Yes, what the
    Data Sheet says is the best we can do" or "No, using the TLV calibration factors, we can
    do x-amount better".

    Atlant

  • Atlant,
    I see your point. I'll double-check with our systems team. In fact, you may be right: I noticed that the data sheet for one of our FRAM parts mentions it was tested without TLV calibration.
  • Cameron:

    Thanks! If the ADC can be more accurate than what's stated in the data sheet, you'll make our System Engineer (and our Controls Engineers who are the ultimate customers of the ADC data) very happy. ;-)
  • Hi Atlant,
    Our systems team confirmed that these values are tested uncalibrated. My mistake. We don't have a spec for the improvement you would see with the calibration implemented, which is unfortunate.
  • Cameron:

    > Our systems team confirmed that these values are tested uncalibrated.

    Thanks! That makes a lot more sense.

    > We don't have a spec for the improvement you would see with the calibration implemented, which is unfortunate.

    This would actually be tough to specify because it would depend upon the
    prober keeping the wafer-under-test at a nice constant temperature (e.g.,
    30°C) and even then cal factors would only be good for that one temperature.
    And the accuracy spec would have to consider the accuracy and repeatability
    of the instruments in the chip tester as well as the repeatability of the ADC12
    and VREF systems.

    Not impossible, but not trivial either.

    Thanks for your help — I'll mark the issue as "Resolved".
