
ADC calibration problem

Other Parts Discussed in Thread: TMS570LS3137, HALCOGEN

I'm trying to calibrate the ADC of the Hercules evaluation board for the TMS570LS3137.
I have configured the ADC to use 12-bit values, but I get 10-bit values back from the ADCALR register instead of 12-bit values:
- low: 0
- middle-low: 426
- middle-high: 597
- high: 1023

Figure 18-10 in the technical reference manual (SPNU499) shows that ADCALR is only 10 bits wide. Which is correct?

In addition, I have a conceptual question: the correction I can configure in ADCALR is just an offset, independent of the analogue value being converted. But as Figure 18-11 in the TRM shows, the real transfer function does not have just a constant offset relative to the ideal transfer function. Can the offset "correction" therefore introduce more error than it removes? I think I have to set ADCALR to 0 after calibration and then correct the converted values myself.
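If one does decide to correct in software, a two-point (offset plus gain) correction is a common approach. The sketch below is illustrative only: the reference codes and measured readings are hypothetical values, not figures from this thread or from SPNU499, and the function name is invented.

```c
#include <stdint.h>

/* Hypothetical two-point correction. Two raw readings are taken at known
 * reference inputs; a linear map then corrects both offset and gain error,
 * which a pure ADCALR offset cannot do. All constants are illustrative. */

#define CAL_LOW_KNOWN    200   /* ideal 12-bit code at the low reference  */
#define CAL_HIGH_KNOWN  3900   /* ideal 12-bit code at the high reference */

static uint16_t cal_low_raw  = 190u;   /* measured at the low reference  */
static uint16_t cal_high_raw = 3880u;  /* measured at the high reference */

/* Map a raw 12-bit result onto the ideal transfer function. */
uint16_t adc_correct(uint16_t raw)
{
    int32_t num = (int32_t)(raw - cal_low_raw)
                * (CAL_HIGH_KNOWN - CAL_LOW_KNOWN);
    int32_t den = (int32_t)(cal_high_raw - cal_low_raw);
    int32_t corrected = CAL_LOW_KNOWN + num / den;

    if (corrected < 0)    corrected = 0;       /* clamp to 12-bit range */
    if (corrected > 4095) corrected = 4095;
    return (uint16_t)corrected;
}
```

With the sample constants above, a raw reading of 190 maps back to the ideal code 200 and a raw 3880 maps to 3900, so both calibration points land on the ideal line.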

  • Hi Daniel,

    Thanks for posting your query. We will get back to you as soon as possible.

    Best Regards
    Prathap

  • Hi Daniel

    ADCALR is documented as 12 bits wide in SPNU499.


    Yes, the correction is an offset value independent of the analogue input.
    The calibration routine measures the offset error that can arise from various factors. This signed offset value is then automatically added to the ADC result at the end of each conversion.
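    A minimal sketch of that mechanism, under stated assumptions: a plain variable stands in for the memory-mapped ADCALR register (in a real project the HALCoGen register header provides the actual address), and the self-test readings and their ideal codes are hypothetical values. The mean error of the calibration conversions is computed and its signed value written into the register's 12-bit field.

    ```c
    #include <stdint.h>

    static volatile uint32_t fake_adcalr;  /* stand-in for the real ADCALR */

    /* Average the error of four self-test conversions against their ideal
     * codes and store the signed correction. On the real device, this
     * stored value is then added to every subsequent conversion result. */
    int32_t adc_offset_calibrate(const uint16_t raw[4], const uint16_t ideal[4])
    {
        int32_t err = 0;
        int i;
        for (i = 0; i < 4; i++)
            err += (int32_t)ideal[i] - (int32_t)raw[i];
        err /= 4;                             /* mean offset error      */
        fake_adcalr = (uint32_t)err & 0xFFFu; /* 12-bit signed field    */
        return err;
    }
    ```

    Note that this removes only a constant offset; any gain or nonlinearity error, as raised in the question, is untouched by this correction.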

    For reference, you can check the HALCoGen 3.00 ADC driver calibration API.