
MSP430F4250: DAC self-calibration

Part Number: MSP430F4250

Hello,

The MSP430F4250 microcontroller has a built-in DAC with an offset calibration feature. According to the datasheet, calibration takes at most 32 ms when the amplifier setting (DAC12AMPx) = 5. I observed that it takes approximately 11 ms in my case. Also, the output voltage during calibration is almost zero.

1) Internally, how is the DAC offset calibration implemented? I am curious about the technique behind it.

2) Will the time taken for calibration vary widely between runs on the same microcontroller with the same firmware? Say, 3 ms in one instance and 11 ms in another?

(During development of the firmware, I noticed that the controller reset frequently. I identified the cause to be the Watchdog Timer: I had kept the WDT running with a 4 ms overflow period but had not added any code to service it while DAC calibration was underway (~11 ms).

HOWEVER, the resets were not consistent; the uC worked properly about half the time. How can this be? The way I see it, this is only possible if DAC calibration took less than 4 ms during the instances when the uC worked properly.)

This is how I implemented DAC calibration: set the DAC12CALON bit to initiate calibration, then poll the bit until it reads zero, which signifies that calibration is complete.

  DAC12_0CTL |= DAC12CALON;          // Initiate DAC calibration (DAC12CALON = 0x0200 in msp430.h)
  while (DAC12_0CTL & DAC12CALON) {  // Hardware clears the bit when calibration is complete
  }
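
For reference, here is a minimal sketch of how the busy-wait could service the watchdog so the ~11 ms calibration does not trip a 4 ms WDT period. This assumes the standard msp430.h symbols (WDTPW, WDTCNTCL); it is a sketch, not the code I actually ran:

  #include <msp430.h>

  void dac12_calibrate(void)
  {
      DAC12_0CTL |= DAC12CALON;          // Initiate DAC calibration
      while (DAC12_0CTL & DAC12CALON) {
          // Clear the WDT counter. OR in your own WDT clock/interval bits
          // here as well, since this write also reselects the configuration.
          WDTCTL = WDTPW | WDTCNTCL;
      }
  }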


  • Hi,

    regarding question 1
    - The output of the DAC is checked with an op-amp in comparator mode, and the result is compared against the target offset. If the output is off, a trim array is used to adjust the DAC's output amplifier. A binary search algorithm is used to find the trim value (a rough sketch follows).
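
    To illustrate the idea only (set_trim() and comparator_above_target() are hypothetical stand-ins for internal silicon; none of this is user-accessible), a binary search over an 8-bit trim value could look like this:

      #include <stdint.h>

      extern void set_trim(uint8_t trim);          /* hypothetical: load trim array value */
      extern int  comparator_above_target(void);   /* hypothetical: op-amp as comparator  */

      uint8_t calibrate_offset(void)
      {
          int lo = 0, hi = 255;                    /* search range of the trim value   */
          uint8_t trim = 0;
          while (lo <= hi) {                       /* fixed depth: ~8 steps for 8 bits */
              trim = (uint8_t)((lo + hi) / 2);
              set_trim(trim);
              if (comparator_above_target())
                  hi = trim - 1;                   /* output too high: trim downwards  */
              else
                  lo = trim + 1;                   /* output too low: trim upwards     */
          }
          return trim;                             /* closest trim found               */
      }

    A fixed-depth search like this would also explain why the calibration time is essentially constant from run to run.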

    regarding question 2
    - According to the developers, the calibration time should always be roughly the same, with only a small variance. So if you measure 11 ms, I expect every calibration on that device to take around 11 ms.
    - There can be device-to-device variance, since internal components such as resistors and capacitors can have slightly different values. This is normal process variance.
    - The phenomenon you describe is quite strange. Can you please disable the watchdog and measure the duration of the calibration several times (see the sketch below)? Is it always around 11 ms, or does it change?
    - You say it works half the time. Does it work every second time, or is it random?
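
    For example (a minimal sketch, assuming the Timer_A register names from the F4250 device header and ACLK = 32768 Hz, so one tick is about 30.5 us):

      #include <msp430.h>
      #include <stdint.h>

      uint16_t measure_calibration_ticks(void)
      {
          WDTCTL = WDTPW | WDTHOLD;            // Stop the watchdog for this test
          TACTL  = TASSEL_1 | TACLR;           // Timer_A sourced from ACLK, counter cleared
          TACTL |= MC_2;                       // Start the timer in continuous mode
          DAC12_0CTL |= DAC12CALON;            // Start DAC calibration
          while (DAC12_0CTL & DAC12CALON);     // Wait until hardware clears the bit
          TACTL &= ~MC_3;                      // Stop the timer (clear the MC bits)
          return TAR;                          // Elapsed ticks (11 ms is roughly 360)
      }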

    Best regards,
    Andre
  • Hi,

    do you have further questions regarding this topic? If not, please select "Resolved" for the post that solved your issue so this thread can be closed out. If you have a different question, please select "Ask a related question" or "Ask a new question".
    Thanks a lot!

    Best regards,
    Andre
