CCS/MSP430G2553: Log timing varies between controllers of the same series

Part Number: MSP430G2553


Tool/software: Code Composer Studio

Hi,

I am using multiple MSP430G2553 controllers (with the DCO as the clock source) to log a file to an SD card every 15 minutes. After a week of logging, the log timestamps vary between the different SD cards. Why do identical controllers log at different rates even though the DCO is a stable clock source?

Is there any statistical model that can be used to correct for these timing errors? Any help will be appreciated.

Regards,

  • The calibrated DCO for the G2553 is specified in the 1% range (assuming constant voltage/temperature). That works out to more than an hour per week of drift (1% of a 168-hour week is about 1.7 hours). This is pretty good for an MCU, but poor for a real-time clock.

    If you can, add a 32kHz watch crystal to your design. Even cheap ones are specified in the 10-20ppm range (0.001-0.002%), which is more like 10 seconds/week. The G2553 supplies a few selectable internal crystal load capacitances, so if you choose your crystal carefully you may not even need to add external capacitors.
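
    For concreteness, here is a minimal bring-up sketch, assuming a 32.768 kHz watch crystal on XIN/XOUT and the CCS msp430.h header. The XCAP_3 (~12.5 pF) choice and the function name are illustrative only; pick the XCAP setting that matches your crystal's specified load.

    ```c
    #include <msp430.h>

    /* Enable the internal load caps for LFXT1 and wait for the oscillator
     * fault to clear before trusting ACLK. */
    static void aclk_from_watch_crystal(void)
    {
        BCSCTL3 |= XCAP_3;                 /* ~12.5 pF internal load capacitors */

        do {
            IFG1 &= ~OFIFG;                /* clear the oscillator fault flag   */
            __delay_cycles(50000);         /* give the crystal time to start    */
        } while (IFG1 & OFIFG);            /* repeat while the fault persists   */

        /* ACLK now runs from the crystal; clock a timer from ACLK to keep
         * wall-clock time instead of counting DCO cycles. */
    }
    ```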

  • The board is already in production without a crystal. The lowest calibrated DCO value on the MSP430G2553 is 1.1 MHz, so to reduce the current further I used the lowest DCO frequency (by setting the RSEL and DCO register fields to 0x00). But the time still drifts by about 12 hours over 7 days between boards using MSP430G2553 controllers. Is there anything that can be done to reduce this drift? The temperature is not stable (it changes from 28 to 34 °C) and I have not used an LDO, so the supply voltage varies from 3.8 V down to 3 V.
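
    For reference, this is roughly what that setting looks like in register terms (assuming CCS and msp430.h; the function name is only illustrative). Note that it is the uncalibrated bottom of the DCO range:

    ```c
    #include <msp430.h>

    /* Lowest DCO setting: RSEL = 0, DCO = 0, MOD = 0 (uncalibrated, ~100 kHz). */
    static void dco_set_minimum(void)
    {
        DCOCTL  = 0x00;                               /* DCOx = 0, MODx = 0 */
        BCSCTL1 &= ~(RSEL3 | RSEL2 | RSEL1 | RSEL0);  /* RSEL = 0           */
    }
    ```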

    Regards,
  • You will probably have to write a clock correction routine that you calibrate for each unit. You need a regression of all the sources of drift - hopefully you can measure the temperature (the G2553's ADC10 includes an internal temperature sensor, much like the MSP432) - and adjust the clock on a periodic basis by adding/subtracting seconds.

    For example, if you determine that the clock loses 2 seconds per hour at the cold temperature and you find the unit has been at that temperature for an hour, add 2 seconds to the clock.
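
    A sketch of that idea, with heavy assumptions: K0_PPM/K1_PPM are hypothetical per-unit values you would fit from your own drift measurements, rtc_seconds stands in for whatever software clock the logger already keeps, and the internal temperature sensor is read on ADC10 channel 10.

    ```c
    #include <msp430.h>
    #include <stdint.h>

    /* Hypothetical per-unit fit: drift in ppm = K0_PPM + K1_PPM * (T - 30 C),
     * where negative ppm means this unit's clock runs slow. */
    static const int32_t K0_PPM = -35000L;   /* example: ~3.5% slow at 30 C      */
    static const int32_t K1_PPM = 1000L;     /* example: ~0.1% per degree C      */

    static int32_t accum_ppm_s;              /* accumulated error, ppm-seconds   */
    extern volatile uint32_t rtc_seconds;    /* software clock kept elsewhere    */

    /* Read the internal temperature sensor (raw ADC10 counts); converting the
     * counts to degrees C uses the usual datasheet formula. */
    static uint16_t read_temp_raw(void)
    {
        ADC10CTL0 = SREF_1 | ADC10SHT_3 | REFON | ADC10ON;
        ADC10CTL1 = INCH_10 | ADC10DIV_3;
        __delay_cycles(1000);                /* let the 1.5 V reference settle   */
        ADC10CTL0 |= ENC | ADC10SC;          /* start the conversion             */
        while (ADC10CTL1 & ADC10BUSY) { }
        return ADC10MEM;
    }

    /* Call once per logging interval (e.g. interval_s = 900 for 15 minutes). */
    static void correct_clock(int16_t temp_degc, uint16_t interval_s)
    {
        int32_t ppm = K0_PPM + K1_PPM * (int32_t)(temp_degc - 30);
        accum_ppm_s += ppm * (int32_t)interval_s;

        while (accum_ppm_s >= 1000000L) {    /* clock running fast: drop a second */
            rtc_seconds--;
            accum_ppm_s -= 1000000L;
        }
        while (accum_ppm_s <= -1000000L) {   /* clock running slow: add a second  */
            rtc_seconds++;
            accum_ppm_s += 1000000L;
        }
    }
    ```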
  • That's actually not a huge temperature/voltage swing, though 3.8V is out of spec (the datasheet maximum supply is 3.6V). That said:

    > by setting RSEL and DCO register to 0x00

    This takes you quite far from any calibration. Based on the "DCO Frequency" table (slas735j p. 29) this setting gives you 100kHz +/-40kHz, i.e. +/- 40%.

    Yes, TI only supplies 4x calibration constants. But I expect you can compute your own, for e.g. 100kHz, if you have a spare pin. TI example msp430g2xx3_dco_flashcal gives the general idea. It assumes a crystal, but if you feed a 32kHz signal from e.g. a signal generator into a capture (TA0.x) pin, with a few small changes you should be able to use its software FLL to generate constants for your chosen speed.

    This would be an extra manufacturing step (once per board), but wouldn't require any hardware changes.

    [Disclaimer: I have used this program, but not for "unusual" clock speeds. I have written a real-time FLL for a different context, so I'm pretty sure I know what the program is doing.]
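
    For what it's worth, here is a sketch of how such a per-board trim step could look. It adapts the idea rather than copying the TI example: Timer_A0 is clocked from the 32768 Hz reference on P1.0/TA0CLK instead of capturing it, because at a ~100 kHz MCLK the CPU would struggle to poll captures of an undivided 32 kHz signal. The target frequency, tolerance, and function names are assumptions, and the watchdog is assumed to be stopped.

    ```c
    #include <msp430.h>
    #include <stdint.h>

    #define REF_HZ     32768L                    /* external reference frequency  */
    #define TARGET_HZ  100000UL                  /* DCO frequency to calibrate    */

    /* Count reference edges while the CPU burns TARGET_HZ DCO cycles.
     * If the DCO is exactly at TARGET_HZ the gate lasts 1 s and the timer reads
     * ~32768; fewer counts mean the DCO is fast, more counts mean it is slow. */
    static uint16_t ref_counts_per_gate(void)
    {
        P1DIR &= ~BIT0;
        P1SEL |= BIT0;                           /* P1.0 = TA0CLK input           */

        TA0CTL = TASSEL_0 | MC_2 | TACLR;        /* clock from TA0CLK, continuous */
        __delay_cycles(TARGET_HZ);               /* gate of TARGET_HZ MCLK cycles */
        TA0CTL = MC_0;                           /* stop the timer                */
        return TA0R;
    }

    /* Step DCOCTL (and RSEL when it wraps) until the gate reads close to REF_HZ.
     * The tolerance is kept wider than one DCO modulation step so the loop
     * always terminates. */
    static void trim_dco_to_target(void)
    {
        for (;;) {
            int32_t err = (int32_t)ref_counts_per_gate() - REF_HZ;

            if (err >= -64 && err <= 64)         /* within ~0.2% of target        */
                break;
            if (err < 0) {                       /* DCO fast: slow it down        */
                DCOCTL--;
                if (DCOCTL == 0xFF && (BCSCTL1 & 0x0F))
                    BCSCTL1--;                   /* drop to the next RSEL range   */
            } else {                             /* DCO slow: speed it up         */
                DCOCTL++;
                if (DCOCTL == 0x00 && (BCSCTL1 & 0x0F) != 0x0F)
                    BCSCTL1++;                   /* move up one RSEL range        */
            }
        }
        /* Store the resulting DCOCTL/BCSCTL1 pair in information flash and
         * reload it at every power-up, just like the TI-supplied constants. */
    }
    ```

    As with the flash-calibration example, this would run once per board on the production line, and the saved DCOCTL/BCSCTL1 pair gets reloaded at every power-up.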
