
RTC calibration for mass production

Other Parts Discussed in Thread: MSP430F5335, MSP430F1232

We are using the RTC_B of the MSP430F5335.

Per the user's guide, the RTC clock is calibrated by outputting RTCCLK and measuring it with a frequency counter.

Since the crystal used is the primary error factor, my question is how to do this for mass production. Or should we assume that a batch of crystals will lead to the same calibration result?

I did find a similar post, but it did not give an explicit answer:

http://e2e.ti.com/support/microcontrollers/msp430/f/166/t/134177.aspx
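
As a minimal sketch of that measurement setup: route the crystal clock to a pin so a frequency counter can read it. The snippet below outputs ACLK (sourced from XT1) rather than RTCCLK, which measures the same crystal; the XIN/XOUT and P1.0/ACLK pin assignments are assumptions to verify against the F5335 datasheet.

    #include <msp430.h>

    int main(void)
    {
        WDTCTL = WDTPW | WDTHOLD;          /* stop watchdog                     */

        P7SEL |= BIT0 | BIT1;              /* XIN/XOUT pins (assumed mapping)   */
        UCSCTL6 &= ~XT1OFF;                /* enable XT1                        */
        UCSCTL6 |= XCAP_3;                 /* internal load caps (board-dependent) */

        do {                               /* wait for XT1 fault to clear       */
            UCSCTL7 &= ~(XT1LFOFFG | DCOFFG);
            SFRIFG1 &= ~OFIFG;
        } while (SFRIFG1 & OFIFG);

        P1DIR |= BIT0;                     /* P1.0 as output                    */
        P1SEL |= BIT0;                     /* P1.0 = ACLK (assumed mapping)     */

        for (;;);                          /* counter reads 32768 Hz +/- ppm    */
    }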


  • qinghui tang said:
    or should we assume that a batch of crystals will lead to the same calibration result?

    You can't assume that. The only thing you can possibly rely on is the manufacturer's specification.

    qinghui tang said:
    Since the crystal used is the primary error factor, my question is how to do this for mass production,

    You need special calibration firmware and a connection to an external time reference:

    1st option: use the 1 pps pulse from a GPS receiver as the reference time. Count crystal clock periods during a 32 s reference period; since 32 s at 32768 Hz is about 2^20 counts, you can then calculate the correction with roughly 1 ppm precision.

    2nd option: use the 10 MHz output from a quality (GPS-synchronized) frequency source as the timer input clock. Then you can do the calibration in a much shorter time, around 1 s.
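
    A minimal sketch of the 1st option, assuming ACLK is already sourced from the 32768 Hz crystal and the GPS 1 pps signal drives a Timer_A capture input (TA0.CCI1A; the pin routing is board-specific):

        #include <msp430.h>
        #include <stdint.h>

        static volatile uint32_t aclk_cycles;   /* accumulated crystal cycles */
        static volatile uint8_t  pps_edges;     /* 1 pps edges seen so far    */
        static volatile uint16_t last_capture;

        int main(void)
        {
            WDTCTL = WDTPW | WDTHOLD;                /* stop watchdog           */

            /* Board-specific: route the 1 pps pin to TA0.CCI1A via PxSEL.     */
            TA0CCTL1 = CM_1 | CCIS_0 | SCS | CAP | CCIE; /* rising-edge capture */
            TA0CTL   = TASSEL_1 | MC_2 | TACLR;      /* ACLK, continuous mode   */

            __bis_SR_register(LPM3_bits | GIE);      /* sleep until 32 s done   */

            /* Ideal count for 32 s is 32 * 32768 = 1048576 (~2^20), so the    */
            /* deviation in counts is approximately the error in ppm.          */
            int32_t ppm_error = (int32_t)aclk_cycles - 1048576L;
            (void)ppm_error;                         /* convert and store       */
            for (;;);
        }

        #pragma vector = TIMER0_A1_VECTOR
        __interrupt void ta0_ccr1_isr(void)
        {
            switch (__even_in_range(TA0IV, 14)) {
            case 2:                                  /* CCR1 capture            */
                if (pps_edges > 0)                   /* skip the first edge     */
                    aclk_cycles += (uint16_t)(TA0CCR1 - last_capture);
                last_capture = TA0CCR1;
                if (++pps_edges >= 33)               /* 33 edges = 32 intervals */
                    __bic_SR_register_on_exit(LPM3_bits);
                break;
            default:
                break;
            }
        }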

  • Ilmars,

    Thanks for the quick answer.

    My understanding is that even within the manufacturer's spec, each individual crystal still has a different drift: some are 32768.5 Hz, some are 32767.4 Hz, etc., all within a certain +/- ppm range. Is this correct?

    The final product as shipped will not have a 1 pps/10 MHz clock reference with it, so I guess my steps are:

    1) use a reference time to calculate the clock drift;

    2) finalize the firmware with the appropriate math (the 5xxx series user's guide has enough detail; see the sketch after this post);

    3) use the same clock drift and firmware for all production units.

    Am I on the right track?

    Thanks
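
    As a sketch of the "appropriate math" in step 2: per the RTC_B offset-calibration description in the 5xxx user's guide (SLAU208), each RTCCAL step in RTCCTL2 adjusts the clock by roughly +4 ppm (RTCCALS set, speeds the RTC up) or -2 ppm (RTCCALS cleared, slows it down). The rounding and clamping below are our own choices; verify the step sizes against the guide.

        #include <msp430.h>
        #include <stdint.h>

        /* Convert a measured crystal error into an RTC_B RTCCTL2 value. */
        uint8_t ppm_to_rtccal(int16_t ppm_error)
        {
            uint16_t steps;

            if (ppm_error >= 0) {                /* crystal fast: slow RTC down */
                steps = (ppm_error + 1) / 2;     /* ~2 ppm per down step        */
                if (steps > 0x3F) steps = 0x3F;  /* RTCCAL is a 6-bit field     */
                return (uint8_t)steps;           /* RTCCALS = 0                 */
            } else {                             /* crystal slow: speed RTC up  */
                steps = (-ppm_error + 2) / 4;    /* ~4 ppm per up step          */
                if (steps > 0x3F) steps = 0x3F;
                return (uint8_t)(RTCCALS | steps);
            }
        }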

  • qinghui tang said:
    some are 32768.5 Hz, some are 32767.4 Hz, etc., all within a certain +/- ppm range. Is this correct?

    Yes, supposedly with a ppm error that does not exceed the manufacturer's specification.

    qinghui tang said:
    So I guess my steps are

    1) load the calibration firmware

    2) calibrate using a precision time source, and save the resulting calibration data in the nonvolatile information flash of each chip

    3) erase the flash except for the area containing the calibration data, then load the production firmware, which uses the calibration data during operation

    You have to do these steps for every single device you manufacture.
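
    A minimal sketch of steps 2 and 3, assuming the result is kept in information-memory segment InfoD (0x1800 on F5xx parts, chosen over InfoA to avoid the extra LOCKA handling); the struct layout and magic value are our own convention:

        #include <msp430.h>
        #include <stdint.h>

        #define CAL_MAGIC  0xCA1Bu
        #define INFO_D     ((uint16_t *)0x1800)   /* InfoD segment, F5xx */

        typedef struct {
            uint16_t magic;       /* CAL_MAGIC when data is valid        */
            int16_t  ppm_error;   /* measured crystal error              */
            uint16_t rtccal;      /* precomputed RTCCTL2 value (RTC_B)   */
        } cal_data_t;

        void cal_save(const cal_data_t *cal)
        {
            const uint16_t *src = (const uint16_t *)cal;
            uint16_t *dst = INFO_D;
            unsigned i;

            __disable_interrupt();         /* no interrupts during flash ops */
            FCTL3 = FWKEY;                 /* clear LOCK                     */
            FCTL1 = FWKEY | ERASE;         /* segment-erase mode             */
            *dst = 0;                      /* dummy write starts the erase   */
            FCTL1 = FWKEY | WRT;           /* word-write mode                */
            for (i = 0; i < sizeof(*cal) / 2; i++)
                dst[i] = src[i];
            FCTL1 = FWKEY;                 /* done writing                   */
            FCTL3 = FWKEY | LOCK;          /* re-lock the flash              */
            __enable_interrupt();
        }

        int cal_load(cal_data_t *cal)
        {
            const cal_data_t *stored = (const cal_data_t *)INFO_D;
            if (stored->magic != CAL_MAGIC)
                return 0;                  /* device never calibrated        */
            *cal = *stored;
            return 1;
        }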

  • Thanks.

    One more question.

    If our crystal is rated +/- 25 ppm, then the worst-case drift is

    24 * 60 * 60 * 25 * 0.000001 = 2.16 seconds per day.

    If 2.16 seconds a day is acceptable for our application, then we don't necessarily need to do calibration, right? Unless the drift of RTCCLK after going through the MSP430 is much larger than +/- 25 ppm.

  • Ilmars said:
    So I guess my steps are

    1) load the calibration firmware
    2) calibrate using a precision time source, and save the resulting calibration data in the nonvolatile information flash of each chip
    3) erase the flash except for the area containing the calibration data, then load the production firmware, which uses the calibration data during operation

    It is much easier if the calibration firmware is part of the application and runs only if no calibration information is found in nonvolatile storage (e.g. InfoA).
    During final test in manufacturing, a precision clock source is applied and the calibration takes place. The calibration firmware should contain reasonable limits so it can detect whether the calibration result is valid or not (e.g. no reference clock attached). That way there is no need to flash two firmwares. (See the sketch at the end of this post.)

    We too do calibration with two separate firmwares in one of our products, but only because even without the calibration code we have just a few bytes of flash left (it's an MSP430F1232). Having it built in, as in our other devices, makes production much easier. (In a few devices we can even trigger the calibration manually through commands.)
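
    A minimal sketch of this built-in flow, reusing cal_load()/cal_save() and ppm_to_rtccal() from the sketches above; measure_ppm_error() and the +/- 100 ppm sanity window are assumptions:

        #include <msp430.h>
        #include <stdint.h>

        /* From the earlier sketches (assumed available, incl. cal_data_t). */
        extern int     cal_load(cal_data_t *cal);
        extern void    cal_save(const cal_data_t *cal);
        extern uint8_t ppm_to_rtccal(int16_t ppm_error);
        extern int16_t measure_ppm_error(void);  /* e.g. the 1 pps counter */

        void rtc_init_with_cal(void)
        {
            cal_data_t cal;

            if (!cal_load(&cal)) {                  /* not yet calibrated    */
                int16_t ppm = measure_ppm_error();  /* needs reference clock */
                if (ppm <= -100 || ppm >= 100)      /* implausible result?   */
                    return;                         /* no ref clock: skip    */
                cal.magic     = CAL_MAGIC;
                cal.ppm_error = ppm;
                cal.rtccal    = ppm_to_rtccal(ppm);
                cal_save(&cal);
            }
            RTCCTL2 = (uint8_t)cal.rtccal;          /* apply RTC_B offset cal */
        }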

  • An alternative is manual calibration. For the RTC we use an external I2C RTC (because on older MSPs the internal RTC was not backed up by a separate supply), which outputs a 1 s pulse on a pin. We added an adjustable trimmer capacitor to the crystal, and for calibration we compare the 1 s clock signal against a precision 10 MHz timer/counter and detune the crystal manually. This works well to within +/- 2 ppm (less than 2 minutes per year).

  • qinghui tang said:
    If 2.16 seconds a day is acceptable for our application, then we don't necessarily need to do calibration, right?

    Indeed.

    qinghui tang said:
    Unless the drift of RTCCLK after going through the MSP430

    Do not forget the PCB traces, too; their stray capacitance also detunes the crystal. So basically you should run a clock-precision test on a statistically sufficient number of samples from the initial batch. If those tests pass, then decide. Afterwards, keep testing selected samples periodically.
