RTOS/LAUNCHXL-CC1310: High clock drift?

Part Number: LAUNCHXL-CC1310
Other Parts Discussed in Thread: CC1310

Tool/software: TI-RTOS

Hi,

I'm characterising the clock drift on a bunch of CC1310 Launchpads.

The crystal specified in the LaunchXL-CC1310 schematic is rated to +/-30ppm, yet I'm measuring around +110ppm (+/-20ppm or so) on all the boards I'm testing by comparing Clock_getTicks against my reference clock.
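
For reference, the measurement boils down to roughly the following sketch (illustrative only - getReferenceTimeUs() is a stand-in for my NTP-disciplined reference, not a real API):

    #include <xdc/std.h>
    #include <stdint.h>
    #include <ti/sysbios/knl/Clock.h>

    extern uint64_t getReferenceTimeUs(void);   /* hypothetical reference-clock read */

    void measureDrift(void)
    {
        uint64_t ref0   = getReferenceTimeUs();
        uint32_t ticks0 = Clock_getTicks();

        /* ... let the board run (and enter standby) for 12+ hours ... */

        uint64_t refUs   = getReferenceTimeUs() - ref0;
        uint64_t localUs = (uint64_t)(Clock_getTicks() - ticks0) * Clock_tickPeriod;

        /* Positive ppm = the CC1310 clock ran fast relative to the reference */
        double ppm = ((double)localUs - (double)refUs) / (double)refUs * 1.0e6;
        (void)ppm;   /* reported over the UART console in the real test */
    }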

This is *after* enabling temperature compensation, before which I was seeing more like +215ppm +/-60ppm.

My measurement cycles run for at least 12 hours, since short-term drift measurements are too imprecise to be useful.

If I force the board to stay awake running on the HF_XOSC (e.g. by starting a never-ending radio command), the drift is only +/-15ppm or so.
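
(For anyone reproducing this: the never-ending radio command is just the simplest way I found to keep the chip awake. Standby can also be blocked directly via the Power driver, as sketched below, although that alone does not force the system onto the HF crystal the way an active RF command does.)

    #include <stdbool.h>
    #include <ti/drivers/Power.h>
    #include <ti/drivers/power/PowerCC26XX.h>

    /* Keep the device out of standby for the duration of a test run */
    void blockStandby(bool block)
    {
        if (block) {
            Power_setConstraint(PowerCC26XX_SB_DISALLOW);
        } else {
            Power_releaseConstraint(PowerCC26XX_SB_DISALLOW);
        }
    }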

My reference clock source is synchronised to various atomic clocks via NTP and has been well characterised and corrected for drift over several years, so I don't expect a lot of drift from this source - certainly not more than 8 seconds per day (+110ppm works out to roughly 9.5 seconds per day)! And reference drift also doesn't explain the excellent readings when the LaunchPad runs on the HF_XOSC only.

I only see the high drift when I allow the board to enter standby, presumably because it is then keeping time from the LF oscillator, since that's what startup_files/ccfg.c lists as the default - #define SET_CCFG_MODE_CONF_SCLK_LF_OPTION               0x2        // LF XOSC
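
(For context, the neighbouring options in the stock cc13x0 ccfg.c are, if I remember them correctly:)

    //#define SET_CCFG_MODE_CONF_SCLK_LF_OPTION             0x0        // LF clock derived from HF XOSC
    //#define SET_CCFG_MODE_CONF_SCLK_LF_OPTION             0x1        // External LF clock signal
    #define SET_CCFG_MODE_CONF_SCLK_LF_OPTION               0x2        // LF XOSC (the default)
    //#define SET_CCFG_MODE_CONF_SCLK_LF_OPTION             0x3        // LF RCOSC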

I have tried trimming this drift using SET_CCFG_EXT_LF_CLK_RTC_INCREMENT - and confirmed that my changes are present in Debug/ccfg.obj and the final output ".out" file - to no avail.

Even with RTC increments up to 2000ppm (0.2%) away from nominal (e.g. 0x7FBE76, i.e. an assumed 32833.67Hz, which should make the clock run noticeably slower), the board initially appears to take the setting but quickly returns to +100ppm or so.
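
(For the numbers above: as far as I can tell the RTC effectively assumes f_LF = 2^38 / SUBSECINC, with the nominal 0x800000 corresponding to exactly 32768Hz. A hypothetical helper for computing trim values:)

    #include <stdint.h>

    /* Assumed relation: SUBSECINC = 2^38 / f_LF  (0x800000 <-> 32768 Hz) */
    static uint32_t subsecIncForFreq(double fLfHz)
    {
        return (uint32_t)((double)(1ULL << 38) / fLfHz + 0.5);
    }

    /* subsecIncForFreq(32768.0)  -> 0x800000 (nominal)
     * subsecIncForFreq(32833.67) -> 0x7FBE76, i.e. the RTC assumes a ~2000ppm
     * faster crystal and therefore runs ~2000ppm slow on a true 32768Hz LF clock */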

Has anyone else attempted to characterise the clock drift on these boards, and seen or solved anything similar?

Tomorrow I'll try reading the RTC directly in an attempt to determine if the problem is actually LF crystal drift or an issue with saving/restoring Clock_getTicks over periods of standby, but today I'm out of time.
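
(For the record, the direct RTC read I have in mind is roughly the sketch below; include paths follow the cc13x0 device-header layout and may need adjusting per SDK.)

    #include <stdint.h>
    #include <ti/devices/cc13x0/inc/hw_types.h>
    #include <ti/devices/cc13x0/inc/hw_memmap.h>
    #include <ti/devices/cc13x0/inc/hw_aon_rtc.h>

    /* Raw RTC read: SEC counts whole seconds, SUBSEC is a binary fraction of a
     * second (2^-32 s per LSB). Re-read SEC to guard against a rollover between
     * the two register reads. */
    static uint64_t readRtcRaw(void)
    {
        uint32_t sec, subsec;
        do {
            sec    = HWREG(AON_RTC_BASE + AON_RTC_O_SEC);
            subsec = HWREG(AON_RTC_BASE + AON_RTC_O_SUBSEC);
        } while (sec != HWREG(AON_RTC_BASE + AON_RTC_O_SEC));
        return ((uint64_t)sec << 32) | subsec;
    }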

  • Hi,

    You are talking about the LF clock, correct? Could it be related to the System_flush() issue that you have observed?

    Since I am using a UART console (at 2MBaud) via SysCallback, and in fact don't even have CCS (or any other debugging software) attached to the debugger while running these tests, the SysMin/flush issue I identified in another thread is not a candidate culprit - unless the XDS110 does debugging work all by itself without any host software talking to it?

    I wonder if TI-RTOS compares the LF clock against the HF clock and automatically corrects the former - which would explain why my RTC_INCREMENT override only affects the first few minutes of running?

    If that's the case, the compensation seems to somehow aim high. What would be the easiest way to determine whether that is what's happening?
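
    One low-effort check might be to periodically log the AON_RTC SUBSECINC register and see whether something rewrites it behind my back - a sketch (register names from the cc13x0 device headers; include paths may differ per SDK):

    #include <stdint.h>
    #include <ti/devices/cc13x0/inc/hw_types.h>
    #include <ti/devices/cc13x0/inc/hw_memmap.h>
    #include <ti/devices/cc13x0/inc/hw_aon_rtc.h>

    /* Log this from a periodic task; nominal value is 0x800000 for 32768 Hz */
    static uint32_t readSubSecInc(void)
    {
        return HWREG(AON_RTC_BASE + AON_RTC_O_SUBSECINC);
    }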
  • Hi,

    Yes, the Power driver performs automatic calibration and may overwrite SUBSECINC, but only for the RC oscillators. The whole calibration procedure is unfortunately not documented at all.

    You can disable calibration with the power driver struct in the board file:

    const PowerCC26XX_Config PowerCC26XX_config = {
        .policyInitFxn      = NULL,
        .policyFxn          = &PowerCC26XX_standbyPolicy,  // Default standby policy
        .calibrateFxn       = &PowerCC26XX_calibrate,
        .enablePolicy       = true,
        .calibrateRCOSC_LF  = false,                       // Disable RCOSC_LF calibration
        .calibrateRCOSC_HF  = true,                        // Keep RCOSC_HF calibration
    };
    

    However, this does not apply when running on a LF crystal. I wonder what the root cause could be in your case. You say it happens when you switch between STANDBY and RUNNING mode. Does it occur even without having a debugger attached?

    Please note that SET_CCFG_EXT_LF_CLK_RTC_INCREMENT is ignored when using an external LF crystal. It only applies to an external clock signal. If you want to modify the AONRTC_SUBSECINC register, then you would have to do it in your application code:

    #include <ti/devices/cc13x0/driverlib/setup_rom.h>
    SetupSetAonRtcSubSecInc(VALUE);
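
    For example, a rough sketch tying this to a measured drift figure - assuming the usual SUBSECINC = 2^38 / f_LF relation (0x800000 = exactly 32768 Hz), and assuming the drift really does come from the LF crystal:

    #include <stdint.h>
    #include <ti/devices/cc13x0/driverlib/setup_rom.h>

    /* A clock measured to run e.g. +110ppm fast implies an effective LF frequency
     * ~110ppm above 32768 Hz, so program SUBSECINC for that frequency. */
    static void trimRtcForMeasuredDrift(double driftPpm)
    {
        double   fLf   = 32768.0 * (1.0 + driftPpm * 1.0e-6);
        uint32_t value = (uint32_t)((double)(1ULL << 38) / fLf + 0.5);
        SetupSetAonRtcSubSecInc(value);
    }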
  • The whole calibration procedure is unfortunately not documented at all.

    It seems I'm finding many documentation holes on my journey through TI's toolchain ;)

    Does it occur even without having a debugger attached?

    Yes, I rarely run under the debugger these days. Typically I only attach the debugger when something has crashed.

    If you want to modify the AONRTC_SUBSECINC register, then you would have to do it in your application code:

    Thanks, I'll try that.