I'm having issues with the RTC drifting on my TM4C1294NCPDT project. I've read about the trim register and am trying to use that for correction.
My idea is to interrupt using the RTC match register once per second. Then, using Timestamp_get32(), I calculate the number of system-clock ticks that have elapsed since the last RTC second tick. That count should equal the frequency I'm driving the main clock at; if it differs, the HIBRTCT register can be adjusted to correct the RTC rate accordingly.
My problem is that no matter what value I write to the HIBRTCT register, I always get roughly 120,004,000 ticks between RTC interrupts (the system clock runs at 120 MHz). My understanding is that the RTC and the system clock run from two different oscillators, but my experiments would seem to indicate otherwise. I've double-checked that the system clock is configured to run from OSC0/OSC1 (16 MHz) and the RTC from XOSC0/XOSC1 (32.768 kHz). I've also confirmed that OSCDRV is set correctly for the crystal circuit on this project.
Is my theory valid? Can I use the system clock to adjust the RTC? Are there any other methods of calculating a value for HIBRTCT to correct for clock drift? (The system is not connected to a network, so no authoritative time source is available.)
Below is the temporary code I'm currently using for the HIB interrupt for my testing:
static Bits32 delta = 0xFFFFFFFF;           /* sentinel: no measurement yet */
extern Swi_Handle clockDriftSwi;

Void HIBHwiFxn(UArg a0)
{
    static Bits32 lastStamp = 0xFFFFFFFF;
    Bits32 currentStamp = Timestamp_get32();

    /* Unsigned subtraction handles Timestamp_get32() wraparound.
     * The very first delta after boot is garbage and gets discarded. */
    delta = currentStamp - lastStamp;
    lastStamp = currentStamp;

    Swi_post(clockDriftSwi);
    HibernateIntClear(HIBERNATE_INT_RTC_MATCH_0);
}

Void clockDriftSwiFxn(UArg unused1, UArg unused2)
{
    System_printf("%u\n", delta);           /* %u: Bits32 is unsigned */
    delta = 0xFFFFFFFF;
}