Hello,
The datasheet states on page 522 that:
The RTC value can be read
by first reading the HIBRTCC register, reading the RTCSSC field in the HIBRTCSS register, and
then rereading the HIBRTCC register. If the two values for HIBRTCC are equal, the read is valid.
Why is this? What happens to the subsecond register when HIBRTCC increments by one?
I wrote two functions that I can use interchangeably:
uint32_t get_rtc_posix_sub(void)
{
    // Simple version: read seconds, then subseconds, with no consistency check.
    return (HibernateRTCGet() << 15) | (HWREG(HIB_RTCSS) & HIB_RTCSS_RTCSSC_M);
}

// Notice: this recursive variant does it as stated in the datasheet.
uint32_t get_rtc_posix_sub_formal(void)
{
    uint32_t posixTime  = HibernateRTCGet();
    uint32_t subseconds = HWREG(HIB_RTCSS) & HIB_RTCSS_RTCSSC_M;

    if (posixTime == HibernateRTCGet())
    {
        // The seconds counter did not change between the two reads,
        // so the subsecond value belongs to this second.
        return (posixTime << 15) | subseconds;
    }
    else
    {
        // The seconds counter rolled over mid-read; try again.
        return get_rtc_posix_sub_formal();
    }
}
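For comparison, here is the same consistency check written as a loop instead of recursion. This is just a sketch of how I understand the datasheet procedure; I'm assuming HWREG(HIB_RTCC) from inc/hw_hibernate.h reads the same register HibernateRTCGet() does, and the function name is just something I picked.

#include <stdint.h>
#include "inc/hw_types.h"      // HWREG()
#include "inc/hw_hibernate.h"  // HIB_RTCC, HIB_RTCSS, HIB_RTCSS_RTCSSC_M

// Repeat until two consecutive reads of the seconds counter agree,
// so the subsecond value is known to belong to that second.
uint32_t get_rtc_posix_sub_loop(void)
{
    uint32_t seconds, subseconds;

    do
    {
        seconds    = HWREG(HIB_RTCC);
        subseconds = HWREG(HIB_RTCSS) & HIB_RTCSS_RTCSSC_M;
    } while (seconds != HWREG(HIB_RTCC));

    return (seconds << 15) | subseconds;
}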
Basically, after getting my RTC to work, all I want is the ability to timestamp things with posixTime plus 15 bits of subseconds. Since the RTC provided does not have a calendar, it only counts seconds of the day, from 0 to 3600 x 24.
Since this will never exceed 17 bits, I shift the seconds counter left by 15 bits and put the subsecond counter into the lower 15 bits, which gives me a combined seconds + subseconds timestamp in a single 32-bit value.
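To show what I mean by the packing: 86399 fits in 17 bits, so 17 + 15 = 32 bits fits exactly in a uint32_t. Here is a small helper (the name unpack_timestamp is just made up) that splits the value back apart. I'm assuming each subsecond count is 1/32768 s, since the counter runs from the 32.768 kHz oscillator, if I read the datasheet right.

#include <stdint.h>

// Split a packed timestamp back into whole seconds and milliseconds
// (assuming 1/32768 s per subsecond count).
void unpack_timestamp(uint32_t stamp, uint32_t *seconds, uint32_t *millis)
{
    uint32_t sub = stamp & 0x7FFF;      // lower 15 bits: subsecond count
    *seconds = stamp >> 15;             // upper bits: seconds of the day (0..86399)
    *millis  = (sub * 1000u) / 32768u;  // 0..999
}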
Any ideas are greatly appreciated. Are there any previous examples of work done for a similar purpose?
Best Regards,
C.