Tool/software: Code Composer Studio
Hi,
I am trying to measure the latency from when sensor data is generated to when it is transmitted and received by the collector. For this purpose, I have implemented an interrupt (in the board_key.h and board_key.c files) at the sensor that triggers the sensor data transmission event SENSOR_READING_TIMEOUT_EVT, so I can transmit at will regardless of the timeout. I am collecting timestamps at two points: when the ISR services the interrupt, and when the sensor actually transmits the packet (indicated externally by the green LED on pin DIO7). Using a scope, I measure the time from my trigger signal (which schedules a transmission at the sensor) to the actual transmission. I am getting a difference of about 100 ms between the delay calculated from the timestamps and the value read on the scope.
To get the timestamps I am using ICall_getTicks(), but I have also tried Clock_getTicks() and the result is the same. Any idea why this is happening?
Second, what is the difference between these two functions for getting timestamps?
Third, is the RTC running all the time, or is it frozen while the sensor is sleeping?