Hi,
I'm using the CC2650 SensorTag to acquire movement data at 100 Hz.
To achieve this, I modified the following firmware settings, changing them to 10 ms:
In st_util.h, SENSOR_MIN_UPDATE_PERIOD
In sensortag_mov.c, SENSOR_DEFAULT_PERIOD
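For reference, the changes amount to roughly the following (the macro names come from the TI SensorTag firmware; the 10 ms value is my modification, and the original defaults may differ between firmware versions):

```c
/* st_util.h — minimum notification/update period the sensor profile accepts.
 * TI's stock firmware uses a larger value; I lowered it to 10 ms. */
#define SENSOR_MIN_UPDATE_PERIOD   10  /* ms */

/* sensortag_mov.c — default sampling period of the movement task.
 * Also lowered to 10 ms to get a nominal 100 Hz rate. */
#define SENSOR_DEFAULT_PERIOD      10  /* ms */
```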
In parallel, I developed an Android app to acquire the data, based on the SensorTag app.
I ran some tests and found strange behavior in the sampling frequency: it normally acquires at 100 Hz, but sometimes the frequency drops to 30 Hz.
After this, I modified my app to start/stop an acquisition every minute for one hour, in order to assess the frequency behavior of each acquisition.
Attached is a figure with the mean acquisition frequency over the successive acquisitions, in which you can see how the frequency falls to 30 Hz and at some point rises back to 100 Hz.
After seeing this, I used the Packet Sniffer to check whether there were any differences in the transmission itself, such as extra packets. There are none; the only difference, shown in the following figure, is the time difference between successive data-carrying packets.
In the figure, you can see the Acquisition Time Difference (from the CC2650 timestamps) and the Transmission Time Difference (from the Sniffer timestamps), both when the mean sampling frequency is 100 Hz and when it is 55 Hz (in the latter case, you can see how around sample 25 the frequency drops to 30 Hz, in yellow).
I have considered factors such as the battery charge, the timestamps (they were originally added by the app and now come from the SensorTag itself), and the app itself. But I can't find an explanation for this behavior, nor a solution. I find it very strange, and I'm now guessing it is a problem of the device itself or of the data transmission.
Any ideas on why this is happening? Or any suggestions on how to fix this?
Thanks in advance,
Alejandra