In our application we send ~1000-1200 bytes every ~290 ms.
On the device side we log and plot the times at which sl_Send() is called, and it is called consistently every 288-290 ms.
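For context, the device-side send path is essentially a loop around sl_Send(); the sketch below shows roughly what it does (simplified; the helpers get_time_ms() and build_payload() are placeholders, not our real code). The printed delta is the 288-290 ms period we plot.

/* Simplified sketch of the device-side send loop (helper names are placeholders). */
#include <stdio.h>
#include <unistd.h>
#include <ti/drivers/net/wifi/simplelink.h>

extern uint32_t get_time_ms(void);            /* placeholder: millisecond tick source */
extern uint16_t build_payload(uint8_t *buf);  /* placeholder: fills ~1000-1200 bytes  */

void send_loop(int16_t sock)
{
    uint8_t  buf[1200];
    uint32_t last = get_time_ms();

    for (;;)
    {
        uint16_t len = build_payload(buf);
        uint32_t now = get_time_ms();

        /* This delta is what we plot: it stays at 288-290 ms on the device side. */
        printf("delta since last sl_Send: %lu ms\r\n", (unsigned long)(now - last));
        last = now;

        sl_Send(sock, buf, (int16_t)len, 0);

        usleep(290 * 1000);  /* placeholder pacing; the real application is timer driven */
    }
}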
On the server side we also plot the arrival frequency of the incoming packets, and we see jitter every 10 seconds when the SCANNING INTERVAL is set to 10 s.
When the SCANNING INTERVAL is set to 30 s, the jitter happens every 30 s.
By jitter we mean that packets sometimes arrive with a 600 ms delay, and sometimes with a delay of up to 5000 ms (5 seconds).
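For reference, this is roughly how we set the scanning interval (hedged: we assume the SL_WLAN_SCAN_POLICY() macro and the interval-in-seconds argument of sl_WlanPolicySet() behave as described in the SDK documentation; please correct us if not).

#include <ti/drivers/net/wifi/simplelink.h>

/* Enable periodic scans with the given interval in seconds (10 or 30 in our tests).
 * SL_WLAN_SCAN_POLICY(Enable, ScanHiddenSsid) is the macro from wlan.h as we
 * understand it.
 */
static int32_t set_scan_interval(uint32_t interval_sec)
{
    return sl_WlanPolicySet(SL_WLAN_POLICY_SCAN,
                            SL_WLAN_SCAN_POLICY(1, 0),
                            (uint8_t *)&interval_sec,
                            sizeof(interval_sec));
}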
An added delay of roughly +320 ms on top of the ~280-290 ms period is not something we would worry about.
But the higher delays above 1000 ms do bother us, and we would like to understand why they happen and what causes them.
Note that this is not a delay caused by the internet or routing: the measurements were reproduced both over the internet and on a LAN, once with a notebook acting as the client (no delay observed) and once with the CC3235 acting as the client.
With any other client we see no jitter at all; with the CC3235 as the client the packet delays appear.
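The comparison is simply per-packet inter-arrival timestamps on the receiving side; below is a minimal POSIX sketch of what the measurement does (not our exact tool).

/* Minimal sketch of the server-side inter-arrival measurement (POSIX sockets). */
#include <stdio.h>
#include <sys/socket.h>
#include <time.h>

static double now_ms(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec * 1000.0 + ts.tv_nsec / 1.0e6;
}

void measure_arrivals(int sock)
{
    char   buf[2048];
    double last = now_ms();

    for (;;)
    {
        ssize_t n = recv(sock, buf, sizeof(buf), 0);
        if (n <= 0)
            break;

        double now = now_ms();
        /* With the notebook as client this stays near 290 ms;
         * with the CC3235 it periodically jumps to 600-5000 ms. */
        printf("%zd bytes, inter-arrival %.1f ms\n", n, now - last);
        last = now;
    }
}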
Questions:
- what other features could cause such a delay?
- how long does a scan take when it is enabled on all channels (2.4 GHz / 5 GHz)?
- can we disable scanning and only enable it when the RSSI of the connected SSID is low? (see the sketch after this list)
- we plan to use the roaming feature (there is a separate issue for that; it does not work at the moment): what triggers roaming? The scan results, or the RSSI of a certain type of packet? Or does roaming itself trigger a scan?
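For the third question, what we have in mind is roughly the sketch below. This is hedged: we assume that sl_WlanRxStatGet() reflects the RSSI of the current connection (via the AvarageMgMntRssi field, spelled as in wlan.h) and that SL_WLAN_SCAN_POLICY(0, 0) disables the periodic scan; please correct us if those assumptions are wrong.

/* Sketch for question 3: keep periodic scans off and only enable them when the
 * RSSI of the current connection drops below a threshold (the threshold value
 * and the function name are ours, purely illustrative).
 */
#include <ti/drivers/net/wifi/simplelink.h>

#define RSSI_SCAN_THRESHOLD_DBM  (-75)
#define SCAN_INTERVAL_SEC        (10U)

void update_scan_policy(void)
{
    SlWlanGetRxStatResponse_t rxStat;
    uint32_t interval = SCAN_INTERVAL_SEC;

    if (sl_WlanRxStatGet(&rxStat, 0) < 0)
    {
        return; /* leave the current policy untouched on error */
    }

    if (rxStat.AvarageMgMntRssi < RSSI_SCAN_THRESHOLD_DBM)
    {
        /* Link is weak: allow periodic scans so a better AP can be found. */
        sl_WlanPolicySet(SL_WLAN_POLICY_SCAN, SL_WLAN_SCAN_POLICY(1, 0),
                         (uint8_t *)&interval, sizeof(interval));
    }
    else
    {
        /* Link is good: keep scanning disabled to avoid the send jitter. */
        sl_WlanPolicySet(SL_WLAN_POLICY_SCAN, SL_WLAN_SCAN_POLICY(0, 0),
                         NULL, 0);
    }
}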
Attached plots (X axis does not reflect time):
- scan interval set to 10 s and 30 s
- random jitters of 1600-1800 ms
- random jitter of 5000 ms