Hello,
I am trying out the Transceiver app on the CC3200MODLAUNCHXL.
I pumped out 10000 packets using rate index 13, i.e. 54 Mbps (RATE_54M).
From the code we can see that each packet is 320 bytes long.
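For reference, this is a minimal sketch of what I mean by "pumping" the packets, assuming the structure of the stock transceiver example (SimpleLink host-driver names; the channel number here is just a placeholder):

#include "simplelink.h"

#define TX_CHANNEL   6        /* placeholder: whichever channel the app is configured for */
#define NUM_PACKETS  10000
#define PACKET_LEN   320      /* payload length taken from the example code */

static _u8 g_Frame[PACKET_LEN];

void TxLoop(void)
{
    /* Open a raw transceiver-mode socket on the chosen channel */
    _i16 sock = sl_Socket(SL_AF_RF, SL_SOCK_RAW, TX_CHANNEL);
    _i32 i;

    for (i = 0; i < NUM_PACKETS; i++)
    {
        /* Rate index 13 == RATE_54M; TX power 0, long preamble */
        sl_Send(sock, g_Frame, PACKET_LEN,
                SL_RAW_RF_TX_PARAMS(TX_CHANNEL, RATE_54M, 0, 1));
    }
    sl_Close(sock);
}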
On the receiver side, however, each received packet is reported as 1472 bytes long.
Q1. Why this discrepancy between the 320-byte payload that is sent and the 1472-byte length that is received?
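To clarify what I am measuring: the 1472 figure is simply the length returned by the receive call, roughly as sketched below (this assumes the receiver is a second board in transceiver mode; if your receiver is an ordinary UDP listener, the same question applies to the value returned by its recv call):

/* Hypothetical receive side; 'len' is the value I am quoting as 1472 */
_u8  rxBuf[1536];
_i16 rxSock = sl_Socket(SL_AF_RF, SL_SOCK_RAW, TX_CHANNEL);
_i16 len    = sl_Recv(rxSock, rxBuf, sizeof(rxBuf), 0);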
Observations for the next questions:
Sending out the 10000 packets takes about 10 seconds.
If we assume that each packet really is 1472 bytes, then at 54 Mbps each packet should spend about (1472 * 8) / (54 * 10^6) = 0.00021807 s ≈ 0.22 ms on the air,
yet in reality each packet takes about 1 ms (= 10 s / 10000).
Put differently, we are getting a throughput of (1472 * 8 * 10000) / 10 s = 11.776 Mbps; and if each packet is actually 320 bytes, the figure drops to (320 * 8 * 10000) / 10 s = 2.56 Mbps.
So instead of 54 Mbps we are getting only 11.776 Mbps at best.
Another way to look at it: the non-transmitting portion of each iteration of the transmit loop takes about 0.8 ms (= 1 ms - 0.22 ms); and if the packet is actually 320 bytes (airtime ≈ 0.05 ms), the per-iteration overhead is about 0.95 ms. The small program below reproduces these numbers.
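For anyone who wants to double-check the arithmetic, here is a throwaway host-side C program (plain C, no SDK needed):

#include <stdio.h>

int main(void)
{
    double rate_bps = 54e6;     /* nominal PHY rate: 54 Mbps            */
    double total_s  = 10.0;     /* measured: 10000 packets in ~10 s     */
    int    npkts    = 10000;

    double t_1472 = 1472 * 8 / rate_bps;   /* on-air time, 1472 B frame */
    double t_320  =  320 * 8 / rate_bps;   /* on-air time, 320 B frame  */
    double t_iter = total_s / npkts;       /* measured time per loop    */

    printf("airtime 1472 B   : %.3f ms\n", t_1472 * 1e3);  /* ~0.218 */
    printf("airtime  320 B   : %.3f ms\n", t_320  * 1e3);  /* ~0.047 */
    printf("per iteration    : %.3f ms\n", t_iter * 1e3);  /* ~1.000 */
    printf("throughput 1472 B: %.3f Mbps\n", 1472.0 * 8 * npkts / total_s / 1e6); /* 11.776 */
    printf("throughput  320 B: %.3f Mbps\n",  320.0 * 8 * npkts / total_s / 1e6); /*  2.560 */
    return 0;
}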
Q2. How does one actually achieve a throughput close to the 54 Mbps rate?
Q3. Is the execution of the code on the CC3200 really so slow that, excluding the time for the send operation itself, each iteration of the loop takes about 0.8 to 0.95 ms?