Hello,
Question on performing bit/packet error rate tests on the CC1120. We are trying to perform a BER test while the CC1120 is mounted on our product, using the synchronous serial RX mode. We want to evaluate whether there is any sensitivity degradation when the CC1120 is on our product.
While we are able to successfully retrieve our packets, we are also getting garbage data between packet transmissions. In other words, the CC1120 continues to output data even when there is no packet and only noise coming in. I would have expected the CC1120 to know where the packet ended and to stop outputting data at that point.
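For reference, here is a minimal sketch of how we intend to compute the BER once the bits are captured: compare the received bitstream against the known transmitted pattern (e.g., a PN sequence) and count the differing bits. All names here are our own placeholders, not CC1120 API calls.

```c
#include <stdint.h>
#include <stddef.h>

/* Count bit errors between the known transmitted pattern and the
 * received bits (both packed 8 bits per byte). */
static size_t count_bit_errors(const uint8_t *tx_bits,
                               const uint8_t *rx_bits,
                               size_t num_bytes)
{
    size_t errors = 0;
    for (size_t i = 0; i < num_bytes; i++) {
        uint8_t diff = tx_bits[i] ^ rx_bits[i]; /* 1s mark differing bits */
        while (diff) {                          /* simple popcount */
            errors += diff & 1u;
            diff >>= 1;
        }
    }
    return errors;
}

/* BER = bit errors / total bits compared */
static double bit_error_rate(size_t errors, size_t num_bytes)
{
    return (double)errors / (double)(num_bytes * 8u);
}
```

The garbage data between packets is the problem: if we feed it into this comparison, it inflates the error count, which is why we are asking about gating the data below.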
My questions are:
- Are we supposed to receive garbage data in between packet transmissions while operating in synchronous serial mode?
- If yes, is the idea here to use a signal such as PKT_SYNC_RXTX to 'mask' the data (i.e., as a data-valid signal) so the software knows it is receiving a valid packet? (See the sketch after this list.)
- Is synchronous serial mode the best mode for performing a BER test?
- Is synchronous serial mode identical to FIFO mode, except that the data is output on GPIO pins?
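To make the masking question concrete, here is a sketch of what we have in mind on the MCU side, assuming PKT_SYNC_RXTX, the serial clock, and the serial data are each routed to MCU inputs. `gpio_read()`, the pin names, and the ISR hookup are hypothetical placeholders for whatever the MCU HAL provides; only the PKT_SYNC_RXTX signal name comes from the CC1120 documentation.

```c
#include <stdint.h>
#include <stdbool.h>
#include <stddef.h>

extern bool gpio_read(int pin);   /* hypothetical HAL call */

#define PIN_PKT_SYNC_RXTX  0      /* CC1120 GPIO: asserted from sync found to end of packet */
#define PIN_SERIAL_DATA    1      /* CC1120 GPIO: synchronous serial RX data */

static uint8_t rx_buf[512];
static size_t  rx_bit_count;

/* Called on each rising edge of the CC1120 serial clock. Bits are
 * stored only while PKT_SYNC_RXTX is high, so the noise between
 * packets is discarded instead of being treated as data. */
void serial_clock_isr(void)
{
    if (!gpio_read(PIN_PKT_SYNC_RXTX))
        return;                       /* no packet in progress: ignore bit */

    size_t byte = rx_bit_count >> 3;
    if (byte >= sizeof rx_buf)
        return;                       /* buffer full */

    rx_buf[byte] <<= 1;               /* pack bits MSB-first */
    rx_buf[byte] |= gpio_read(PIN_SERIAL_DATA) ? 1u : 0u;
    rx_bit_count++;
}
```

Is this gating approach the intended way to use PKT_SYNC_RXTX in synchronous serial mode, or is there a recommended alternative?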