We are running BER tests on a product that uses the CC1200. The symbol rate is 4800 baud, the deviation is about 2.4 kHz, the modulation is 4-GFSK, and the channel filter is about 19 kHz wide.
We have noticed that turning frequency offset correction on results in a higher BER than leaving it off. The frequency error between transmitter and receiver is less than 100 Hz, so there is no real need for correction in this test setup. In the final product, however, we do need correction, since frequency errors can be as high as 2-3 kHz and the CC1200 only tolerates roughly 200-300 Hz of error when correction is off.
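To put rough numbers on why correction is needed in the product, here is a back-of-the-envelope sketch. It assumes the 2.4 kHz figure is the outer (peak) deviation and that the inner 4-FSK symbols sit at one third of that; if 2.4 kHz is actually the inner deviation, the numbers scale accordingly, so treat this only as an illustration of the margins involved:

```c
/* Rough 4-(G)FSK frequency-margin estimate for the setup described above.
 * ASSUMPTION: 2.4 kHz is the outer (peak) deviation, so the four symbol
 * frequencies sit at roughly -f_dev, -f_dev/3, +f_dev/3, +f_dev relative
 * to the carrier. Verify against your own SmartRF Studio settings. */
#include <stdio.h>

int main(void)
{
    const double symbol_rate_hz = 4800.0;
    const double outer_dev_hz   = 2400.0;   /* assumed outer deviation   */
    const double rx_filter_hz   = 19000.0;  /* channel filter bandwidth  */
    const double max_offset_hz  = 3000.0;   /* worst-case product offset */

    /* Spacing between adjacent 4-FSK tones and the resulting decision margin
     * seen by the demodulator when a static frequency offset is present.    */
    double tone_spacing    = 2.0 * outer_dev_hz / 3.0;   /* ~1.6 kHz */
    double decision_margin = tone_spacing / 2.0;         /* ~0.8 kHz */

    /* Rough occupied bandwidth and remaining channel filter margin. */
    double occupied_bw   = symbol_rate_hz + 2.0 * outer_dev_hz;   /* ~9.6 kHz */
    double filter_margin = (rx_filter_hz - occupied_bw) / 2.0;    /* ~4.7 kHz */

    printf("adjacent tone spacing   : %.0f Hz\n", tone_spacing);
    printf("demod decision margin   : %.0f Hz\n", decision_margin);
    printf("channel filter margin   : %.0f Hz\n", filter_margin);
    printf("worst-case offset       : %.0f Hz\n", max_offset_hz);
    return 0;
}
```

Under these assumptions a 2-3 kHz offset is several times the decision margin between adjacent tones, so some form of offset correction is clearly required in the product, even though the channel filter itself would still pass the signal.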
In the tests we transmit 100-byte packets with random content, receiving 1000 packets per signal level while the signal level is swept from -60 dBm down to -120 dBm.
With frequency offset correction turned off, we see no bit errors at signal levels above about -90 dBm, i.e. the BER is at or below 10^-6 (10^-6 is the current floor of our BER graph, given the amount of data sent per level). When we turn the correction feature on, bit errors start appearing and the BER is limited to around 10^-5. I know this is still a good level for most applications, but our reference devices are some of our earlier products with different hardware, and they achieve error-free performance at these signal levels.
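For clarity, the 10^-6 floor mentioned above is simply the resolution of the measurement given how many bits we send per signal level:

```c
/* Sketch of how the ~1e-6 BER floor in our plots follows from the amount of
 * data sent per signal level (1000 packets x 100 bytes of random payload). */
#include <stdio.h>

int main(void)
{
    const long packets_per_level = 1000;
    const long bits_per_packet   = 100 * 8;
    long total_bits = packets_per_level * bits_per_packet;   /* 800,000 bits */

    /* The smallest non-zero BER we can resolve is a single bit error
     * over the whole run at that signal level.                        */
    double ber_floor = 1.0 / (double)total_bits;              /* 1.25e-6 */

    printf("bits per signal level : %ld\n", total_bits);
    printf("resolvable BER floor  : %.2e\n", ber_floor);
    return 0;
}
```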
Why does the BER performance degrade when correction is turned on?
Our frequency offset correction setting is “FOC after channel filter”, so we don’t use or need the FB2PLL feature. Our preamble is 21 bytes long, the preamble detector is on, and part of the preamble is also used as the sync word.
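For reference, this is roughly how we enable that setting. It is a minimal sketch based on my reading of the CC120x user's guide (SWRU346): the register address, the bit positions and the spi_write_reg() helper are my own assumptions here and should be checked against the user's guide and your SmartRF Studio export rather than taken as-is:

```c
/* Minimal sketch of programming frequency offset correction on the CC1200.
 * Register address and bit layout are ASSUMPTIONS based on my reading of
 * SWRU346 -- verify before use. spi_write_reg() is a placeholder for our
 * own platform-specific SPI access routine. */
#include <stdint.h>

#define CC120X_FREQOFF_CFG            0x2F01u   /* extended register space (assumed) */

#define FREQOFF_CFG_FOC_EN            (1u << 5) /* enable frequency offset correction (assumed) */
#define FREQOFF_CFG_FOC_AFTER_CHFILT  (0u << 3) /* FOC_CFG = 00: FOC after channel filter,
                                                   i.e. no FB2PLL (assumed) */

extern void spi_write_reg(uint16_t addr, uint8_t value);  /* platform-specific */

static void foc_configure(int enable)
{
    uint8_t value = 0;

    if (enable) {
        /* FOC applied after the channel filter; the remaining fields
         * (FOC_LIMIT, FOC_KI_FACTOR) are left at 0 here -- see the
         * user's guide for their exact meaning. */
        value = FREQOFF_CFG_FOC_EN | FREQOFF_CFG_FOC_AFTER_CHFILT;
    }
    spi_write_reg(CC120X_FREQOFF_CFG, value);
}
```

In the BER comparison above, the only difference between the two runs is whether this correction is enabled or not; everything else in the configuration stays the same.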