We are manufacturing in volume a system that consists of a base station and a remote node, both of which use the CC2545 for point-to-point communications. On most of our systems we see good performance. However, on a small number of systems we see a degraded error rate in spite of good RSSI. My question is: what kinds of things could lead to this situation? Is there part-to-part variation, such that some parts have more PLL jitter than others, leading to poor performance? Could a calibration be drifting over time? On some systems we see good performance initially, but after 10 seconds or so the error rate starts to degrade.
Possibly related: when we have done error rate testing, we will see a good error rate as we hop through different channels. Then we will see a poor error rate for several channels, as if the entire 2.4 GHz band were noisy. After a few seconds it returns to a good error rate. Again, is there something in the operation of the chip that could drift off center, causing a poor error rate, and then suddenly come back to center? In all cases we continue to see good signal strength.
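For reference, here is the back-of-envelope arithmetic we used to sanity-check whether crystal tolerance alone could push the carrier off center. This is a generic calculation, not anything CC2545-specific; the ppm figures and the mid-band carrier are illustrative assumptions, not our measured values.

```python
def carrier_offset_hz(carrier_hz: float, ppm: float) -> float:
    """Worst-case carrier offset for a given total crystal error in ppm."""
    return carrier_hz * ppm * 1e-6

carrier = 2.44e9  # a mid-band 2.4 GHz channel (illustrative)

# TX and RX offsets add when the two crystals err in opposite directions.
for end_ppm in (10, 20, 40):
    total = carrier_offset_hz(carrier, 2 * end_ppm)
    print(f"+/-{end_ppm} ppm per end -> up to {total / 1e3:.1f} kHz offset")
```

Even modest crystal error adds up to tens of kHz of offset at 2.4 GHz, which is why we are wondering whether a temperature-driven drift (e.g. after ~10 seconds of self-heating) could be pulling marginal units outside the receiver's tolerance while RSSI stays good.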
Thanks for your help on this.