We have a prototype Zigbee network consisting of a sensor and a basestation, both built around the CC2530 + CC2591 RF front-end. The sensor uses an on-board chip antenna, is battery operated, and mounts inside a plastic case. The basestation bypasses the chip antenna and uses an external directional +8 dBi antenna connected via a U.FL connector. The sensor is designed to mount on the back of sector antennas on cellular towers. It reports on-board accelerometer and magnetometer data via Zigbee to the basestation, which sits at the bottom of the tower with the +8 dBi antenna pointed toward the sensor. Typical distance between the two devices is 50 meters.
On the ground, the line-of-sight range of the two devices is more than adequate (~300 meters). The problem appears once the sensor is mounted on the cellular antenna and the antenna is transmitting normal cellular traffic: the basestation cannot receive any reports from the sensor.
We suspect the sensor’s RX is being overwhelmed by high-power cellular broadband noise. To prove the theory and to quantify the noise, we created a tool that makes the sensor report its RSSI and send the readings out its serial port. We tested this tool by combining a 1910 MHz CW interferer with a normal 2450 MHz signal at -40 dBm and injecting the combined signal into the sensor’s antenna port. The tool sweeps all the Zigbee channels periodically, so we get a bar graph of RSSI vs. channel that refreshes every ~5 seconds.
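For reference, the sweep logic is roughly equivalent to the sketch below. This is only an illustration, not our exact firmware: the register names (FREQCTRL, RFST, RSSI, RSSISTAT) come from the CC2530 register map, the -73 dBm RSSI offset is approximate (the CC2591 LNA shifts the effective offset), and uart_printf()/delay_ms() are placeholder helpers standing in for our serial output and timing code.

```c
/* Minimal sketch of the per-channel RSSI sweep (assumes TI's ioCC2530.h
 * register definitions). Check register names and the RSSI offset against
 * the CC2530 datasheet; the CC2591 front-end changes the effective offset. */
#include "ioCC2530.h"

#define RSSI_OFFSET_DBM  (-73)  /* approximate CC2530 offset, adjust for CC2591 */

extern void uart_printf(const char *fmt, ...);  /* placeholder serial helper */
extern void delay_ms(unsigned int ms);          /* placeholder delay helper  */

static int read_channel_rssi(unsigned char channel)   /* channel = 11..26 */
{
    RFST = 0xEF;                          /* ISRFOFF: stop RX before retuning */
    FREQCTRL = 11 + 5 * (channel - 11);   /* tune to 2405 + 5*(ch-11) MHz     */
    RFST = 0xE3;                          /* ISRXON: enable the receiver      */

    while (!(RSSISTAT & 0x01))            /* wait for RSSI_VALID              */
        ;
    return (int)(signed char)RSSI + RSSI_OFFSET_DBM;
}

void rssi_sweep_task(void)
{
    unsigned char ch;
    for (;;) {
        for (ch = 11; ch <= 26; ch++)     /* all 16 Zigbee channels */
            uart_printf("ch %u: %d dBm\r\n", ch, read_channel_rssi(ch));
        delay_ms(5000);                   /* refresh roughly every 5 seconds */
    }
}
```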
As the 1910 MHz interferer is raised to -5 dBm, the “noise floor” shown on all channels except 2450 MHz drops by 3 dB. A few more dB of power at 1910 MHz drops the noise floor by another 5 dB. And as the 1910 MHz output is increased to +5 dBm, the 2450 MHz RSSI drops from -40 dBm to -75 dBm (a 35 dB drop). This is strange to me, as the 1910 MHz CW is out of band with respect to our 2.4 GHz sensor.
Any ideas of what is going on?