Hi,
I am currently characterizing a couple of prototypes based on the CC1101 and I am a little bit puzzled by the RSSI values I am getting.
The set-up is like this: I have a signal generator transmitting at 0dBm output power through a power splitter. One output of the splitter is connected to a spectrum analyzer to provide a calibrated power reading; the other output goes to the DUT in receive mode. One of the modules reported -33dBm (in SmartRF Studio) when the spectrum analyzer indicated the reading should be -8dBm (accounting for cable losses and splitter insertion loss).
We initially thought that there was a problem with the RF circuitry between the CC1101 and the cables that contributed ~25dB of loss. However, when we connected the module directly to the spectrum analyzer and set it to transmit at 0dBm, the spectrum analyzer read ~(-3dBm). This contradicts our initial assessment about the ~25dB loss, since the same RF path shows only ~3dB of loss in the transmit direction. It seems the 25dB difference is either due to a CC1101 sensitivity problem, or the applicable RSSI offset for this particular module is 49 and not the typical 74.
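For reference, this is how we are interpreting the numbers, using the standard RSSI conversion from the CC1101 datasheet (the RSSI register holds a two's-complement value in 0.5dB steps, from which the RSSI offset is subtracted). A minimal sketch in C; the raw value 0x52 is a hypothetical register reading chosen to reproduce our -33dBm result:

    #include <stdint.h>
    #include <stdio.h>

    /* Convert a raw CC1101 RSSI register value to dBm per the
     * datasheet: two's-complement value in 0.5 dB steps, minus
     * the RSSI offset. */
    static int rssi_to_dbm(uint8_t rssi_dec, int rssi_offset)
    {
        int rssi = (rssi_dec >= 128) ? (int)rssi_dec - 256 : (int)rssi_dec;
        return rssi / 2 - rssi_offset;
    }

    int main(void)
    {
        uint8_t raw = 0x52; /* hypothetical raw reading = 82 decimal */

        /* With the typical offset of 74 this gives the -33dBm we saw;
         * with an offset of 49 it gives the expected -8dBm. */
        printf("offset 74: %d dBm\n", rssi_to_dbm(raw, 74)); /* -33 */
        printf("offset 49: %d dBm\n", rssi_to_dbm(raw, 49)); /*  -8 */
        return 0;
    }

So the same raw reading reconciles with the spectrum analyzer only if the offset for this module is 49.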
Since the datasheet indicates only a typical value of 74 for the RSSI offset (no min/max limits), is 49 still acceptable, or does it indicate that this particular module has a sensitivity problem?
What is the acceptable range of RSSI offset values?
Is there any other factor that could explain the discrepancy?
BTW, the other modules more or less conform to the typical RSSI offset value of 74.
TIA.