CC1101 RF Calibration - Optimal interval?

Hi,

In the CC1101 data sheet, it is stated that "The frequency synthesizer must be calibrated regularly."

I am currently using manual calibration for two-way, peer-to-peer communication between devices, since I prefer manual calibration to maximise the RX time. During testing I have observed that a device sometimes suddenly becomes unable to receive any messages, and the messages it sends are not received by the others for a certain duration (there is no TX underflow/RX overflow, as I check for these regularly). It recovers by itself after some time. Hence, I suspect it might be because the RF is out of sync.

Is there a maximum allowed time between calibrations for the CC1101? Is there any guideline that tells us how frequently we need to calibrate?

Thank you.

Seng Kee

  • The calibration is done to make the PLL lock to the programmed RF frequency (i.e., to get the correct VCO current, VCO capacitance, and charge pump current). If there is no significant change in temperature (e.g., more than ±20°C) or supply voltage, there is no need to do a re-calibration.

    Not sure what you mean by "doing manual calibration to maximise RX time". Doing an automatic calibration when going from IDLE to RX is easier, and you only have to issue one strobe command, as opposed to two if you do the calibration manually before putting the device into RX.
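
    A minimal sketch of the two approaches, assuming placeholder SPI helpers (cc1101_strobe() and cc1101_write_reg() are stand-ins for your own driver, not TI library calls; the strobe and register values are from the data sheet):

    ```c
    #include <stdint.h>

    #define CC1101_MCSM0  0x18  /* main radio control state machine config */
    #define CC1101_SCAL   0x33  /* calibrate frequency synthesizer strobe  */
    #define CC1101_SRX    0x34  /* enable RX strobe                        */

    extern void cc1101_strobe(uint8_t cmd);                /* placeholder */
    extern void cc1101_write_reg(uint8_t addr, uint8_t v); /* placeholder */

    /* Option A: automatic calibration, a single strobe per RX entry.
     * MCSM0.FS_AUTOCAL = 01b: calibrate when going from IDLE to RX/TX. */
    void enter_rx_autocal(void)
    {
        cc1101_write_reg(CC1101_MCSM0, 0x18); /* FS_AUTOCAL=01, PO_TIMEOUT=10 */
        cc1101_strobe(CC1101_SRX);            /* IDLE -> CAL -> RX            */
    }

    /* Option B: manual calibration, two strobes. */
    void enter_rx_manual(void)
    {
        cc1101_strobe(CC1101_SCAL); /* calibrate while in IDLE             */
        /* ...wait for the calibration to complete before entering RX... */
        cc1101_strobe(CC1101_SRX);  /* then a fast IDLE -> RX transition   */
    }
    ```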

    From what you write, I do not think the VCO calibration is the problem. To check this you can monitor the PLL lock bit as described in Section 22.1 in the data sheet.
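
    A sketch of that check, assuming a placeholder cc1101_read_reg() SPI helper: per the data sheet, after a calibration the PLL is in lock if FSCAL1 reads anything other than 0x3F.

    ```c
    #include <stdbool.h>
    #include <stdint.h>

    #define CC1101_FSCAL1  0x25

    extern uint8_t cc1101_read_reg(uint8_t addr);  /* placeholder */

    /* FSCAL1 == 0x3F indicates the PLL failed to lock. */
    bool cc1101_pll_locked(void)
    {
        return cc1101_read_reg(CC1101_FSCAL1) != 0x3F;
    }
    ```

    The data sheet also describes configuring a GDO pin as the lock detector output, which can be used as an MCU interrupt instead of polling.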

    Are you sure the data is actually being transmitted and that the receiver is in RX? Check the MARCSTATE on each device.
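
    For example (cc1101_read_status() is a placeholder for a status-register read; on the CC1101, status registers are read with the burst bit set in the address header):

    ```c
    #include <stdint.h>

    #define CC1101_MARCSTATE  0x35
    #define MARC_STATE_IDLE   0x01
    #define MARC_STATE_RX     0x0D
    #define MARC_STATE_TX     0x13

    extern uint8_t cc1101_read_status(uint8_t addr);  /* placeholder */

    /* Main radio control state machine state (lower 5 bits of MARCSTATE). */
    uint8_t cc1101_marc_state(void)
    {
        return cc1101_read_status(CC1101_MARCSTATE) & 0x1F;
    }
    ```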

    A piece of advice: it seems like you have developed HW and SW and are testing a link straight away. Good practice is to test HW and SW separately.

    HW testing: test RX, TX, and the antenna separately using common lab equipment such as signal generators, spectrum analyzers, and network analyzers. Use SW you know is working when doing HW debugging. One option is to use SmartRF Studio and the SmartRF04EB. Strap the control signals from the EB to your HW (SPI interface and GDO lines).

    SW testing: use HW that you know is working when doing SW debugging. One option is to use a SmartRF04EB + CC1101EM and connect the control signals from your MCU to the SmartRF04EB.

    Once you have completed all the separate tests, you can set up an RF link. Doing the above will make debugging much easier.

  • The reason for doing a "manual calibration" is that from IDLE to RX/TX with calibration, the state transition time is about 800 usec, whereas from IDLE to RX/TX without calibration it only takes about 90 usec.
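
    Roughly, the sequence I use looks like the sketch below (the SPI helpers are placeholders for my own driver; strobe values are from the data sheet):

    ```c
    #include <stdint.h>

    #define CC1101_SCAL   0x33
    #define CC1101_SRX    0x34
    #define CC1101_SIDLE  0x36

    extern void cc1101_strobe(uint8_t cmd);  /* placeholder */
    extern void wait_cal_done(void);         /* e.g. poll MARCSTATE for IDLE */

    /* Manual scheme: calibrate from IDLE (slow, roughly 800 usec including
     * the transition), then use the fast (~90 usec) IDLE -> RX entry. */
    void recalibrate_then_listen(void)
    {
        cc1101_strobe(CC1101_SIDLE);  /* calibration must start from IDLE */
        cc1101_strobe(CC1101_SCAL);   /* manual calibration               */
        wait_cal_done();
        cc1101_strobe(CC1101_SRX);    /* fast entry into RX               */
    }
    ```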

    Thanks for the advice. Will double check again.

  • Hi Sverre,

    We are also very interested in this matter, as we have some very tight real-time constraints in our software, and the difference between 800 us for IDLE -> CAL -> RX and 90 us for RX -> TX (or vice versa) is really large: 9x longer!

    So our question really relates to how often this CAL is required, say in a normal operating environment with temperature variations from 0 to 50 degrees Celsius?

    Currently we do a CAL at the start, and an AUTO-CAL every time we go from IDLE to RX etc. This happens regularly, every 17,465 ms for instance.

    Can we get away with an AUTO-CAL every 5 or 10 seconds, for instance? Where can I obtain more info on this subject (or recommendations)?
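
    One compromise we are considering is the chip's FS_AUTOCAL = 11 setting, which (if we read the data sheet correctly) calibrates only every fourth time the radio goes from RX or TX back to IDLE, amortising the ~800 us cost. A sketch, with cc1101_write_reg() as a placeholder SPI helper:

    ```c
    #include <stdint.h>

    #define CC1101_MCSM0  0x18

    extern void cc1101_write_reg(uint8_t addr, uint8_t v);  /* placeholder */

    void configure_autocal_every_4th(void)
    {
        /* FS_AUTOCAL = 11b (bits 5:4), PO_TIMEOUT = 10b (bits 3:2). */
        cc1101_write_reg(CC1101_MCSM0, 0x38);
    }
    ```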


    Kind Regards, Mike