Linux/BQ32002: RTC time lag

Part Number: BQ32002


Tool/software: Linux

Hi,

We are using the BQ32002 RTC on our custom board. We are seeing the time lag by 5 seconds when the board is turned on again after being powered off for 1 day.

How can we correct this timing delay? Is there any calibration needed to correct it? If so, please describe the calibration procedure.

Thank you,

Deepanraj.A

  • Hello Deepanraj,
    Does the oscillator fail flag indicate an oscillator failure? Do you observe frequent access to the calibration register, which can be used to slightly speed up or slow down the frequency?

    Best regards,
    Patrick
  • We are not getting an oscillator fail; the flag is always zero, and there is no frequent access to the calibration register. We have observed this issue on multiple boards that use the BQ32002. While the device stays powered on, we do not see the issue; but once the time is set and the board is powered off and then powered on again after 1 day, we see the lag.

    Device configuration: we are using an i.MX6 processor, and the RTC is connected to I2C1 of the processor. An external backup battery is connected on the board.

    Frequency plan: 32.768 kHz crystal with ±20 ppm tolerance.
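For context, the figures above can be sanity-checked quickly. A sketch in Python (only the 5 s/day lag and the ±20 ppm tolerance from this thread are used; nothing here is device-specific):

```python
SECONDS_PER_DAY = 24 * 60 * 60  # 86400

def drift_ppm(lag_seconds: float, interval_seconds: float = SECONDS_PER_DAY) -> float:
    """Convert an observed time lag over an interval into parts per million."""
    return lag_seconds / interval_seconds * 1e6

def worst_case_lag(ppm: float, interval_seconds: float = SECONDS_PER_DAY) -> float:
    """Convert a ppm tolerance into worst-case seconds of drift over an interval."""
    return ppm * 1e-6 * interval_seconds

# 5 s lag per day, as reported in this thread:
print(round(drift_ppm(5.0), 1))        # -> 57.9 (ppm)
# Worst-case contribution of a +/-20 ppm crystal over one day:
print(round(worst_case_lag(20.0), 2))  # -> 1.73 (seconds/day)
```

So the observed 5 s/day corresponds to roughly 58 ppm, which is larger than the ±20 ppm crystal tolerance alone can explain; the RTC's own uncalibrated offset contributes as well, which is what the calibration discussed below addresses.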


    Thank you,
    Deepanraj

  • Hi Deepanraj,

    You can calibrate the BQ32002 using the CAL_CFG registers to improve the RTC accuracy. Here is the calibration procedure and further explanation: e2e.ti.com/.../145998

    Kind regards,
    Lane
  • Hi Lane,

    We are not able to access the link you provided.

    Thank you
  • Hello Deepanraj,

    Here is the content of the other post:

    The pre-calibration accuracy of the BQ32002 is +35 ppm, which translates to roughly 3 s/day. This value varies with the crystal used on the board. We used a KDS DMX-26S surface-mount 32.768-kHz crystal.

    If the IC is calibrated using the CAL_CFG registers as described below, the accuracy can be improved to <5 ppm, which translates to less than ½ s/day.

    Calibration:

    You can use the CAL_CFG1 register to adjust the calibration dynamically. When you modify the CAL_CFG1 register, the BQ32002 adjusts the timing every 8th or 16th minute to speed up or slow down the time.

    Usually, the customer calibrates once in the factory; for a given setup, the required calibration values are the same.

    However, if the IC will experience varying temperatures, the oscillator will drift with temperature, which requires a different set of calibration values. You could use a temperature sensor and change the values inside the IC based on the measured temperature.
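Once a sign and magnitude have been determined, they are written to the CAL_CFG1 register, which holds a sign bit S and a 5-bit CAL magnitude. A minimal sketch of packing that byte, assuming register address 0x07 and the bit layout OUT = bit 7, FT = bit 6, S = bit 5, CAL = bits 4:0 (taken from my reading of the BQ32002 datasheet; verify before use):

```python
CAL_CFG1_ADDR = 0x07  # assumed register address -- verify against the datasheet

def pack_cal_cfg1(out: int, ft: int, s: int, cal: int) -> int:
    """Pack the CAL_CFG1 byte: OUT at bit 7, FT at bit 6, S at bit 5,
    CAL in bits 4:0 (bit positions assumed from the BQ32002 datasheet)."""
    if not 0 <= cal <= 31:
        raise ValueError("CAL is a 5-bit field (0..31)")
    return (out & 1) << 7 | (ft & 1) << 6 | (s & 1) << 5 | (cal & 0x1F)

# e.g. S = 1, CAL = 16, with FT enabled for verification on the IRQ pin:
print(hex(pack_cal_cfg1(out=0, ft=1, s=1, cal=16)))  # -> 0x70
```

The packed byte would then be written over I2C (e.g. from Linux via the i2c-dev interface); the write itself is omitted here since it depends on the bus setup.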

    Best regards,

    Patrick

  • Hi Patrick

    We are using an ABS07-32.768KHZ-T crystal in our design, which is specified at ±20 ppm.

    Depending on the crystal batch used during production, the frequency offset can be either positive or negative.

    Since the RTC register has an option for setting a positive or negative ppm correction, how do we identify which value to set during production?

    Thank you,

    Deepanraj.A

  • Hello Deepanraj,
    You will need a reference to verify against. Ideally this comes from an oscillator disciplined to an atomic clock, Ethernet (e.g. NTP), GPS, or the like.
    I would assume you may also want to recalibrate in the application, e.g. if your IoT device was powered down for a very long time and the temperature changed; the crystal may have shifted slightly relative to absolute time.

    Best regards,
    Patrick
  • Hi Patrick,

    What ppm value do we need to set for a 5-second-per-day lag? Please let us know.

    During production, how do we determine the calibration value, since it varies with the crystal used and the clock may run either slow or fast?

    Thank you
    Deepanraj

  • Hi Deepanraj,

    The calibration value will vary from board to board and requires a measurement. The basic flow to calibrate the frequency is as follows:

    1. Configure the IRQ pin to output 1 Hz by setting the FTF bit and the FT bit to 1.
    2. Measure the 1 Hz output signal with an appropriate precision frequency counter within the resolution required (suggest at least 12-digit counter).
    3. Compute the absolute error in ppm: Absolute Error (ppm) = |10^6 × (fMEASURED − 1 Hz) / 1 Hz|.
    4. Adjust the frequency by performing the following:
      1. If the frequency is too low, set S = 1 and apply the appropriate CAL bits, where CAL = Absolute Error (ppm) / (10^6 / 245760), rounded to the nearest integer.
      2. If the frequency is too high, clear S = 0 and apply the appropriate CAL bits, where CAL = Absolute Error (ppm) / (10^6 / 491520), rounded to the nearest integer.

    NOTE: Because the frequency change is small and infrequent over a very long time interval (observable every 8 or 16 minutes depending on the calibration sign), it requires some patience to observe the calibration effect on a precision frequency counter.

    Example 1:

    Assume the measured IRQ output is 0.999933203125 Hz. The frequency error is about 66.8 ppm low. To increase the frequency by ~66.8 ppm, S would be set to 1, and CAL would be set to 16 (66.8 / 4.069 ≈ 16.4, rounded to 16).

    Example 2:

    Assume the measured IRQ output is 1.0000244140625 Hz. The frequency error is about 24.4 ppm high. To decrease the frequency by ~24.4 ppm, S would be cleared to 0, and CAL would be set to 12 (24.4 / 2.035 ≈ 12.0).
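The two examples can be reproduced with a short script that follows the procedure above (the constants 245760 and 491520 are taken directly from steps 4.1 and 4.2):

```python
def bq32002_cal(f_measured_hz: float) -> tuple[int, int]:
    """Compute (S, CAL) for the BQ32002 from a measured 1 Hz IRQ output,
    following the calibration procedure described in this thread."""
    error_ppm = abs(1e6 * (f_measured_hz - 1.0) / 1.0)
    if f_measured_hz < 1.0:
        # Frequency too low: speed up (S = 1), step size 10^6 / 245760 ppm
        s, step = 1, 1e6 / 245760
    else:
        # Frequency too high: slow down (S = 0), step size 10^6 / 491520 ppm
        s, step = 0, 1e6 / 491520
    return s, round(error_ppm / step)

print(bq32002_cal(0.999933203125))   # Example 1 -> (1, 16)
print(bq32002_cal(1.0000244140625))  # Example 2 -> (0, 12)
```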

    The calibration corrects only initial offsets and does not adjust for temperature and aging effects. These can be handled by periodically measuring the temperature and adjusting the ppm correction based on the crystal's characteristic curve.
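As an illustration of such temperature compensation: tuning-fork 32.768 kHz crystals typically follow a parabolic curve, Δf/f ≈ k·(T − T₀)², with k around −0.034 ppm/°C² and a turnover temperature T₀ near 25 °C. These are typical tuning-fork figures, not values from the ABS07 datasheet, so treat them as placeholders:

```python
# Typical tuning-fork crystal curve; K and T0 are generic assumptions,
# not ABS07 datasheet values -- verify against the actual crystal datasheet.
K_PPM_PER_C2 = -0.034   # ppm / degC^2 (typical tuning-fork coefficient)
T_TURNOVER_C = 25.0     # turnover temperature in degC (typical)

def temp_drift_ppm(temp_c: float) -> float:
    """Estimated frequency deviation of the crystal at temp_c, in ppm."""
    return K_PPM_PER_C2 * (temp_c - T_TURNOVER_C) ** 2

print(round(temp_drift_ppm(25.0), 2))  # -> 0.0 at the turnover point
print(round(temp_drift_ppm(0.0), 2))   # -> -21.25 (clock runs slow when cold)
```

A periodic task would read a temperature sensor, evaluate this curve, add the result to the factory-measured offset, and rewrite the CAL bits accordingly.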

    I hope this helps.

    Kind regards,
    Lane