
BQ35100: SOH measurement stuck in EOS mode with Li-SOCl2

Part Number: BQ35100
Other Parts Discussed in Thread: MSP430FR5994, BQSTUDIO, EV2400

Hi!

Equipment: a BQ35100 connected per the datasheet and controlled over I2C by an MCU, which also drives the GE pin and monitors the ALERT output. An XBee module serves as the load; in transmit mode it draws 20mA, producing a voltage drop of 80mV.
Power source: two SAFT LS14500 batteries connected in parallel. Voltage 3.6V, capacity 5200mAh, Li-SOCl2 chemistry.

I connected the EV2300 programmer to the BQ35100 over I2C and calibrated current and voltage in bqStudio per the SLUUBH7 and SLUA904 manuals. I then set the battery parameters, configured Operation Config A for EOS mode, and updated the chemistry ID to 0623 (the Ra table was updated). All parameters can be seen in the memory dump. Then I unplugged the EV2300 and tested a new battery to full discharge using the MCU.

Measurement algorithm:
1) Enable GE pin.
2) Read battery status.
3) Send GAUGE_START command.
4) Wait for GA bit.
5) Send a command to the XBee causing a voltage drop of 60-80mV for 2 seconds.
6) Send GAUGE_STOP command. Consumption is kept minimal for the next 15 seconds.
7) Wait for ALERT to go low due to the G_DONE.
8) Read Voltage and SOH.
9) Disable GE pin.

Two additional functions then read Scaled R and Measured Z. Each function does the following:
2.1) Enable GE pin.
2.2) Read battery status.
2.3) Read value.
2.4) Disable GE pin.
(Although I now understand it would have been better to read these parameters right after the main measurement, without toggling the GE pin.)
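
For reference, steps 1)–9) can be sketched as the following sequence of I2C transactions. This is a hosted Python model with a stubbed bus, not MSP430 firmware; the command addresses (Control() at 0x00, Voltage() at 0x08, BatteryStatus() at 0x0A, StateOfHealth() at 0x2E) and the GAUGE_START/GAUGE_STOP subcommands (0x0011/0x0012) follow the BQ35100 datasheet, while `I2cStub` and the pin/wait callbacks are hypothetical stand-ins for the MCU drivers.

```python
# Sketch of the BQ35100 gauging sequence (steps 1-9 above).
# The I2cStub records transactions so their order can be inspected;
# on real hardware these would be MSP430 I2C/GPIO driver calls.

CONTROL = 0x00          # Control() command address (subcommand is little-endian)
VOLTAGE = 0x08          # Voltage(), mV
BATTERY_STATUS = 0x0A   # BatteryStatus()
SOH = 0x2E              # StateOfHealth(), percent

GAUGE_START = 0x0011    # Control() subcommands
GAUGE_STOP = 0x0012

class I2cStub:
    def __init__(self):
        self.log = []
    def write_word(self, cmd, value):
        self.log.append(("W", cmd, value))
    def read_word(self, cmd):
        self.log.append(("R", cmd, None))
        # canned values purely for the sketch
        return {VOLTAGE: 3600, SOH: 95, BATTERY_STATUS: 0x00}.get(cmd, 0)

def measure_soh(bus, set_ge, wait_ga, pulse_load, wait_g_done):
    set_ge(True)                          # 1) power the gauge via GE
    bus.read_word(BATTERY_STATUS)         # 2) sanity-check battery status
    bus.write_word(CONTROL, GAUGE_START)  # 3) start gauging
    wait_ga()                             # 4) poll CONTROL_STATUS [GA]
    pulse_load()                          # 5) 2 s load pulse (60-80 mV drop)
    bus.write_word(CONTROL, GAUGE_STOP)   # 6) stop; keep load minimal ~15 s
    wait_g_done()                         # 7) ALERT goes low on [G_DONE]
    mv = bus.read_word(VOLTAGE)           # 8) read results while still powered
    soh = bus.read_word(SOH)
    set_ge(False)                         # 9) power down (RAM contents lost)
    return mv, soh

bus = I2cStub()
mv, soh = measure_soh(bus, lambda on: None, lambda: None,
                      lambda: None, lambda: None)
```

The key property (also the fix discussed later in the thread) is that every read happens between GAUGE_START and the GE power-down.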

These three functions were executed every hour. After the measurements, the XBee data-transfer function was turned on, drawing 10mA, so the battery should have discharged in about 20 days. However, on day 11 I stopped the test because the Scaled R and Measured Z values had become clearly wrong, and SOH never fell below 91%. I made a memory dump after the test, as well as a golden image.

Is there an error in the algorithm, or are the chip settings incorrect? I would be very grateful for your help.

30-12-2022_after_11_days.gg.csv

0100_1_02-bq35100.srec.csv

  • Hello Vlad,

    Were the steps in TRM section 5.3.1 Initial EOS Learning completed?

    It is also strange that the battery voltage increased at the beginning of the log. Do you know of anything in the circuit that could cause this?

  • Hello, Shirish.

    Yes, I completely forgot to mention it: after calibration, before starting measurements, I sent the NEW_BATTERY() command.

    I think this is an error in the BQ35100's voltage measurements. I don't know what else could cause such a voltage increase over 10 days. The device has only one voltage source, the two LS14500 batteries in parallel. The loads are the XBee S2C module, an LED, and the MSP430FR5994 controller. The temperature is a constant 24 degrees.

  •     Have you tried adding a delay (e.g. 1 s) between G_DONE and reading Voltage, SOH, Scaled R, and Measured Z, followed by disabling the GE pin?

  • There is no delay before reading the voltage and SOH; I have not tried adding one. I read these parameters as soon as the ALERT pin goes low due to G_DONE.

    But I read Scaled R and Measured Z after toggling the GE pin (as described in the question, with two separate reads), and about a second passes after G_DONE. Yet it is exactly these readings that show the greatest variation.

  •     There is no need to toggle the GE pin to read Scaled R and Measured Z; these two values can be read together with Voltage and SOH, but a delay may be needed before reading all of this data.

  • I will run another test for more than 20 days and read all the values together, one second after G_DONE. Thanks.

    As far as I understand, the delay should remove the erroneous jumps in values. But on my graph the SOH value is stable for a long time. Do you think SOH failed to fall below 91% precisely because of the missing delay?

  •     SOH, ScaledR, and MeasuredZ are calculated in the same code section, so if SOH is stable, ScaledR and MeasuredZ should also be stable. I think there may already be an unintentional delay before you read SOH; reading ScaledR and MeasuredZ together with SOH should return correct results.

        But asserting GE without a GAUGE_START command will not give correct ScaledR and MeasuredZ readings. When GE is pulled low after reading SOH, the device loses power and the RAM contents are lost.

  • Understood, thanks. I will do as you said and come back with new results. Testing will take about a month.

  • I have fixed the bugs described above. Now the measurement algorithm is performed by a single function and looks like this:

    1. Relax for 1 minute with minimal consumption (microamps), so that the battery voltage recovers slightly and the subsequent voltage drop is larger.

    2. Enable GE pin.

    3. Read battery status.

    4. Send GAUGE_START command.

    5. Wait for GA bit.

    6. Maximum consumption within 4 seconds.
      For the current version of the board this is a current of 16mA, which drops the LS14500 voltage from 3590mV to 3550mV. This assumes the battery has already been under load; a fresh battery sits at 3670mV and its voltage drop exceeds the required 100mV.

    7. Send GAUGE_STOP command.

    8. Minimum consumption for 20 seconds, so the chip can complete its calculation correctly, as indicated in the manual.

    9. Wait for ALERT to go low due to the G_DONE.

    10. Delay 1 second.

    11. Read Voltage value.

    12. Delay 1 second.

    13. Read SOH value.

    14. Delay 1 second.

    15. Read Measured Z value.

    16. Delay 1 second.

    17. Read Scaled R value.

    18. Delay 1 second.

    19. Read Internal Temperature value.

    20. Delay 1 second.

    21. Read Current value.

    22. Delay 1 second.

    23. Read Short Trend Average value.

    24. Delay 1 second.

    25. Read Long Trend Average value.

    26. Disable GE pin.

    Then a constant 5mA load is applied for 2 hours, after which the measurement repeats.
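
    The read-out portion (steps 10–25) can be expressed as a table-driven loop. This is again a hosted Python sketch, with the bus read and the delay injected as callables; the command addresses follow the BQ35100 datasheet, and Short/Long Trend Average are omitted here since their access method is not shown in this thread.

```python
import time

# BQ35100 command addresses per the datasheet.
READ_PLAN = [
    ("Voltage",             0x08),
    ("StateOfHealth",       0x2E),
    ("MeasuredZ",           0x22),
    ("ScaledR",             0x16),
    ("InternalTemperature", 0x28),
    ("Current",             0x0C),
]

def read_all(read_word, delay_s=1.0, sleep=time.sleep):
    """Steps 10-25 above: one fixed settle delay before each register read."""
    results = {}
    for name, cmd in READ_PLAN:
        sleep(delay_s)          # the 1 s settle time suggested in the thread
        results[name] = read_word(cmd)
    return results

# Demo with a fake bus that just echoes 2*cmd, and no real sleeping.
vals = read_all(lambda cmd: cmd * 2, delay_s=0, sleep=lambda s: None)
```

    All reads happen while GE is still high, which is the correction suggested earlier in the thread.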

    On the consumption chart it looks like this:

    This operating profile should discharge a 5200mAh battery in about 1 month and 12 days, i.e. roughly 2 percent per day.

    I calibrated the current, voltage, and temperature readings on the new board, set all the necessary parameters, and ran the test with a slightly discharged battery and a maximum SOH delta of 5%. Within two measurements the SOH dropped to 88%, and the value then did not change for three days, while the readings of the other parameters look valid.

    I attach a memory dump of the BQ35100 chip made before the test was run.

    Is the problem the small voltage drop? It is only 40mV instead of the required 100mV. Or is something wrong in my configuration? I am concerned about the "Cell Terminate Voltage" value of 2000 mV.

    config_before_testing.gg.csv

  • Hi, 

        Looking at your logged data, the scaledR value decreases over time, which is not expected in a normal application. The scale changed at the 3rd row of your log: from the 4th row on, the factor used to scale measuredZ to scaledR is 1.60x, which kept the subsequent impedance below 4xxx mOhm. A lower Scaled R means a higher SOH, but SOH is not allowed to increase, so it stays stuck at 88% until scaledR exceeds 4374.

        Your load profile seems very dynamic: the load fluctuates up and down at intervals from milliseconds to seconds, while the ADC samples current and voltage at an 8 ms conversion rate. I wonder whether the impedance can be calculated stably under such a dynamic load. In your results the impedance varies up and down randomly over time, so I doubt the SOH calculated from it would be referenceable. Please read the NOTE below from the TRM:

    "In EOS mode, the accuracy of the SOH reported value can vary significantly with a load profile. Perform in-system evaluation to determine the reported value at the desired EOS level. In some instances, the value of SOH should be ignored."

     Also, please note that when a new battery is inserted, the NEW_BATTERY() command should be sent to the device to ensure the initial learned resistance RNEW is refreshed correctly.

      If the SOH is not referenceable, the main useful EOS-mode outputs are Short Trend Average and Long Trend Average, which the gauge uses to determine whether to assert Battery Status[EOS].

  • Thank you for the answer.

    I stopped this test and made a memory dump. I got "R Table Scale" = 711. Is this normal?

    A very good point about the pulses. I added a load resistor to the circuit, and the pulse became stronger, without bursts. I will use a new battery and a 70mA current (which drops the voltage from 3590mV to 3470mV). I will run the test with it and describe the results.

    I also used the NEW_BATTERY() command in the previous test to reset the EOS-related data; I forgot to mention it in the message.

    config_after_3_days_testing.gg.csv

  •     This is expected: 0.711 is exactly the reciprocal of 1.406. The 1.60x scale in my previous post was a typo; if you check the ratio of Measured Z to Scaled R, you will see a ratio of about 1.40x.
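
    The arithmetic can be verified directly; the 3100 mOhm MeasuredZ below is an invented example value, not from the logs:

```python
# "R Table Scale" is stored as 711 in the dump, i.e. 0.711.
r_table_scale = 0.711
ratio = 1.0 / r_table_scale    # expected ScaledR / MeasuredZ ratio (~1.406)

# An example MeasuredZ of 3100 mOhm would then report as ScaledR ~= 4360 mOhm,
# just below the 4374 threshold mentioned earlier where SOH would unstick.
scaled_r = 3100 * ratio
```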

        As explained in the TRM, the load profile may be too dynamic for the EOS-mode SOH (which is based on scaledR) to be referenceable. If you still see dynamic SOH data, you can use the Short Trend Average and Long Trend Average to determine the EOS alert.

  • Thank you for the answer.

    Yes, I am aware of the EOS alert, but I need the SOH value itself to be correct.

    May I ask a question about the "Terminate Voltage" parameter? I sent the NEW_BATTERY() command, set SOH delta = 10%, changed "Terminate Voltage" to 3300 mV (it was 2000 mV before), and ran a new test with a resistor as the load. As a result, SOH dropped to 0% within 10 measurements. How do I set "Terminate Voltage" correctly? My device will not work below 3300 mV, so I assumed that was the right value.

    1001_config_after_testing_new_battery.gg.csv

    1001_config_before_testing_new_battery.gg.csv

  • Yes, Terminate Voltage is the discharge cutoff voltage. But the voltage table the gauge actually uses for LiSOCl2 spans 3.145 V down to 1.920 V, which causes the internal calculation to step down by 10% each time, whatever the ScaledR is.

    Per the OCV table, a cell whose voltage ranges from 3.145 V to 1.920 V would be appropriate; otherwise the hidden OCV table would have to be modified.

    It also looks strange that the ScaledR decreased during discharge. For the voltage decay from 18:32 to 19:47, was this done with a constant load?

    Could you try adding a delay of about 1 s between steps 2 and 3 (between steps 5 and 6 can also be tried), so that the gauge can read an open-circuit voltage before the Gauge_Start() command is sent? (The load should still be pulsed rather than constant.)

    The impedance should always be increasing while discharge is underway.
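
    To illustrate why a Terminate Voltage above the OCV table's range forces SOH down regardless of ScaledR, here is a toy depth-of-discharge lookup. The table points are invented for illustration; only the 3145–1920 mV endpoints come from the reply above, and the real hidden table and the gauge's actual algorithm are more involved.

```python
# Toy (OCV_mV, DOD_percent) pairs, invented for illustration.
# The real hidden LiSOCl2 table spans 3145 mV down to 1920 mV.
OCV_TABLE = [(3145, 0), (3000, 25), (2800, 50), (2400, 75), (1920, 100)]

def dod_from_voltage(mv):
    """Clamp to the table, then linearly interpolate depth of discharge."""
    if mv >= OCV_TABLE[0][0]:
        return 0.0
    if mv <= OCV_TABLE[-1][0]:
        return 100.0
    for (v_hi, d_hi), (v_lo, d_lo) in zip(OCV_TABLE, OCV_TABLE[1:]):
        if v_lo <= mv <= v_hi:
            return d_hi + (d_lo - d_hi) * (v_hi - mv) / (v_hi - v_lo)

# A 3300 mV Terminate Voltage sits entirely ABOVE the table, so the
# lookup sees no usable capacity between "now" and "terminate"; the
# reported SOH then steps down by Max SOH Delta (10% in the test above)
# on every measurement until it reaches 0%, whatever ScaledR says.
dod_at_terminate = dod_from_voltage(3300)
```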

  • Thank you for the answer.

    1) How do I change the internal OCV table? There is no such information in the manual, only an OCV table for LiMnO2; the Ra table is updated for LiSOCl2.

    2) Do I need to change the OCV to 3670mV - 3300mV and set the terminate voltage to 3300mV?

    3) For yesterday's test from 18:32 to 19:47: I turned on the device and the algorithm above ran (but with a constant, peak-free 2-second load pulse of 70mA instead of 16mA), and I saw SOH = 90%. Then I turned the device off, waited a couple of minutes, started it again (without the NEW_BATTERY() command) and got 80%. I repeated the power cycle until I got 0%.

    4) I will add pauses and run test again.

  •     You can press the Chemistry button on the bqStudio toolbar; a tab window for chemistry ID selection and download will appear, like below:


    Find the battery model matching the one used in your project in the "Model" column, select the row, and click the button below it to download all chem-ID-related data to the target device.

  • Yes, I did that at the beginning, per the manual, before creating this post. I even updated the database version. But this only changes the Ra table and the "Static Chem DF Checksum" value, does it not? I compared the configs from before and after flashing the chemistry.

  •     The OCV table is hidden and not readable. What ID did you use to download the chemistry data to the target device, and what cell model are you using?

  • Steven Yao

    Information in the first message:

    Power source: two SAFT LS14500 batteries connected in parallel. Voltage 3.6V, capacity 5200mAh, Li-SOCl2 chemistry, chemistry ID - 0623.

  •     Can you extract the srec file and upload it here?

  • I just made a full dump of the chip, which has been under test for 3 days with a new battery. It shows "Terminate voltage" = 900mV, and also "EOS SO smooth Start Voltage" = 900mV.

    Note: during the 9th pulse I touched the battery with a multimeter to check the voltage drop; it turned out to be 120mV. But it was exactly at this point that SOH dropped to 91%. Could my interference cause this?

    3678.config.gg.csv

    0100_1_02-bq35100.srec.removethis.csv

  •     The measurement results can be sensitive to noise. It is better to make sure the results are not affected by occasional interference, such as touching points on the circuitry or moving or touching the wires connected to the board.

  • Understood, thank you! I won't interfere with measurements anymore.

    What about the "srec" file, is the data there not valid? I changed the file extension so that it could be attached.

  •     The chem ID data seems to have been downloaded correctly; it conforms to what is expected. But the 3.3 V terminate voltage is too high: because this cell's impedance is very high, a 70mA current can generate 200mV or more of IR drop across the internal resistance, so a lower terminate voltage should be used.
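
    The headroom argument can be quantified with a quick sanity check. The ~2.9 Ω impedance below is an assumption back-derived from the 200 mV / 70 mA figures in the reply above, not a measured value:

```python
def loaded_voltage_mv(ocv_mv, load_ma, z_mohm):
    """Terminal voltage under load: V = OCV - I*R (units: mV, mA, mOhm)."""
    return ocv_mv - load_ma * z_mohm / 1000.0

Z_MOHM = 2900  # assumed cell impedance (~200 mV drop / 70 mA pulse)

# IR drop of the 70 mA pulse across the assumed impedance: ~203 mV.
drop = 3600 - loaded_voltage_mv(3600, 70, Z_MOHM)

# With Terminate Voltage = 3300 mV, the loaded voltage already crosses
# the threshold while the OCV is still ~3.5 V, i.e. with most of the
# capacity left -- hence the advice to use a lower terminate voltage.
crosses_early = loaded_voltage_mv(3500, 70, Z_MOHM) < 3300
```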

  • Thanks for the answer!

    I am starting another test with the standard "Cell Terminate Voltage" = 2000 mV. I calibrated the parameters using the EV2400 and sent the NEW_BATTERY() command. I also added the pauses you mentioned above and removed the relaxation pause.

    Now the algorithm is as follows (the changed steps are the new waits at 3.4 and 3.7):

    3.1) Stop all consumption.
    3.2) Enable GE pin.
    3.3) Read battery status.
    3.4) Wait 2 seconds.
    3.5) Send GAUGE_START command.
    3.6) Wait for GA bit.
    3.7) Wait 0.1 seconds.
    3.8) Maximum consumption within 2 seconds (70mA).
    3.9) Send GAUGE_STOP command.
    3.10) Minimum consumption within 20 seconds.
    3.11) Wait for ALERT to go low due to the G_DONE.
    3.12-3.31) Cycle:
        - delay 1 second;
        - read parameter value;
    3.32) Disable GE pin.
    3.33) Set the consumption to 5.2mA for 2 hours.

    At first everything went by the book and the SOH value looked very close to the truth. But then my device rebooted for unrelated reasons and sent the NEW_BATTERY() command again... I will have to restart the test with new batteries.

    before_test.gg.csv

  •     Before the row logged at 02.12.2023, 20:06, the SOH decreased as expected. If the jump at that row was caused by the reset, then there seem to be no anomalies in the SOH.