
BQ78350-R1: SOC Estimation after Reset is not accurate

Part Number: BQ78350-R1
Other Parts Discussed in Thread: BQ78350, GPCCHEM, BQSTUDIO, BQ78Z100, GPCRB

Hello,

We use the bq78350-R1 in one of our new products. We performed the full optimization method as described in SLUUB45A. During our end-of-line test for the pack we check for a reasonable SOC/RC before sealing, and found that the estimation after a reset is not what we expect. The cells are at roughly 3.6 V (30% state of charge) when the pack is assembled, yet the SOC is estimated at 5%.

During a full cycle the SOC converges to an accurate value (MaxError goes from 100% to 2%). The moment we reset the chip or shut it down and reactivate it over SMBus, MaxError jumps back to 100% and a new estimated SOC appears. We tried resets at different voltages and cannot find a plausible explanation for why the estimation is that inaccurate. For example, at 89% state of charge the chip estimates around 80%, and at 75% we get an estimate of 66%. The error does not appear to be linear.
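For reference, the MaxError()/RelativeStateOfCharge() readings described above can be polled over SMBus using the standard Smart Battery (SBS) command codes. A minimal Python sketch, assuming the smbus2 package and the standard smart-battery slave address 0x0B (the helper names are our own, not TI's):

```python
SBS_ADDR = 0x0B        # standard smart-battery slave address
MAX_ERROR = 0x0C       # MaxError(), percent (standard SBS command)
RELATIVE_SOC = 0x0D    # RelativeStateOfCharge(), percent

def read_sbs_word(bus, cmd):
    """Read a 16-bit little-endian SBS word (needs smbus2 and real hardware)."""
    return bus.read_word_data(SBS_ADDR, cmd)

def is_reset_estimate(max_error_pct):
    """After a reset the gauge reports MaxError() = 100% to flag that the
    SOC is a voltage-based estimate rather than a learned value."""
    return max_error_pct == 100
```

Polling both words around a reset makes the jump from 2% back to 100% MaxError easy to log.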

After reading the documentation for the chip, we know that the estimation and MaxError behaviour is expected. We just did not expect the estimates to be this inaccurate (up to 25% off). The CEDV_GPC tool also created an OCV11 file with SOCs for corresponding voltages (as stated in other threads, these values are for debugging purposes only). The problem is that if the chip estimated based on the voltage values in that file, the estimated SOCs should be fine. Apparently the chip does not use those points, even though we used the exact CEDV configuration values we got from the CEDV_GPC tool.

We are not able to deactivate the shutdown/reset events in our application, so we have to achieve reasonably decent accuracy for the estimated SOCs.

I attached a zip file with our initial logs and the exact data we used with the CEDV_GPC tool, as well as the report from that tool. Additionally, I included the firmware we use (*.srec and gg.csv) and three logs in which the change in MaxError/SOC can be seen:

Fehlerlog1.log  =>  At 11.04.2018 15:56:36 a reset was done; the SOC changes from 89% to 81% and MaxError from 2% to 100%

Fehlerlog2.log  =>  contains a full cycle to get back to accurate values, with a final SOC of 75% (after that we went into shutdown)

after Reset.log  =>  the log some time after the shutdown, with an estimated 66% instead of the expected 75%

I hope I have detailed our problem well enough to get a fast reply and suggestions on how we can increase the accuracy of the estimated SOC.

Sincerely, 

Robert Andris

bq78350-R1 Firmware, CEDV Results, Logfiles.zip

  • Hi Robert,
    One of our AEs is already looking into this issue following an email from Sebastian.

    Regards,
    David
  • Hi Robert,
    At reset or boot, the capacity is estimated using the chemistry data loaded into the gauge and the cell voltage. The gauge will attempt to underestimate the state of charge so the system does not expect more capacity than is available. The error is set to 100% so that the system knows it is an estimate.
    If the wrong chemistry for the cells is loaded, the estimate will be wrong.
    If the battery was idle and the cells are at rest, the estimate may be good. If the battery was in use, either charging or discharging, the estimate may be high or low since the cells are not relaxed.
    Also, the gauge's estimate assumes the full capacity of the selected chemistry. If the voltage range is reduced to extend the cycle life of the pack, the estimate will not be as good.
  • Hi,

    I do understand why MaxError goes from 2% to 100%, as it is written in the technical documentation. It is just the inaccuracy that is my problem. The chemID contains the OCV tables, and I created the firmware based on the chemID database because the cell we use is already available in TI's base library.

    Normally I do a chemID matching despite the cell already being in TI's database. In this case, however, at the very bottom of the description for the GPCCHEM tool there is a note that "this Tool is not applicable to CEDV Gauges". I have not found any other chemID selection tool that is suitable, and since the separate CEDV tool outputs the OCV11 file, I assumed CEDV devices were accurate enough. Furthermore, since the estimation after reset was not an issue at first (the customer changed the requirement during the final stages) and all my initial testing was accurate, I did not investigate further.

    So if the "difference" is related to the chemID, what is the recommended way to match the chemID? Or, even better, is there a tool for this gauge to create the chemID data from scratch?

    Underestimating by a few percent, as described in other forum threads, should not produce the behaviour we see in our application. At 3.6 V, guessing 5% instead of 30% is not an "underestimate" and has to be improved significantly.

  • Hi,

    Can I get an update on the current progress?

    Thanks in advance

  • Hi Robert,

    The bq78350-R1 does use the full OCV curve from the chemID, not the handful of points from the OCV11 file.  Matching can be performed the same way as for IT gauges using a C/10 discharge and GPCCHEM.  Or, as you mentioned, you can find your cell in the list and use that chemID.

    First, try this experiment: Use bqStudio and the Chemistry plugin to program your chemID into one of your production packs and reset it.  I expect you will get a more accurate SOC estimation as it sounds like you are using the default OCV table in the gauge.

    Are you programming with a .gg.csv file or a .srec file?  

    If you are using a .gg.csv file to program in production then you are not programming the OCV table since it's in private/static dataflash.  You need to program the chemID using bqStudio as well as any other settings (importing a .gg.csv file will work) and then extract a .srec file which will include the private/static OCV table.  That .srec file can be used in production (although you can just program the top portion which represents the dataflash if you don't need to upgrade the firmware).

    Forgive me if I'm making incorrect assumptions about your process, but hopefully this gives you some ideas.  Let us know if it works or doesn't, please!

  • Hi Max,

    Even though we "selected" chemID 0278, I read the actual chemID through ManufacturerBlockAccess() and, to my shock, got chemID 1210 as the result. I promptly checked when the chemID was changed. Apparently there was a firmware change in which the usual method was not used; instead the *.gg was imported, resulting in a wrong chemID in the very early stages of development. Since the chip only uses the chemID for the reset estimation, the accuracy was fine in every test performed further down the line.
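For anyone wanting to script the same chemID check, a sketch of the SMBus transaction: ManufacturerBlockAccess() is SBS block command 0x44, and the 0x0006 Chemical ID subcommand matches other TI gauges but should be verified against the bq78350-R1 TRM; the helper names here are our own.

```python
MANUFACTURER_BLOCK_ACCESS = 0x44   # SBS block-access command
MAC_CHEM_ID = 0x0006               # Chemical ID subcommand (verify in the TRM)

def mac_request(subcmd):
    """Little-endian payload for the ManufacturerBlockAccess() block write."""
    return [subcmd & 0xFF, (subcmd >> 8) & 0xFF]

def parse_chem_id(block):
    """Decode the block read: the response echoes the subcommand word,
    followed by the data word. TI chemIDs are conventionally printed as
    four hex digits (e.g. "0278", "1210")."""
    if block[0] | (block[1] << 8) != MAC_CHEM_ID:
        raise ValueError("unexpected subcommand echo in MAC response")
    return "%04X" % (block[2] | (block[3] << 8))
```

Adding such a read to the end-of-line test would have caught the wrong chemID before sealing.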

    I have now programmed the correct chemID "again", and it did indeed differ quite a bit in estimated SoC. Right now I am charging our test pack to 3.3 V/cell so I can check the accuracy with the selected chemID at lower states of charge.

    Your assumption about our process was not correct in that most of the products we manufacture use *.srec programming, exactly for the reasons you mentioned, which makes it all the more embarrassing that this slipped through. Would it be possible to add the chemID as a read-only value in the data memory so it is easier to keep track of? This would be helpful for all of your gauges that use a chemID.

    For the matching process, you might want to add a clarification on the GPCCHEM webpage that even though this tool is not "applicable" to CEDV gauges, you can still use it to get the chemID for the initial OCV estimation.

    I will keep you updated on the matter.

  • Hi Max,

    After we found a wrong ID to be the suspected cause of the issue, I checked whether a better ID match was possible, so I used the suggested GPCCHEM tool. The result was that chemID 2129 would be a better match (1.23% max DOD error and 0.81% max R deviation) than our first selected ID 0278, although 0278 is also in the list with deviations below 3%. To be exact, ID 0278 is at 1.57% max DOD error with 0.66% max R deviation, as taken from the GPC output.

    So I tried the difference, and I am very surprised about the way this estimation, and the gauging in general, really works. I tested the estimation by selecting the chemistry and updating from the database; after each update I reset the chip to force an estimation of SoC. I got the following results at 22.465 V (all with 2524 mAh FCC learned by a full cycle with ID 0278), after a minimum of 1 h in a relaxed state:

    ChemID 2129 => 14% RSOC , 14% ASOC  (LGC INR18650D2 3050mAh)

    ChemID 0278 => 9% RSOC, 9% ASOC (Sanyo NCR18650PD 2680mAh)

    ChemID 1210 => 7% RSOC , 6% ASOC (Sony Laminate 3650mAh)

    After I got these results I was a bit shocked and started to check the data myself. For the cycle I sent to the GPCCHEM tool, I calculated 2513 mAh of discharged capacity between the two relax periods. The resulting SOC at the mentioned 22.465 V leaves me with 2.5% SoC (discharge current -271 mA). Yes, there will be relaxation, but even if I add the maximum relaxation difference I measured at the cut-off (0.7 V), I end up at 23.165 V, resulting in around 7% SoC. The temperature of the pack at those voltages was around 25°C.
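As a sanity check on the arithmetic above (a sketch; the ~2577 mAh total cycle capacity is our own back-calculation from the quoted 2513 mAh / 2.5% figures, not a value taken from the logs):

```python
# Coulomb-count cross-check of the quoted numbers.
discharged_mah = 2513     # counted between the two relax periods
total_cycle_mah = 2577    # hypothetical: back-calculated so 2513 mAh leaves ~2.5%

soc_pct = (1 - discharged_mah / total_cycle_mah) * 100
print("remaining SoC: %.1f%%" % soc_pct)   # prints "remaining SoC: 2.5%"
```

This is the coulomb-counted figure the chemID estimates (7-14%) are being compared against.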

    So my big Questions are:

    • What exactly is in the chemistry data, and how can I get a reliable estimation? GPCCHEM tells me IDs 2129 and 0278 are close in chemistry, yet they still give me a difference of 5% in both SOCs when nothing else has changed. Are the private/static OCV tables really fixed, or are there other formulas involved in the estimation? Even when I try to relate it to the possible capacity, it does not make any sense to me.
    • Are there any "hidden" values, for example like the Rb tables, that are not documented and could influence the estimation?
    • Is there a way to "create" the chemistry file for this device derived from the actual pack data? (Like GPCRB, which we used for the bq78Z100 in another project)

    I have attached the input and result for GPCCHEM as well as my analysis, with both mentioned data points marked in the *.xlsx file.

    I hope to hear from you soon

    best regards

    Robert

    GPCCHEM Result,Analysis.zip

  • Hi Robert,
    The voltage-based capacity estimation after a reset is not expected to be highly accurate: due to the flatness of the Li-ion curve, a small delta in measured voltage can result in a fairly large delta in the capacity estimate. Your range of 6-14% initial RSOC sounds fairly reasonable. This gauge relies heavily on coulomb counting and not so much on voltage measurements.

    We have so many chemIDs in our database that many are quite similar; it is normal to get several matches for your cell, and anything below 2-3% max DOD error in the GPC report will likely give you very similar results in the real world. GPCRB will only modify the resistance portion of the chemID, not the OCV, and it is the OCV that you are using when you reset the gauge.
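The flatness argument can be illustrated with a toy OCV table (illustrative numbers only, not real chemistry data): on a segment where the curve slope is about 5 mV per 1% SOC, a 25 mV measurement or relaxation offset already shifts the estimate by 5%.

```python
# Toy OCV points for one Li-ion cell (illustrative values, not real data).
OCV_TABLE = [(3.45, 20.0), (3.50, 30.0), (3.55, 40.0)]  # (volts, SOC %)

def soc_from_voltage(v, table=OCV_TABLE):
    """Linearly interpolate SOC from an open-circuit voltage."""
    for (v0, s0), (v1, s1) in zip(table, table[1:]):
        if v0 <= v <= v1:
            return s0 + (s1 - s0) * (v - v0) / (v1 - v0)
    raise ValueError("voltage outside table range")

# On this flat segment, a 25 mV offset moves the estimate by 5% SOC:
delta = soc_from_voltage(3.525) - soc_from_voltage(3.500)
```

This is why chemIDs that GPCCHEM rates as near-identical can still disagree by several percent in the post-reset estimate.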
  • Hi Max,

    With this information, to my current understanding, we can either stay always active, with accurate gauging but increased power consumption, or shut off into deep sleep (auto-shipment) and accept a loss in accuracy when the pack is reactivated. Is this correct?

    And is it possible to implement a custom firmware so that the estimation is deactivated and we keep the last RSOC?

  • Hi Robert,
    The bq78350 does not have a feature to use the last RSOC on startup. It is an interesting suggestion. Please reach out to your TI sales representatives about the possibility of a custom firmware.