BQ34Z100-G1: Charge efficiency lead acid PbA

Part Number: BQ34Z100-G1

SLUSBZ5B (revised July 2016) explains that the bq34z100-G1 now has flash settings to allow for the fact that charge efficiency in lead acid batteries decreases with state of charge and is influenced by temperature. However, there appear to be many errors in the coverage of this subject throughout SLUSBZ5B. Also, the values provided for charge efficiency calculations in this document (and probably also stored as defaults in the device's flash memory) are different from those popularly said to apply to real batteries. At full charge and low temperature they might work fairly well, but probably less so in other conditions. Users therefore may wish to change the stored values, and it would be easier for some of us to do this if conversion factors were provided, perhaps in the same form used for table 12 on p29 of SLUSBZ5B.

My guess is that Pb Temperature Compensation (subclass 34, offset starting at 5) would be converted from the DF value using the formula 1280 * DF. If so, to convert from temperature increase in K or C (above 25 C) to charge efficiency reduction in something (maybe 0.01% per C), the figure is divided by 1280 and then converted to Xemics floating point form before being saved in flash. Similarly, my guess is that Pb Reduction Rate (subclass 34, offset starting at 10) would be converted from the DF value using the formula 80 * DF. So, to convert from state of charge increase (above Pb Drop Off Percent) to charge efficiency reduction in something (maybe 0.1% per %), the figure is divided by 80 and then converted to Xemics floating point form before being saved in flash. (All these figures are decimal.) However, these guesses are probably completely wrong, so I would welcome correction.

Whatever information can be provided would be useful in any form, so if someone could just say what the default (or stated maximum) flash contents represent and in what units (0.1% or 0.01% or whatever) then it should be possible to sort out at this end how to make any necessary changes. Thanks for any help you can give. All best, Tim.
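In case it helps anyone reproduce the arithmetic, here is a minimal sketch of the guessed conversion described above. The 1280 and 80 scale factors are only my guesses (they may well be wrong), and the example input figures are made up purely for illustration; the result would then still need encoding as a Xemics float (see the sketch after the SLVA148A link below) before being written to flash.

    # Guessed scale factors from the post above -- NOT confirmed against TI documentation.
    PB_TEMP_COMP_SCALE = 1280.0      # Pb Temperature Compensation (subclass 34, offset 5)
    PB_REDUCTION_RATE_SCALE = 80.0   # Pb Reduction Rate (subclass 34, offset 10)

    def physical_to_df(desired_figure, scale):
        """Per the guess above: the value to encode is the desired figure divided by the scale factor."""
        return desired_figure / scale

    # Made-up example figures, purely to illustrate the arithmetic:
    temp_comp_df      = physical_to_df(0.05, PB_TEMP_COMP_SCALE)       # some per-degree-C reduction figure
    reduction_rate_df = physical_to_df(2.0, PB_REDUCTION_RATE_SCALE)   # some per-%-SOC reduction figure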

For other users (especially those not using the IT evaluation hardware and software), conversion to Xemics floating point is explained in How EVSW Display the Raw Data V1.03.pdf

e2e.ti.com/.../How-EVSW-Display-the-Raw-Data-V1.03.pdf

There is a VB script for the conversion on page 4 of slva148a.pdf

www.ti.com/.../slva148a.pdf, but the flow diagram on page 8 of SLUA640 should not be trusted.
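For anyone who would rather script the conversion than retype the VB listing, here is a rough Python rendering along the lines of that script (a sketch only, with the byte layout as I understand it; check it against slva148a.pdf before trusting it for real flash writes):

    import math

    # Assumed layout: byte0 = exponent + 128, bytes 1-3 = 24-bit mantissa with an
    # implied leading 1, and the top bit of byte1 doubling as the sign flag (0 = positive).

    def xemics_to_float(raw):
        """Decode a 4-byte data flash value (as a big-endian integer) to a float."""
        exp_byte = (raw >> 24) & 0xFF
        mid_hi   = (raw >> 16) & 0xFF
        mid_lo   = (raw >> 8) & 0xFF
        lsb      = raw & 0xFF
        negative = (mid_hi & 0x80) != 0
        mantissa = ((mid_hi | 0x80) << 16) + (mid_lo << 8) + lsb   # restore the implied 1
        value = mantissa * 2.0 ** (exp_byte - 128 - 24)
        return -value if negative else value

    def float_to_xemics(value):
        """Encode a float as a 4-byte data flash value (as a big-endian integer)."""
        if value == 0:
            value = 1e-5                       # guard against log(0)
        negative = value < 0
        value = abs(value)
        exp = int(math.floor(math.log2(value))) + 1
        mantissa = int(value / 2.0 ** (exp - 24))   # 24-bit mantissa, truncated
        b1 = (exp + 128) & 0xFF
        b2 = (mantissa >> 16) & 0xFF
        b3 = (mantissa >> 8) & 0xFF
        b4 = mantissa & 0xFF
        if not negative:
            b2 &= 0x7F                         # clear the sign bit for positive values
        return (b1 << 24) | (b2 << 16) | (b3 << 8) | b4

    # Round-trip check with this code: 0.70 encodes to 0x80333333 and decodes back to ~0.70.
    print(hex(float_to_xemics(0.70)), xemics_to_float(0x80333333))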

  • Thanks Tim! We will review this and make any needed changes.
  • That would be great. Thanks. Also, as rate of charge is said to influence charge efficiency to some extent, maybe the facility to take account of that could be built into some future firmware and documentation update. Feel free to ask me (maybe by email) if you'd like me to comment on current or proposed mentions of charge efficiency in SLUSBZ5B. All best, Tim.
  • …but obviously, any future setting for the degree to which charge rate affects charge efficiency will need to be centred on whatever the current default for that unused flash location is (presumably zero), so that the default represents some fairly ordinary effect of charge rate on efficiency. Otherwise, users of future chips who modify only the old settings because they don't know about any new one(s) will get unfortunate results (and vice versa)!

  • I have now got around to looking at the default settings related to change in PbA charge efficiency with state of charge (SOC) and with temperature, as stored in the BQ34Z100-G1 flash by TI at the time of chip manufacture. I see the stored values are so extreme that these functions cannot really be working. Since we users do not have the information to correct this, for the time being we will have to give up on any dynamic control of calculated charge efficiency. Instead, we'll have to put up with some static value intended to suit the average SOC and average temperature the battery will experience in use.

    I reckon most people can just disregard the effect of temperature on charge efficiency, because it is said to be small in PbA batteries. However, the effect of SOC on PbA charge efficiency is pretty hefty. It is said that lead acid batteries should not be routinely discharged below about 60% state of charge (and ideally higher). Over that allowable range between 60% SOC and nearly 100% SOC, the average charge efficiency might be coarsely assumed to be about 70% (half way between 100% at 60% SOC and 40% when completely full). If the battery is never to be discharged more than a few percent below full (for example, if it is intended only for emergency use) then as low as 40% charge efficiency might be a better assumption; a rough sketch of this averaging is below.

    For the BQ34Z100-G1, subclass 34 offset 4 looks like the right place in flash to put whatever value you choose. This will have to do until TI tell us how to set up the proper charge efficiency settings. (However, the implications of the current literature are that the chip's methods for dynamically calculating change of charge efficiency with temperature do not agree with formulae suggested by other sources, suggesting to me that fixing that particular minor issue might eventually need changes to firmware as well as to settings.)
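    A minimal sketch of that rough averaging, assuming charge efficiency falls linearly from 100% at the drop-off SOC to 40% at full (those endpoint figures are my coarse assumptions above, not TI data):

        def average_charge_efficiency(window_low, window_high,
                                      drop_off_soc=60.0, eff_at_full=0.40):
            """Assume efficiency is 100% at drop_off_soc and falls linearly to eff_at_full
            at 100% SOC; return its mean over the usage window [window_low, window_high].
            Only meaningful for windows at or above drop_off_soc."""
            def eff(soc):
                return 1.0 + (eff_at_full - 1.0) * (soc - drop_off_soc) / (100.0 - drop_off_soc)
            # Efficiency is linear across the window, so the mean is the endpoint average.
            return (eff(window_low) + eff(window_high)) / 2.0

        print(average_charge_efficiency(60, 100))   # 0.70 -> the ~70% figure above
        print(average_charge_efficiency(95, 100))   # ~0.44 -> close to the 40% near-full-only case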

  • Does anyone know whether Texas Instruments have done anything to deal with this PbA charge efficiency problem in the BQ34Z100-G1 and associated literature yet?
  • I am working on a NiMH design, and I also need accurate information to correctly calculate the values for Charge Efficiency.

  • Hi Larry,
    You can evaluate the charge efficiency values by coulomb counting the charge into and then out of the battery (a rough sketch of that calculation is below).

    thanks
    Onyx
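
    A minimal sketch of that coulomb-counting check, assuming you can log current samples at a fixed period during a full charge and the following full discharge (the function names and the 1 s sample period are only illustrative):

        def amp_hours(current_samples_a, sample_period_s=1.0):
            # Simple rectangular integration: sum of amps times seconds, converted to Ah.
            return sum(current_samples_a) * sample_period_s / 3600.0

        def charge_efficiency(charge_currents_a, discharge_currents_a, sample_period_s=1.0):
            # Ratio of charge recovered on discharge to charge delivered during charging.
            ah_in = amp_hours(charge_currents_a, sample_period_s)
            ah_out = amp_hours(discharge_currents_a, sample_period_s)
            return ah_out / ah_in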