
Fuel Gauge for LiFePO4

Other Parts Discussed in Thread: BQ40Z50-R1, BQ40Z60, BQ76920, BQ34Z100-G1, BQ78350-R1

I am looking for a solution to measure SoC and voltage of a battery pack that is made of three in-series connected LiFePO4 cells. Charging functionality is not required. I expect to have over-current, over-temperature, over-voltage, short-circuit and under-voltage protections as well as cell balancing feature implemented. A solution should be able to cope with short high current spikes and extreme ambient temperature variations.

I have a few questions:

  1. I have found the following ICs that are supposed to fit my requirements. What are the advantages/disadvantages of each, i.e., how should I choose between them? Are there any other ICs that I have missed?
    1. BQ40Z50-R1
    2. BQ40Z60
    3. BQ34Z100-G1 + BQ76920
  2. I will be using A123’s ANR26650M1-B cells. Battery Management Studio lists three chemistry options for that particular cell: “26650M1B (2500mAh)”, “ANR26650M1-B (2500mAh)” and “ANR26650M1-B Consult TI before use (2500mAh)”. How should I choose the right one?
  • Jonas
    Any of those gauge options will work with LiFePO4 cells. I would probably use either the bq40z50-R1 or the bq40z60. Their gauging algorithms have some improved features to support LiFePO4 cells, and they are single-chip solutions. Since a charger is not required, the bq40z50-R1 will be the best option. The other option would be the bq78350-R1 with the bq76920; this would be a CEDV gauging solution instead of Impedance Track. If you can provide more detail on your usage profile and discharge rate, then we can fine-tune the choice.

    ChemID 440 is probably the best chemistry option. ChemID 453 was a special characterization that limited the lowest discharge voltage to 3V.
    Tom
  • Thank you for the prompt response. I agree that a single-chip solution is a better option. Correct me if I am wrong, but CEDV cannot offer the same level of SoC accuracy and certainty that IT does. That leaves two options: BQ40Z50-R1 and BQ40Z60. I was under the impression that both ICs offer charging capability and hence was wondering what makes one better/more suitable than the other. I guess I was confused by the "Sophisticated Charge Algorithms" feature listed on the first page of the BQ40Z50-R1 datasheet. Do I understand correctly that BQ40Z60 = BQ40Z50-R1 + charging capability?

    Expected average discharge rate = 0.5-0.75A, bursts <= 24A @ 250ms (max). The device will operate for short periods of time, mainly in extremely cold environments.
  • The bq40z60 uses basically the same gauging algorithm as the bq40z50-R1 and adds an integrated charger. Your discharge profile is going to be a challenge, so I recommend trying the gauge to make sure that the performance will be acceptable. Both gauges accumulate charge continuously and integrate it every 250ms. The four 250ms values are averaged and reported as Current once per second. Also, you are going to find that the gauges will downgrade the capacity when operating at cold temperatures, and the capacity estimates will increase as the temperature returns to normal. It may take several hours for the capacity adjustments to complete.
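    To make the reporting cadence concrete, here is a minimal sketch of the scheme described above: charge is accumulated continuously, a conversion completes every 250 ms, and the four most recent 250 ms values are averaged into the once-per-second Current reading. The function and variable names are illustrative, not TI register names.

    ```python
    # Sketch: four 250 ms coulomb-counter conversions averaged into the
    # 1 s Current value, as the gauge is described to do above.

    def report_current(cc_conversions_mA):
        """Average four consecutive 250 ms conversions (mA) into the
        1 s Current reading."""
        if len(cc_conversions_mA) != 4:
            raise ValueError("expected four 250 ms conversions per 1 s update")
        return sum(cc_conversions_mA) / 4.0

    # A 24 A burst occupying one 250 ms slot on an otherwise 600 mA load
    # shows how a short spike is captured but diluted in the 1 s average:
    window = [600.0, 24000.0, 600.0, 600.0]   # mA, one value per 250 ms slot
    print(report_current(window))             # 6450.0 mA for that second
    ```

    The point for the 24 A / 250 ms burst profile: the spike's charge is fully accumulated, but the reported Current smooths it over the one-second window.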
  • Can you please clarify:

    1. How many times is current sampled within one second? Common sense and some replies within this community (like this or this) suggest that ICs like these cannot be sampling only four times a second (every 250 ms). I expect them to take many thousands of readings per second, then average them and update the result every 250 ms or so. The update rate is not as important if enough samples are captured. The datasheet only says "data updates in 250-ms intervals", but it does not explicitly state the sampling rate. This is a very important figure; could you please let us know what the sampling rate is?

    2. Let's say the ambient temperature changes from +25C to -25C within a couple of minutes. Will it take several hours before the fuel gauge can give at least an approximate capacity of the battery pack? If that is the case, then it is unacceptable. I need to know whether my solution will be able to complete its task before it starts, and I can only do that if the reported capacity is more or less accurate every single time I read it. The fuel gauge constantly monitors the ambient temperature, so why would it take a long time to adjust capacity when the temperature fluctuates?

  • The CC filter uses the LFO clock and the conversion occurs every 250ms.

    The gauge will run an FCC simulation at the Ra grid points during discharge, and it will update FCC and RM to compensate for temperature changes and other factors. The Ra grid points occur approximately every 11% of SOC. When at rest, the gauge will run an FCC simulation every 5 hours and update FCC and RM to compensate for temperature changes.
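    The update cadence described above can be sketched as follows. This is an illustrative simplification, not the actual Impedance Track implementation: FCC/RM are recomputed when a discharge crosses an Ra "grid point" (taken here as exactly every 11% SOC) and, at rest, on a 5-hour timer.

    ```python
    # Sketch: detecting when a discharge crosses an Ra grid point, which is
    # when the gauge would rerun its FCC simulation. The fixed 11% spacing
    # is an assumption taken from the approximate figure quoted above.

    GRID_STEP_PCT = 11  # approximate spacing of Ra grid points, % SOC

    def crossed_grid_point(prev_soc_pct, new_soc_pct):
        """True if discharging from prev_soc to new_soc crossed a grid point,
        i.e. moved into a different 11%-wide SOC band."""
        return int(prev_soc_pct // GRID_STEP_PCT) != int(new_soc_pct // GRID_STEP_PCT)

    print(crossed_grid_point(90.0, 89.5))  # False: same 11% band, no update
    print(crossed_grid_point(89.0, 87.5))  # True: crossed the 88% boundary
    ```

    This is why a sudden temperature drop is not reflected instantly: during discharge the next correction arrives at the next grid-point crossing, and at rest only on the 5-hour simulation cycle.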
  • Hi Jonas,

    Regarding the current sampling, that is true - the internal data converter actually samples the signal at tens of ksps and higher, but the result is only updated at the much lower rate, after digital lowpass filtering.
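    A minimal sketch of that oversample-then-decimate idea, with assumed numbers (a 32768 Hz sample rate standing in for "tens of ksps", and a plain boxcar average standing in for the real digital lowpass filter):

    ```python
    # Sketch: raw ADC samples at tens of ksps are reduced to one reported
    # value per 250 ms. Fast current spikes land inside the average rather
    # than being missed between 250 ms updates. All numbers are assumptions.

    SAMPLE_RATE_HZ = 32768                      # illustrative ADC rate
    UPDATE_PERIOD_S = 0.25                      # one reported value per 250 ms
    N = int(SAMPLE_RATE_HZ * UPDATE_PERIOD_S)   # samples per update: 8192

    def decimate_250ms(samples_mA):
        """Reduce one 250 ms window of raw samples to a single reading
        (boxcar average in place of the real lowpass filter)."""
        assert len(samples_mA) == N
        return sum(samples_mA) / N

    # A 10 ms, 24 A spike on a 600 mA load is fully accounted for in the
    # window's average even though it is far shorter than 250 ms:
    spike_samples = int(SAMPLE_RATE_HZ * 0.010)
    window = [24000.0] * spike_samples + [600.0] * (N - spike_samples)
    print(round(decimate_250ms(window)))  # about 1534 mA for that window
    ```

    The takeaway for the 24 A burst profile: the sampling rate, not the 250 ms update rate, determines whether a spike's charge is counted.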

    Terry