
BQ40Z60 - Fuse triggering even with FUSE_EN set to 0

Other Parts Discussed in Thread: BQ40Z60

Hi

I'm using the BQ40Z60 evaluation board to charge, protect and fuel-gauge a 4-cell stack of 1750 mAh lithium cells.

I am having trouble with fast charge currents. With the charge current set to around 500 mA the device performs as it should. However, when I increase this to around 1000 mA the charger oscillates between charging and asserting the fuse (and therefore turning the charge and discharge FETs off).


The strange thing is that no safety alerts are triggered, so the problem is proving difficult to track down. The image below shows the bit registers just as the charger starts to charge again, but as you can see FUSE is also asserted (it occurs either at the same time or about 0.5 seconds after charging starts). This is also strange behaviour, since FUSE_EN is low, which should mean the fuse is never asserted.


The next image shows the bit registers just before it starts to try and charge again.

Does anyone know what the problem could be? It may be that I've been staring at the green and red blocks too long to notice any problems - any help is much appreciated.

Kind regards

Ryan

  • Ryan,
Are you increasing the current limit in the data flash, and then the charger cycles on and off? Can your power supply provide enough current to support the charger without hitting its current limit?

    Tom
  • Hi Tom,

    Thanks for the reply. The power supply is a lab unit set to constant voltage mode, with the current limit being sufficiently higher than the requirements of the charger.

Yes, exactly - it is oscillating between charging and shutting off. I am increasing the charging current in the Advanced Charge Algorithm settings (for "Rec Temp Charging", since that's the temperature range I am operating in). The Max Current Register is set to 77, as the charger that will be used can provide about 3 A, so Max Current Register = I_limit × R_chg × 2550 = 3 × 0.01 × 2550 = 76.5 ≈ 77.

    Kind regards
    Ryan
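As a quick sanity check, the register arithmetic above can be reproduced in a short script. The helper name and the round-half-up convention are my own assumptions; the TRM defines the exact encoding.

```python
# Sanity check of the Max Current Register calculation above.
# Register value = I_limit * R_chg * 2550, with I_limit in amps and
# R_chg (the charge sense resistor) in ohms. Round-half-up is assumed.
def max_current_register(i_limit_a: float, r_chg_ohm: float) -> int:
    return int(i_limit_a * 2550 * r_chg_ohm + 0.5)

print(max_current_register(3.0, 0.01))  # 3 * 0.01 * 2550 = 76.5, rounds up to 77
```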
  • Ryan
    Can you export a gg.csv file for me to review? Also, provide log data showing the cycling, if you can.

    Tom
  • Hi Tom,

Sure thing - please see attached. There is one log file showing the cycling with its associated gg.csv, and one set for non-cycling operation. Let me know if you need anything else.


    Thanks a lot for your help!

    Kind regards

    Ryan

    BQ40Z60_Data.zip

  • Ryan,
    It looks like the cell voltage is tripping the 2nd level protection. Try placing a short across C10 to disable it and see if that helps.
    Tom
  • Hi Tom,


Nailed it - I completely overlooked the secondary protection, since the circuit I've designed doesn't incorporate it. It seems so obvious now, as that also explains why no safety alerts were triggering.


I have a related question with regard to the top and bottom cell voltages, i.e. cell 1 and cell 4 in a four-cell stack. When charging, cell 1 and cell 4 read at a higher voltage than cells 2/3; when discharging, they measure lower than cells 2/3. This can cause the fuse or the over-voltage protection to trip even at low charge currents.


    I think this is due to a large series resistance on the cells as in the following diagram:

I've read that this can be due to the nickel tabs on the cells, as nickel has a much higher resistivity than copper (I've omitted the resistance on the other cells, as it doesn't seem to affect the voltages). Or it could be due to the cells' internal series resistance. Anyhow, here is a cell voltage graph when charging at only 500 mA (I'm using 1750 mAh cells); the voltage difference is much higher when charging closer to 1C.

Since I cannot set different cell voltage trip points (a bad solution anyway, since different charging currents would affect it), I'm assuming I need to do some sort of calibration to account for this resistance, allowing a more accurate estimate of the actual cell voltage from the measured cell voltage I can read.


    Any pointers in the right direction would be greatly appreciated, and thank you so much for looking into my earlier problem - it was driving me crazy.

    Many thanks

    Ryan
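If the error really is a fixed stray resistance, one way to estimate it is from two (current, measured voltage) samples taken close together in time, so the underlying cell voltage is roughly the same for both. This is a hypothetical sketch, not a BQ40Z60 feature; the function name and the sample values are illustrative only.

```python
# Hypothetical sketch: estimate the stray series resistance on a cell tap
# from two (current, measured voltage) samples taken close enough together
# that the true cell voltage is effectively unchanged between them.
# Charge current is positive, discharge current negative.
def estimate_tap_resistance(i1_a: float, v1: float, i2_a: float, v2: float) -> float:
    return (v2 - v1) / (i2_a - i1_a)  # R ~ dV/dI

# e.g. cell 1 reads 4.25 V at 0.5 A charge and 4.45 V at 1.0 A charge:
r = estimate_tap_resistance(0.5, 4.25, 1.0, 4.45)
print(r)  # 0.2 V / 0.5 A = 0.4 ohm
```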

  • I've just gone back to my original notes and have now remembered that there is a voltage calibration section in Battery Management Studio. I'm pretty sure I did do this originally but I'm going to do it again since a lot has changed since then (the fuse and overvoltage problems distracted me from looking towards the simple solution).

    When carrying out the voltage calibration, what values does it change in Data Memory?
  • The Cell Gain, Bat Gain and Pack Gain calibration parameters.

Thanks. These do not bear any relationship to the charge or discharge current on the battery pack, so they do not fix the problem I highlighted above. See below another graph I have produced:

The voltage being read for Cell 1 is almost 4.6 V during high charge currents. Cell 4 suffers more when discharging. Am I right in saying that this is due to a series resistance? (see above). If not, any help explaining this would be greatly appreciated.

  • Can anybody help with the above?

  • Ryan,

The series resistance will cause the voltage to shift between charging and discharging. You can think of a battery as a capacitor with a series resistance. If the battery is unloaded, the voltage across the terminals will be close to the "capacitor" voltage, because there is no voltage drop across the series resistor. If you apply a CC/CV type charger, the voltage at the battery terminals will be the capacitor voltage plus the IR drop across the series resistor. The terminal voltage will increase as the capacitor charges until it reaches the charger voltage. The current will then start dropping as the capacitor continues to charge and the voltage drop across the series resistor falls. The same principle applies to discharging: the larger the discharge current, the larger the voltage drop across the series resistor. This lowers the terminal voltage and reduces the effective capacity of the battery.

    Tom
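The capacitor-plus-series-resistor picture described above can be sketched numerically. All values here (R, C, voltages, currents) are illustrative toy numbers, not BQ40Z60 parameters.

```python
# Toy CC/CV charge of a battery modelled as a capacitor C behind a series
# resistance R. Illustrative values only.
def simulate_charge(r=0.1, c=5000.0, v_charger=4.2, i_cc=1.0, dt=1.0, steps=20000):
    v_cap = 3.0  # initial "capacitor" (open-circuit) voltage
    for _ in range(steps):
        # CC phase: constant current until the terminal voltage would exceed
        # the charger voltage; then CV phase: current tapers as v_cap rises.
        i = min(i_cc, (v_charger - v_cap) / r)
        v_cap += i * dt / c
    v_term = v_cap + i * r  # terminal voltage includes the IR drop
    return v_cap, v_term
```

Early in the CC phase the terminal voltage sits i_cc × R above the capacitor voltage; by the end of the CV phase the current has tapered and both converge on the charger voltage.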

  • Hi Tom,


    Yes, that makes sense. Thanks for the detailed explanation.


    So my question off the back of that, is how do I adjust the BQ40Z60 firmware to account for the change in voltage as a function of charge/discharge current? i.e. where do I place the series resistance value?

    Basically, I want to allow high charge currents without worrying about the cell over-voltage tripping due to the series resistance (and I certainly don't want to increase the trip voltage setting to allow high charge currents).


    Thanks again for all of your help so far!

    Kind regards

    Ryan

The gauge and charger do not have an option to use a series resistance to adjust the charging voltage. The cells should be fairly well matched and should track one another as they charge. The gauge also has cell balancing to try to keep them balanced while charging. The bq40z60 is an NVDC charger, so it will step the voltage up as the cells charge.
I'm not saying the charge voltage should be adjusted based on the series resistance - in fact I don't have a problem with the full cell-stack voltage measurement. But the I*R voltage drop at the top and bottom of the cell stack causes an error in the measured voltage of cell 1 and cell 4 (it is only ever those two, and I've tested with lots of different battery packs), that error being directly proportional to the load current (charge or discharge). For example, if I increase the current to something high like 1C, the measured voltage on cell 1 increases significantly - sometimes to around 4.6 V.

The reason I care about this is that I know the cell isn't actually at 4.6 V; what I am reading is {V_cell + I*R}. I want to ensure that the over-voltage protection only trips when an actual over-voltage condition has occurred, i.e. {V_cell} is higher than the firmware setting. Otherwise, if I set the over-voltage protection to trigger at, say, 4.3 V, the charging will be fine until the capacity gets over around 50%, and then the firmware will cycle between a charge state and a COV safety alert state on cell 1 - giving me lots of charge bursts depending on the delay setting.

    To summarise, my over-arching question is: How do I charge at high currents - which are still below the recommended maximums - without triggering an over-voltage condition on cell 1?

    Please let me know if any of the above is unclear and I will try to give more detail.
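One software-side workaround, sketched under my own assumptions rather than any documented BQ40Z60 feature, is to compensate the reading for the known I*R error before comparing it against a trip threshold. Since the gauge firmware doesn't expose this, it would have to run on a host reading the gauge externally; the function names and values are hypothetical.

```python
# Hypothetical IR compensation of a measured cell voltage: subtract the
# I*R error before comparing against an over-voltage threshold.
# Charge current positive, discharge negative; r_ohm from a prior estimate.
def compensated_cell_voltage(v_measured: float, i_a: float, r_ohm: float) -> float:
    return v_measured - i_a * r_ohm

def over_voltage(v_measured: float, i_a: float, r_ohm: float, v_trip: float = 4.3) -> bool:
    return compensated_cell_voltage(v_measured, i_a, r_ohm) > v_trip

# 4.6 V measured at 1.75 A charge through ~0.2 ohm is really ~4.25 V:
print(over_voltage(4.6, 1.75, 0.2))  # False - no true over-voltage
```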

Maybe the wires from the cells to the PCB have a high resistance. Can you measure them?