
Questions and Issues with BQ27501 EVM

Other Parts Discussed in Thread: BQ27500, BQ27501, BQ27500-V130

Hey,

I am using the bq27500/1 EVM to, well, evaluate the fuel gauge.  My set-up includes a DC power supply and electronic load, both controllable through GPIB, as well as a second DC power supply, DAQ, sense resistor, and a 1500mAh cell phone battery.  I have had a great number of questions and issues that I have sorted through already, though these remain:

(bqEASY wizard questions)

  1. During the bqEASY set-up wizard on page 2A, "Cell Characteristics", there are two columns of data in which to enter values.  The first is for "PACK A" and the second for "PACK B".  I have yet to find any documentation explaining appropriate values to enter in the event that one has only a single type of battery to 'learn'.  For instance, I have a 1500mAh Li-ion cell phone battery, so for pack A I entered 1 (cells in parallel), 1500 (nominal capacity), and 3000 (min rated voltage).  I don't care about pack B, so I entered all zeros; I have no idea whether this is acceptable.

  2. On page 2H, "Miscellaneous Information", there are resistor ID values to be entered.  I have yet to find any documentation explicitly stating what this actually means.  I don't know the "value of Resistor ID A" (or B, if there were one).  The defaults are 200 ohms and 7500 ohms, though I believe when I performed the last learning cycle I simply entered zeros here as well.  What is a Resistor ID?  Is it appropriate for me to simply enter zeros?

  3. On page 4A, "Use Default Chemistry?", I'm unsure whether it's appropriate to simply use the default chemistry and move on.  The battery I'm using is branded Samsung, though manufactured by SDI (which I noticed in the chemistry list); however, none of the numbers in the list correspond to any of the numbers I see on the back of my battery ("EB575152YZ", "DPQ DC101209", "S/N: AA1ZC09xS/D-B").  I have been told by colleagues that most cell phone batteries are LiCoO2, so in the previous learning cycle I chose the default.

  4. On page 5A, "Learning Cycle", when I click the "All Done" button after completing the cycle I get an error regarding "Update Status1" not being 02.  I had already gone through a few learning cycle failures before, so in my most recent attempts I have made sure to watch the process like a hawk and ensure the bits are being 'cleared' and 'set' when appropriate and that certain values related to pack A (e.g. "Update Status0") are correct.  My most recent learning cycle, as I just mentioned, still failed because of "Update Status1".  To get around this I chose to manually enter 02 into the field; everything is A-OK after that.  I wanted to check that my action here is appropriate.  If not, how else do I achieve a successful learning cycle?





    (fuel gauging process questions)

    I should quickly describe my set-up before asking this question.  I have my GPIB-controllable power supply and load daisy-chained together, so they both clip to "Load+" and "Load-" (this hasn't been an issue).  I have my second power supply providing 2.5V to "2.5VIN" and "VSS" (the jumper is on external).  Junction 8 (J8) has a thermistor from "T" to "Pack-".  At junction 9 (J9) I have a wire from "Pack-" to the negative battery terminal, and another from "Pack+" through a 0.1ohm sense resistor to the positive battery terminal.  A DAQ measures the voltage across the battery terminals and across the sense resistor (used for current measurement).  I wrote some software to control the load and power supply according to the current and voltage reported by the DAQ.  All my data is recorded as well.
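    For reference, this is essentially how my logging software turns the sense-resistor reading into a current (the function name is just my own helper, not part of the EVM tools):

    ```python
    # Convert the DAQ reading across the sense resistor into pack current.
    R_SENSE = 0.1  # ohms, sense resistor between "Pack+" and the battery's + terminal

    def battery_current_amps(v_sense_volts: float) -> float:
        """Pack current from the voltage across the sense resistor (Ohm's law, I = V/R)."""
        return v_sense_volts / R_SENSE

    # e.g. 50 mV across the sense resistor corresponds to the 500 mA C/3 rate
    print(battery_current_amps(0.050))  # ~0.5 A
    ```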

  5. When I discharge a battery from fully charged to 3.0V at a C/3 rate (500mA), I notice that the battery voltage reported by the evaluation software (EVSW) is noticeably offset from the actual battery voltage.  For instance, when my program terminated the discharge at 3.000V, the EVSW reported 2.845V.  This is obviously not a trivial problem given the importance of the 3V termination point.  At low currents this offset is negligible, as one would expect, but certainly not at the currents I expect to use (>300mA).  Given that I expect this voltage offset is due to a small resistance of less than 0.3ohms (lines and sense resistor), I can't help but wonder if there is some essential criterion I overlooked, such as "the battery must be as close as humanly possible to the pack terminals".  Is this the case?  Otherwise, is there some field in which I can directly enter the value of the resistance between the pack terminals and the battery?  If there is, I have never heard it mentioned.
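      A quick sanity check on my own numbers (just the figures from this question, nothing else assumed) suggests the offset really is a plain IR drop:

      ```python
      # Does the 3.000 V (DAQ, at the battery) vs 2.845 V (EVSW, at the pack
      # terminals) discrepancy match a simple IR drop through the wiring plus
      # the 0.1 ohm sense resistor?
      i_load = 0.500      # A, the C/3 discharge current
      v_battery = 3.000   # V, measured by the DAQ at the battery terminals
      v_reported = 2.845  # V, reported by the EVSW

      offset = v_battery - v_reported   # 0.155 V
      r_implied = offset / i_load       # series resistance that would explain it
      print(f"implied series resistance: {r_implied:.2f} ohm")  # ~0.31 ohm
      ```

      So the measured offset corresponds to roughly the 0.3 ohm of line-plus-sense-resistor resistance I estimated above.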

  6. What is the explicit difference between Nominal Available Capacity and Remaining Capacity?  I know that one is "compensated" and the other is "uncompensated at C/20", but I don't know what I'm supposed to take away from that definition.  What is the compensation process?  During the discharge process I mentioned earlier the two values are quite different, and I would just like clarification as to why this occurs.

Thank you tremendously for any help you can provide!  To help you answer questions 5 and 6, I've attached 3 graphs to illustrate the data collected during the discharge process I described.

Daniel

  • Hi Daniel,

    It sounds like you are using bq27501 FW programmed into your EVM, but you really want to be using bq27500, which supports only one battery type, won't ask you to enter info for Pack B, and won't require two learning cycles (one for each profile) in order to complete bqEASY.  Resistor IDs are how the bq27501 differentiates between Pack A and Pack B when they are inserted.

    Don't try to fake a successful learning cycle by changing Update Status to 02!  Switch to bq27500-V130 firmware (refer to SLUA453 to re-flash your EVM firmware) and then go through bqEASY again.

    You might be okay using the default chemID, but to truly optimize the accuracy you would need to do some characterization to identify which chemID is really the closest match.  There has been a proliferation of battery profiles, as every vendor has a different recipe these days.  However, most LiCoO2 cells are fairly similar, which is why we introduced the bq27410 and bq27425, which use a default chemID and don't let you change it.  You also don't have to do a learning cycle with those parts, since they will just scale the parameters based on your capacity and can then further refine the resistance table while learning during field operation.  You might want to take a look at those two products.

    There will be some offset voltage when large currents flow due to the sense resistor and any other serial resistances.

    NAC is the theoretical maximum capacity you have remaining at a very light load and at room temperature since there would be little energy lost due to internal impedance of the cell.

    RemCap is the predicted actual usable capacity you have remaining and is calculated by the fuel gauge through simulations based on the capacity and impedance profile that it has learned about the cell and using the actual load rate and temperature.  The load used in simulation depends on how you configure the gauge.  It can use the current or power at that instant, or a rolling average load, or an average load from a previous discharge period, or a fixed load, and so on.  RemCap will always be less than NAC due to the internal impedance of the cell.  SOC = RemCap/FCC.
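    As a rough illustration of why RemCap comes out below NAC (a toy model only, NOT the gauge's actual Impedance Track algorithm; the linear OCV curve and the 0.15 ohm internal resistance are made-up numbers):

    ```python
    # Under load, the I*R_int drop makes the terminal voltage hit the 3.0 V
    # cutoff before the open-circuit voltage (OCV) does, so less charge is
    # usable.  Assume a crude linear OCV model: 4.2 V full, 3.0 V empty.
    FCC = 1500.0             # mAh, full charge capacity (matching the pack above)
    V_FULL, V_EMPTY = 4.2, 3.0
    V_CUTOFF = 3.0           # V, discharge termination voltage
    R_INT = 0.15             # ohm, assumed internal resistance of the cell

    def usable_capacity_mah(i_load_amps: float) -> float:
        """Charge removable before terminal voltage OCV(dod) - I*R_int hits cutoff."""
        # OCV(dod) = V_FULL - (V_FULL - V_EMPTY) * dod; solve OCV - I*R = V_CUTOFF
        dod_at_cutoff = (V_FULL - V_CUTOFF - i_load_amps * R_INT) / (V_FULL - V_EMPTY)
        return FCC * min(max(dod_at_cutoff, 0.0), 1.0)

    nac_like = usable_capacity_mah(1.5 / 20)   # C/20 rate (75 mA): "uncompensated"
    remcap_like = usable_capacity_mah(0.5)     # C/3 rate (500 mA): actual load
    print(nac_like, remcap_like)  # the heavy-load figure is noticeably smaller
    ```

    Even this crude model shows the C/3 figure landing well below the C/20 figure, purely from the internal-impedance term.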

     

  • dMax,

    Thank you very much for your reply; it was very helpful.  I have some additional questions, if you don't mind.

    1. The bq27500/1 EVM I have has been kicking around for a while now, and though it was thought to be the best when initially purchased, in comparison to your description of the bq27410 and bq27425 (which I haven't yet looked at, but I will) that seems to no longer be the case.  What is the most sophisticated, most accurate, latest and greatest fuel gauge that TI has to offer?  It is very important that I be evaluating the best technology offered for my application, namely highly accurate fuel gauging of Li-ion cell phone batteries.

    2. In regard to the offset voltages due to serial resistances: is it still possible for the fuel gauge to report SoC=0% with <1% error if this offset is not compensated for?  For instance, an average current of 710mA (GSM waveform: 0.210A baseline, 2.21A bursts, 25% duty cycle) across a 0.01ohm serial resistance will result in a drop of 7.1mV, which I imagine would cause the discharge to terminate at 3007mV rather than 3000mV.  I realize this is quite a small drop, so perhaps it will not contribute too much (in this case) to the error, but I wonder what could happen in a 'worse' circumstance.
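       To double-check my own arithmetic here (and to note the larger instantaneous drop during a burst):

       ```python
       # GSM-style pulsed load: 2.21 A transmit bursts at 25% duty cycle,
       # 0.210 A in between, through 0.01 ohm of uncompensated series resistance.
       i_burst = 2.21   # A, transmit burst
       i_idle = 0.210   # A, between bursts
       duty = 0.25      # burst duty cycle
       r_series = 0.01  # ohm

       i_avg = duty * i_burst + (1 - duty) * i_idle
       v_drop_avg = i_avg * r_series       # drop at the average current
       v_drop_burst = i_burst * r_series   # drop during a burst itself
       print(f"{i_avg * 1000:.0f} mA avg, {v_drop_avg * 1000:.1f} mV avg drop, "
             f"{v_drop_burst * 1000:.1f} mV burst drop")
       # -> 710 mA avg, 7.1 mV avg drop, 22.1 mV burst drop
       ```

       So while the average drop is only 7.1mV, the drop during a burst is about three times larger, which is part of what worries me about 'worse' circumstances.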

    3. I have seen RSOC Error vs. RSOC graphs numerous times in TI documents, and I have yet to understand what is actually being plotted, i.e. where the data is coming from.  Some of the documents I speak of:
      • SLUA445 page 7 figure 6
      • SLUA450 page 9 figure 7
      • "2011 Dallas Deep Dive - Day 1 - Gauging Algorithm Comparisons" slide 52
      • "2011 Dallas Deep Dive - Day 2 - Selecting the Right Gas Gauge for 1s & 2s Applications" slide 26-27
      In SLUA445 it reads, "the RSOC accuracy for this particular test is calculated from the data log file."  What information from the data log file is being pulled off to use in the calculation of RSOC Error?  I see the equations on page 2:
      RSOC Error = RSOCcalculated - RSOCreported
      RSOCcalculated = (FCC - Qstart - PassedQ) × 100 / FCC
      I gather RSOCreported is simply the SOC in the data log file from the bq evaluation software; that's fine.  But where are Qstart and PassedQ being taken from?  Also, what is the equation for RSOCreported that the fuel gauge solves internally?  If possible, could you provide me with a sample calculation?
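      My current reading of those two equations, as a sketch (all the numeric values here are made up for illustration; in a real test I assume FCC, Qstart, PassedQ, and the reported SOC would all come out of the data log file, and I'd appreciate confirmation that this is the intended calculation):

      ```python
      # The two SLUA445 (page 2) equations as functions.
      def rsoc_calculated(fcc, q_start, passed_q):
          """RSOCcalculated = (FCC - Qstart - PassedQ) * 100 / FCC"""
          return (fcc - q_start - passed_q) * 100.0 / fcc

      def rsoc_error(fcc, q_start, passed_q, rsoc_reported):
          """RSOC Error = RSOCcalculated - RSOCreported"""
          return rsoc_calculated(fcc, q_start, passed_q) - rsoc_reported

      # Hypothetical log row: FCC = 1500 mAh, Qstart = 0 (discharge started
      # from full), PassedQ = 750 mAh integrated so far, gauge reports SOC = 49%.
      print(rsoc_error(1500.0, 0.0, 750.0, 49.0))  # -> 1.0, i.e. 1% error
      ```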

    Once again, thank you for your continued aid, I truly appreciate it.

    Daniel