I am using the bq77pl900 chip to manage a high-capacity, 8-cell lithium-polymer battery pack in host-control mode. I've managed to get everything working pretty much as I want it to, with the exception of the cell-V measurement calibration. I'm starting to think I must be doing something fundamentally wrong somewhere, because by far my greatest source of variability in my cell voltage measurements is coming from the chip's calibration process itself.
The process of cell-V calibration involves presenting 6 calibration signals on the VOUT pin and, from the voltage measurements obtained, calculating two calibration constants which are used to scale and offset the cell-voltage measurements.
On my hardware, VOUT is connected to a 14-bit ADC chip. With a 1.25 V reference and 14 bits of resolution, each ADC count represents about 76 microvolts (the 1.25 V reference divided by the 16384 counts in 14 bits).
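(For reference, the count-to-volts conversion is essentially just the following - names are mine, and in practice I substitute the per-device measured reference value mentioned further down rather than the nominal 1.25 V.)

/* Count-to-volts conversion for the 14-bit ADC on VOUT (names are mine).
 * VREF_ADC is nominally 1.25 V; I actually use the per-device measured
 * value mentioned below. */
#define ADC_FULL_SCALE  16384.0   /* 2^14 counts */
#define VREF_ADC        1.25      /* ADC reference, in volts (nominal) */

static double counts_to_volts(unsigned int counts)
{
    return (double)counts * (VREF_ADC / ADC_FULL_SCALE);   /* ~76.3 uV per count */
}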
The following are typical ADC counts I see at VOUT for the 6 calibration measurements:
ADC counts
Step 1: 0
Step 2: 15708
Step 3: 32
Step 4: 69
Step 5: 12611
Step 6: 9161
The values calculated from this data are as follows:
1. Vdout_0V = 0.00000
2. VREF_m = 1.19965
3. Vdout_VREF_m = 0.00243
     Kdact = -0.00203
4. Vdout_2V5 = 0.00527
     Vref_2V5 = 2.59603
5. Vout_1V2 = 0.96312
6. Vout_2V5 = 0.69962
     Kact = 0.18871
     Vout_0V = 1.18951
The first number at each step is the direct measurement, in volts; the indented numbers (under steps 3, 4, and 6) are derived values, as described in the datasheet procedure. The final two numbers are the results used to scale and offset cell-V measurements. The names match those used in the datasheet.
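(In case it helps to see the arithmetic concretely, below is a condensed sketch of the calculation as I've implemented it. Variable names are mine, and the relationships are my reading of the datasheet procedure, so treat it as a sketch rather than a verbatim copy of the datasheet formulas.)

/* Condensed sketch of my calibration arithmetic (my reading of the datasheet
 * procedure; variable names are mine). Inputs are the six VOUT readings from
 * the calibration steps, already converted to volts. */
struct cal_constants {
    double kact;     /* gain applied to cell-V readings */
    double vout_0v;  /* offset applied with that gain   */
};

static struct cal_constants calibrate(double vdout_0v,     /* step 1 */
                                      double vref_m,       /* step 2 */
                                      double vdout_vref_m, /* step 3 */
                                      double vdout_2v5,    /* step 4 */
                                      double vout_1v2,     /* step 5 */
                                      double vout_2v5)     /* step 6 */
{
    struct cal_constants c;

    /* Gain of the differential path, from the 0 V and VREF measurements. */
    double kdact = (vdout_0v - vdout_vref_m) / vref_m;

    /* Actual value of the nominal 2.5 V calibration signal. */
    double vref_2v5 = (vdout_0v - vdout_2v5) / kdact;

    /* Gain and offset of the cell-measurement path (the two results). */
    c.kact    = (vout_1v2 - vout_2v5) / (vref_2v5 - vref_m);
    c.vout_0v = vout_1v2 + c.kact * vref_m;

    return c;
}

/* A raw VOUT cell reading is then corrected with the two constants: */
static double cell_voltage(double vout, const struct cal_constants *c)
{
    return (c->vout_0v - vout) / c->kact;
}

Feeding the step values listed above into this reproduces the Kdact, Vref_2V5, Kact and Vout_0V figures shown, to within the rounding of the displayed digits.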
My problem is that I am finding that a difference of only 1 ADC count (on the 3rd of the 6 calibration values measured) can result in a cell-V measurement difference of over 60 millivolts! That's less than 76 microvolts of difference on that signal, which certainly seems to me to be "in the noise".
This means that I'll be taking cell-V readings (on an unchanging test signal) and getting numbers within 2 or 3 millivolts of each other, which is about what I'm aiming for - all good. Then a calibration occurs, and afterwards my cell-V results all jump up or down by as much as 60 millivolts. The signal is the same, but my calibration constants have changed enough to completely skew the resulting voltage.
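(The jump is easy to reproduce numerically: perturb the step-3 reading by a single ADC count, recompute the constants with the calibrate()/cell_voltage() sketch above, and run one fixed VOUT reading through both sets of constants.)

/* Quick sensitivity check, reusing the calibrate()/cell_voltage() sketch
 * above with the figures from this post. */
#include <stdio.h>

int main(void)
{
    const double lsb = 1.25 / 16384.0;   /* one ADC count, ~76.3 uV */

    /* Constants from the nominal readings, and with step 3 one count higher. */
    struct cal_constants a = calibrate(0.00000, 1.19965, 0.00243,
                                       0.00527, 0.96312, 0.69962);
    struct cal_constants b = calibrate(0.00000, 1.19965, 0.00243 + lsb,
                                       0.00527, 0.96312, 0.69962);

    double vout = 0.50;  /* an arbitrary fixed cell reading on VOUT, in volts */
    printf("cell-V shift = %.1f mV\n",
           1000.0 * (cell_voltage(vout, &b) - cell_voltage(vout, &a)));
    return 0;
}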
There's a lot more detail I can supply if it will help - code that's doing these calculations; the settling/sampling/sorting/averaging that I'm doing on the ADC results; per-device uV measurement of the 1.25V reference's exact value; etc.
I was previously doing a calibration every 10 minutes (when actively charging or discharging), but have now switched to limiting these calibrations so that one does not occur during any single charge/discharge cycle. I suspect I should re-calibrate whenever I detect a significant temperature change, but quite frankly I have not yet seen any drift anywhere near as large as the uncertainty the calibration itself introduces!
Is anyone using this calibration procedure successfully? It's not helping me much (if at all), and I'm tempted to just throw it all away.
Many thanks to everyone who has read this far.
regards,
Brian