Other Parts Discussed in Thread: GPCCHEM, BQ27421-G1
In our design we monitor two separate Li-Ni-Co-Mn (NMC) batteries: one 850 mAh and one 2000 mAh. Our software has two settings, and when the user indicates which battery is in use (2000 or 850), we load the corresponding battery parameters into the device NVRAM.
The fuel gauge reports SOC correctly for the 850 mAh battery, but the SOC for the 2000 mAh battery is way off: the gauge reports 0% when the battery is at roughly 40% according to our own measurements.
For both batteries we changed Design Capacity and Terminate Voltage dramatically, just to see how much it would change the SOC reading. When we read back from NVRAM, the values are exactly what we burned. Yet when we test the battery, the change in SOC is minor. It is as if the SOC algorithm is using a fixed default battery profile and gauging our batteries the same way no matter how we change Design Capacity and Terminate Voltage.
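In case it helps to compare notes: below is a minimal sketch of how we understand the data-memory encoding and block checksum for these parameters, based on the BQ27421-G1 TRM. The subclass ID (0x52, "State") and the offsets for Design Capacity (10) and Terminate Voltage (16) are assumptions from our reading of the TRM and should be verified against the device revision; if the checksum at 0x60 is not rewritten after patching the block, the gauge silently discards the update, which would match the symptom of values that read back correctly but have no effect.

```python
# Sketch of patching Design Capacity and Terminate Voltage into a 32-byte
# BQ27421 data-memory block and recomputing the block checksum.
# ASSUMPTIONS (verify against your device's TRM): State subclass = 0x52,
# Design Capacity at offset 10, Terminate Voltage at offset 16.

DESIGN_CAP_OFFSET = 10   # assumed offset within the State subclass block
TERM_VOLT_OFFSET = 16    # assumed offset within the State subclass block

def encode_u16_be(value):
    """Data-memory words are stored MSB first (big-endian)."""
    return [(value >> 8) & 0xFF, value & 0xFF]

def patch_block(block, design_cap_mah, term_volt_mv):
    """Return a copy of the 32-byte block with both parameters patched in."""
    b = list(block)
    b[DESIGN_CAP_OFFSET:DESIGN_CAP_OFFSET + 2] = encode_u16_be(design_cap_mah)
    b[TERM_VOLT_OFFSET:TERM_VOLT_OFFSET + 2] = encode_u16_be(term_volt_mv)
    return b

def block_checksum(block):
    """Checksum written to 0x60: 255 minus the 8-bit sum of the 32 block bytes."""
    return 255 - (sum(block) & 0xFF)

if __name__ == "__main__":
    blk = patch_block([0] * 32, 2000, 3200)  # 2000 mAh, 3200 mV
    print(blk[DESIGN_CAP_OFFSET], blk[DESIGN_CAP_OFFSET + 1])
    print(block_checksum(blk))
```

On the real part this read-modify-write has to happen inside CONFIG UPDATE mode (SET_CFGUPDATE subcommand, then SOFT_RESET to exit); the sketch above only shows the byte-level encoding and checksum arithmetic, not the I2C transactions.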
The fuel gauge monitors voltage and current very accurately when compared to the real measurements we take in our setups.
Once again, the 850 mAh battery seems to be monitored correctly by the fuel gauge, but the 2000 mAh one is way off.
Attaching 2 files :
1. TALK TIME PCB234 Int - FG off- 2000mA - In this Excel you can see that the 2000 mAh battery can go down to 2.8 V before shutting down, while our system shuts the device down when the fuel gauge reads 5%. From the graph you can see the device was shut down at roughly the halfway point of the battery's life, not at the last 5%.
The Excel also shows the parameters we set in the fuel gauge.
Thanks