When using a Keithley 2308 battery charger simulator, the BQ24071 appears not to be meeting the spec. However, in practice, when using a real Li-ion battery, the BQ24071 meets spec and no strange behavior is seen.
Here's what is observed using the Keithley 2308 battery charger simulator:
http://www.keithley.com/products/dcac/highspeedpower/battery/?path=2308/Documents
When simulating a 4.2V Li-ion battery with the simulator set to 3.5V, per the spec the BQ24071 should be in fast-charge mode at the maximum current setpoint. One would think that at 3.5V it would need to be in fast-charge mode to get to 4.2V. However, the controller is not in fast charge but in a slower charging mode. If the simulator setpoint is dropped from 3.5V to 3.1V, fast-charge mode is then enabled.
Question #1:
Does the BQ24071 need to see a certain voltage slope before fast charging (maximum current) is enabled, or is the decision based on where the battery voltage sits relative to the DPPM setpoint midpoint? Fast charge does engage with the simulator set to 3.1V, which is below the DPPM setpoint midpoint of 3.2V ((3.8 + 2.6)/2), so that midpoint may be what gates fast charge.
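To make the hypothesis in Question #1 concrete, here is a minimal sketch of the comparison being assumed, with the 3.8V and 2.6V endpoints taken from the midpoint calculation above. The rule itself is only the guess being asked about, not documented BQ24071 behavior.

    # Hypothesis from Question #1 (not confirmed BQ24071 behavior):
    # fast charge is gated on the battery voltage sitting below the
    # midpoint of the 2.6 V - 3.8 V DPPM setpoint range quoted above.
    DPPM_HIGH = 3.8   # V, upper endpoint from the post
    DPPM_LOW = 2.6    # V, lower endpoint from the post
    MIDPOINT = (DPPM_HIGH + DPPM_LOW) / 2.0   # = 3.2 V

    def fast_charge_expected(v_batt):
        """Guessed rule: fast charge only when the battery is below the midpoint."""
        return v_batt < MIDPOINT

    for v in (3.5, 3.1):
        print(f"simulator at {v:.1f} V -> fast charge expected? {fast_charge_expected(v)}")

This at least matches the two observations above (no fast charge at 3.5V, fast charge at 3.1V), which is what prompted the question.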
Question #2 relates to STAT1 and STAT2 LEDs:
When the simulator is set at 2.1V, the LEDs show PRE-CHARGE MODE, as expected. However, if the simulator setpoint is dropped from 3.5V to 2.9V, the LEDs indicate that the controller is still in CHARGING MODE, when they should instead indicate PRE-CHARGE MODE. According to the spec, the pre-charge threshold is between 2.9V and 3.1V. If the simulator is dropped from 3.5V straight to 2.1V, then the controller behaves as expected. Again, is there a certain discharge slope the controller is looking for to decide which mode to be in?
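To spell out the expectation being questioned, here is a voltage-only sketch using the 2.9V to 3.1V threshold band quoted above; any slope dependence (the open question) is deliberately ignored, and the rule is an assumption, not documented behavior.

    # Voltage-only view of the expected mode, using the 2.9 V - 3.1 V
    # pre-charge threshold band quoted above. Any slope dependence
    # (the open question) is ignored here.
    PRECHG_MIN, PRECHG_MAX = 2.9, 3.1   # V, threshold band from the spec quote above

    def expected_mode(v_batt):
        if v_batt < PRECHG_MIN:
            return "PRE-CHARGE"
        if v_batt > PRECHG_MAX:
            return "CHARGING (fast charge)"
        return "either, depending on where this particular part's threshold falls"

    for v in (2.1, 2.9, 3.5):
        print(f"{v:.1f} V -> {expected_mode(v)}")

Note that a 2.9V setpoint lands exactly on the edge of that band, so on a voltage-only reading the spec does not clearly force PRE-CHARGE there, whereas the 2.1V case is unambiguous, which matches the observed behavior.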
Question #3 relates to using a resistor to simulate a thermistor during an end-of-line (EOL) production test.
The battery charger test is having some problems. When switching from the 10K resistor to the 3K9 resistor, the system continues to charge. If the resistor is switched to 27K, charging ceases, and switching back to the 3K9 does not restart it. Switching back to the 10K restarts charging, and going back to the 3K9 then keeps it charging. A "bad unit" is defined as one that does not pass the EOL test, even though functionally nothing is wrong when a real battery is connected. On a "bad unit", with the 3K9 resistor installed, the voltage measured at pin 2 of the battery was around 0.5V. On a "good unit" it measured around 0.35V. To pass the EOL test, the resistor had to be adjusted back down to 3K3.
A multi-turn pot is used for the 3K9 and was adjusted to exactly 3K9. Given that, the highest voltage that should be seen at pin 2 is 0.41V, which is below the 0.485V minimum from the data sheet. I can't explain why a "bad unit" measured around 0.5V; perhaps there is some additional resistance in the test fixture, or the datasheet is in error and the pin actually sources more than 106uA.
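For reference, here is the arithmetic behind those numbers as a small sketch. The bias current is simply back-calculated from the 0.41V / 3K9 figures above (roughly 105uA), not taken from the datasheet, and the extra fixture resistance is just one possible explanation for the 0.5V reading.

    # Back-of-the-envelope check of the pin 2 (TS) voltages quoted above.
    # I_TS is back-calculated from 0.41 V across 3.9 kohm; treat it as an
    # assumption, not a datasheet value.
    I_TS = 0.41 / 3900.0               # ~105 uA implied bias current

    for r_ohm in (3300.0, 3900.0):
        print(f"R = {r_ohm / 1000:.1f} k -> V_pin2 = {I_TS * r_ohm:.3f} V")

    # Extra series resistance in the fixture that would be needed to push a
    # 3K9 reading from the expected 0.41 V up to the ~0.5 V seen on a "bad unit".
    r_extra = (0.5 - 0.41) / I_TS
    print(f"extra fixture resistance needed: ~{r_extra:.0f} ohm")

On these numbers, roughly 850 ohms of extra series resistance (or a bias current closer to 128uA) would account for the 0.5V reading, so either explanation above seems plausible until the fixture is measured.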
What is the proper way to simulate a thermistor to avoid issues using a resistor?
Thanks.