We are using the BQ7692003 battery monitor chip in our smart battery design. We've built a rig that checks, on every PCB, whether the chip's voltage and current protections trigger within the nominal protection thresholds and delays plus the tolerances listed in the battery monitor datasheet. The Over-Voltage (OV), Under-Voltage (UV), and Short-Circuit-in-Discharge (SCD) protections behave as expected, but the trigger timing of the Over-Current-in-Discharge (OCD) protection is sometimes longer than expected, resulting in sporadic test failures.
We've run a series of experiments; here's what we found. For all of the following, the BQ76920 is configured as follows: 8 mV OCD protection threshold (with a 1 mOhm shunt resistor, i.e. 8 A) and 8 ms OCD protection delay. The datasheet suggests a +/-20% tolerance on all delay options, so with the lowest OCD delay setting of nominally 8 ms we expect the OCD protection to trigger between 6.4 and 9.6 ms after a current above the 8 A threshold (+ tolerances) is applied, or between 6.5 and 9.7 ms accounting for FET and FET driver delays.
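For reference, here is a minimal sketch of the window arithmetic we used; the 0.1 ms FET/driver delay allowance is our own estimate, not a datasheet figure:

```python
# Expected OCD trigger window from the configuration described above.
SHUNT_OHMS = 1e-3        # 1 mOhm sense resistor
OCD_THRESHOLD_V = 8e-3   # 8 mV OCD threshold setting
OCD_DELAY_MS = 8.0       # nominal OCD delay setting
DELAY_TOL = 0.20         # +/-20% delay tolerance per datasheet
FET_DELAY_MS = 0.1       # assumed FET + FET driver delay allowance

threshold_amps = OCD_THRESHOLD_V / SHUNT_OHMS
t_min = OCD_DELAY_MS * (1 - DELAY_TOL) + FET_DELAY_MS
t_max = OCD_DELAY_MS * (1 + DELAY_TOL) + FET_DELAY_MS

print(f"OCD threshold: {threshold_amps:.1f} A")
print(f"Expected trigger window: {t_min:.1f} to {t_max:.1f} ms")
```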
Applying a discharge current >11.3 A for 9.7 ms results in an OCD trigger success rate between 70% and 80%.
Applying >16 A for 9.7 ms results in a ~80% success rate.
Applying >11.3 A for 11 ms results in a ~90% success rate.
Applying >11.3 A for 20 ms results in a ~100% success rate (the test run couldn't be completed, so this number is inferred rather than measured).
Discharge current amplitude and timing were validated with a precision Hall-sensor current probe.
Please help me understand the observations listed above. Did we misinterpret the tolerance range listed in the datasheet? Are there any hidden factors we didn't account for in our analysis?
Thank you in advance and kind regards,