Other Parts Discussed in Thread: BQ40Z50
We have an application that uses the BQ40Z50.
When I asked TI about the OCC and OCD delay times for this application, I was told: "The actual delay in operation is the delay time configured in the gauge IC plus 1 to 2 seconds. The reason is that OCD and OCC sample the current four times (at 250 ms intervals) and calculate the average value, so sampling time plus calculation time is required."
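To make the stated arithmetic concrete, here is a minimal sketch of the delay described in TI's reply. This is purely illustrative of the numbers quoted above (four samples at 250 ms), not the actual firmware behavior; the `calc_time_s` parameter is a hypothetical placeholder, since the reply gives no exact figure for the calculation time.

```python
# Illustrative only: delay arithmetic as described in TI's reply,
# not the documented behavior of the BQ40Z50 firmware.
SAMPLE_INTERVAL_S = 0.25   # stated 250 ms between current samples
NUM_SAMPLES = 4            # stated: four samples are averaged

def total_trip_delay(configured_delay_s, calc_time_s=0.0):
    """Configured OCC/OCD delay plus the stated sampling/averaging overhead.

    calc_time_s is a hypothetical placeholder for the unspecified
    calculation time.
    """
    sampling_time = NUM_SAMPLES * SAMPLE_INTERVAL_S  # 4 x 250 ms = 1.0 s
    return configured_delay_s + sampling_time + calc_time_s

# With a 2 s configured delay, the sampling window alone adds 1 s:
print(total_trip_delay(2.0))  # 3.0
```

Note that the sampling window accounts for only 1 s of fixed overhead, which is part of why the quoted 1 to 2 second range is puzzling.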
1. Is this true? Why is the range as wide as 1 to 2 seconds? I would not expect the sampling time and calculation time to vary that much.
2. Is the delay time measured by counting a clock? Or does this large variation exist because a CR timer is used?