Other Parts Discussed in Thread: BQSTUDIO
Per Section 2.5, the Temperature Model informs the simulation of temperature during the voltage simulation. In turn, these temperature projections predict the battery impedance at a given point in the discharge simulation, as modeled by R(T) = Ra * e^(Rb * T).
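To make the exponential impedance-temperature relationship concrete, here is a minimal sketch of evaluating R(T). The coefficient values Ra and Rb below are illustrative placeholders, not values learned by or read from an actual gauge:

```python
import math

def simulated_impedance(t_degc, ra, rb):
    """Exponential impedance model R(T) = Ra * e^(Rb * T).

    ra and rb are the gauge's learned coefficients; the values
    passed in below are placeholders for illustration only.
    """
    return ra * math.exp(rb * t_degc)

# Illustrative coefficients: a negative Rb captures the general trend
# that cell impedance falls as temperature rises.
RA = 0.12   # ohms at 0 degC (placeholder)
RB = -0.02  # 1/degC (placeholder)

for t in (0, 25, 45):
    print(f"T={t:>2} degC -> R = {simulated_impedance(t, RA, RB):.4f} ohm")
```

With a negative Rb, the simulated impedance at 45 degC is lower than at 0 degC, which is why an inaccurate temperature projection can shift the simulated discharge voltage curve and, with it, the predicted capacity.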
We have observed on our gauge that, while the battery is at rest, RemCap is adjusted downward at the same time the DTRC bit becomes asserted in the FlagsB register. This causes a small SOC drop that accumulates over the course of a few days. Because the DTRC bit is set when this occurs, it appears that the voltage/discharge simulation is what adjusts RemCap down and decreases SOC. I have the following questions:
1. At what C rate is the discharge simulation run?
2. Is the temperature model T(t) learned only during the learning cycle, or is it updated every time a discharge occurs?
3. Is there a register setting that prevents these adjustments, or some cycling approach that keeps the voltage simulation from adjusting RemCap downward?