Hello,
We'd like to understand how Impedance Track will handle calendar aging.
Our product may be used sporadically, with the battery stored in poor conditions between uses. We use the fuel gauge's 'Available Energy' value for our system-level logic, and it is important that it stays accurate enough, say within 10%, even after a long period without cycling.
Suppose a battery is stored for a year, having been fully charged before storage. There will be an SOC drop due to self-discharge and BMS power consumption, so before use it is topped up to full charge. And, because it was stored at high SOC and perhaps high temperature, the capacity may have degraded by 10-15% or more, especially if one or more cells degrade faster than the rest.
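To make the concern concrete, here is a hypothetical back-of-envelope calculation of the reporting error if the gauge were still using its pre-storage full charge capacity after such fade. The 50 Wh pack energy and 15% fade figure are illustrative assumptions, not values from any particular gauge:

```python
# Hypothetical worst case: the gauge reports Available Energy based on
# its pre-storage full-pack energy, while the real capacity has faded
# during storage and no learning cycle has run yet.
believed_energy_wh = 50.0    # gauge's stored full-pack energy (assumed)
fade_fraction = 0.15         # 15% calendar fade during storage (assumed)
true_energy_wh = believed_energy_wh * (1.0 - fade_fraction)

# Error of the stale estimate relative to the true energy at full charge
error_pct = (believed_energy_wh - true_energy_wh) / true_energy_wh * 100.0
print(f"true full-charge energy: {true_energy_wh:.1f} Wh")
print(f"error until the gauge relearns: {error_pct:.1f} %")
```

So an uncorrected 15% fade would already put the estimate well outside our 10% budget, which is why the relearning behavior on the first post-storage cycle matters to us.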
How accurate will the Available Energy estimation be during the first discharge cycle after storage?
And what if that discharge cycle is partial, say it only takes the battery down to 30% SOC? Would that prevent a good enough Ra table/Qmax update to get within ~10% accuracy on Available Energy?
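For context on why we think a 100% to 30% discharge might still qualify: here is a sketch of the Qmax update condition as we understand it from TI's app notes, namely two valid relaxed-OCV readings with enough passed charge between them. The 37% passed-charge threshold is a figure cited for some gauge families and is an assumption here, as is the helper function itself:

```python
# Sketch of the Qmax update qualification as commonly described for
# Impedance Track gauges: an OCV reading in a relaxed state before and
# after the discharge, with enough charge passed between the two points.
# The 37% threshold is an assumed figure, not taken from a datasheet.
MIN_PASSED_CHARGE_FRAC = 0.37  # assumed qualification threshold

def qmax_update_possible(start_soc: float, end_soc: float,
                         relaxed_before: bool, relaxed_after: bool) -> bool:
    """Return True if a discharge from start_soc to end_soc (fractions
    of full charge) could qualify for a Qmax update under the assumed
    criteria above."""
    passed_charge = start_soc - end_soc
    return (relaxed_before and relaxed_after
            and passed_charge >= MIN_PASSED_CHARGE_FRAC)

# A partial discharge from 100% to 30% passes 70% of capacity, so under
# these assumptions it would qualify, provided the pack relaxes to a
# valid OCV at both ends.
print(qmax_update_possible(1.00, 0.30, True, True))
```

Is that roughly the right mental model, and does the first such partial cycle after storage get the Available Energy error back under ~10%?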
Thanks in advance!