Tool/software:
Hello! I am currently using the bq34z100-G1 with a lithium iron phosphate battery. During the official golden learning cycle, I first fully discharged the battery and let it rest, then sent IT_ENABLE and charged it at 0.5C in constant-current/constant-voltage mode until the charge cutoff was reached. The FC flag was set, and after a rest period the SOH changed from the initial 47% to 50%. After resting until an OCV point was captured again, I began discharging at 0.2C. When the SOC reached about 80%, the SOH suddenly jumped to 100%, but the FCC still showed 7575, which does not equal the Design Capacity of 8000.

My questions:

1. Can I interpret this as follows: SOH is only updated during discharge; the gauge measures the charge passed during the discharge, divides it by ΔSOC to calculate FCC, and then computes SOH = FCC / Design Capacity?
2. How are the two points, SOC1 and SOC2, that define ΔSOC determined?
3. Does FCC only display the newly calculated value after the cycle is completed?
4. Should SOH Load I be set based on the standard 0.2C discharge used in the first learning cycle? If the customer discharges at 1C to 2C after the learning cycle is completed, will this cause SOH, SOC, and FCC to become increasingly inaccurate or to change suddenly?
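
To make question 1 concrete, here is a small calculation sketch of the relationship I am assuming (my own illustration only, not TI's actual algorithm; the function names and the passed-charge value are made up):

```python
# Sketch of my assumed learning math: FCC from passed charge over a SOC window,
# then SOH relative to Design Capacity. Illustrative values only.

def learned_fcc(passed_charge_mAh: float, soc_start_pct: float, soc_end_pct: float) -> float:
    """FCC estimated as charge passed between two SOC points divided by delta-SOC."""
    delta_soc = (soc_start_pct - soc_end_pct) / 100.0
    return passed_charge_mAh / delta_soc

def soh_from_fcc(fcc_mAh: float, design_capacity_mAh: float) -> float:
    """SOH as I understand it: FCC as a percentage of Design Capacity."""
    return 100.0 * fcc_mAh / design_capacity_mAh

# Hypothetical example: discharging from 90% to 80% SOC while 758 mAh pass
# would give FCC ~7580 and SOH ~94.7%, which is why seeing SOH = 100%
# together with FCC = 7575 (Design Capacity 8000) surprised me.
fcc = learned_fcc(passed_charge_mAh=758.0, soc_start_pct=90.0, soc_end_pct=80.0)
print(f"FCC ~ {fcc:.0f} mAh, SOH ~ {soh_from_fcc(fcc, 8000.0):.1f} %")
```

If this is not how the gauge relates FCC, ΔSOC, and SOH, please correct my understanding.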