
BQ34110: LEN Output, Temperature Monitor, BAT pin

Part Number: BQ34110


Hello Team,

I am looking at using the BQ34110 to gauge a 12V SLA battery that will be float charged to 13.6V for the majority of its life. This battery system will rarely be drained.

I would like to get clarification on the LEN output and how it interacts with the system. From my understanding, the LEN pin should drive a logic-level MOSFET with a series resistor that periodically drains the battery by about 1% to help with the EOS calculation. However, in my application the battery will be float charged at 13.6V. Will this be a problem?

As for the temperature monitor, how precise does the NTC thermistor need to be? I am asking because I want to put the thermistor on my PCB, which is close to the battery. Would that placement be accurate enough for the calculation, or does the thermistor need to be attached directly to the battery?

Lastly, the datasheet says the BAT pin needs to be between -0.3 and 5.5V. I used a voltage divider (620K and 150K ohms) to reduce the voltage from the 13.6V charged battery down to about 3V. The voltage will stay between 2-4V as the battery is drained and then charged up again. I picked 3V for the voltage divider because it was close to the middle of the tolerance for the pin. However, I am wondering if there is a more suitable voltage I should divide the 13.6V rail to, or if 3V is sufficient. As a follow-up question, I assume the battery monitor will report the divided ~3V value back over I2C and that we then do our own computation to recover the actual battery voltage. Is this correct?

Thank you for the support!

Jared

  • Hi Jared,

    The LEN pin may be configured to implement the Learn Discharge Phase if EOS determination will be used (and it should be for this use case). In this case, LEN should not be mapped as a direct charger pin. Periodically, the device needs to learn, and with Discharge-before-charge mode enabled, a cell kept fully charged is discharged by about 1% through a FET and resistor driven by the LEN pin. For best results with EOS, please disable JEITA and Whr charge termination.
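    As a back-of-the-envelope sketch of sizing that LEN load: the resistor sets the discharge current, and the current in turn sets how long the 1% learn discharge takes. The 68-ohm load and 7 Ah capacity below are hypothetical example values, not recommendations for this design.

```python
def learn_discharge(v_pack, r_load_ohm, capacity_ah, fraction=0.01):
    """Estimate the LEN load current and the time needed to remove
    `fraction` (default 1%) of the cell capacity through a fixed resistor."""
    i_a = v_pack / r_load_ohm            # discharge current, amps
    t_h = fraction * capacity_ah / i_a   # hours to remove the target charge
    return i_a, t_h

# Hypothetical example: 13.6 V pack, 68 ohm load, 7 Ah SLA battery
i, t = learn_discharge(13.6, 68.0, 7.0)
print(f"{i:.2f} A for {t * 60:.0f} min")
```

    The resistor must also be rated for the dissipation (V^2/R) for the full learn duration.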

    You're absolutely correct - float charging must be disabled before the gauge learns, and re-enabled afterward. Assuming LEN is being utilized (LENCTL=1), ALERT1, ALERT2, and/or VEN may be configured as direct charger control pins. These pins may be configured with charging levels, allowing them to indicate (via mapping) what charging voltage should be used. For more information on how to configure these pins, please refer to Section 2.9.2 of the TRM and read through Table 2-25.

    If a host MCU will be utilized in this circuit, the Learn Discharge Phase may be controlled by the host MCU instead of as described above. For the host MCU to know when EOS learning takes place, set LENCTL=0 and LCTLEDGE=1 to trigger alerts when LDSG changes state (device entering/exiting the EOS learning cycle). This alert may be mapped to the ALERT1 or ALERT2 pin and used to interrupt the host MCU. The host MCU may then disable charging directly if only one alert is mapped to that ALERT1/2 pin, or interrogate the device by reading the FLAGS register to determine which flags were set.
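    The host-side logic is just edge detection on the LDSG bit across successive FLAGS reads. A minimal sketch of that state tracking follows; the `LDSG_BIT` mask is a placeholder (look up the actual bit position in the TRM), and the I2C read itself is omitted.

```python
LDSG_BIT = 0x0008  # placeholder mask; check the TRM for the real LDSG bit

class EosLearnMonitor:
    """Track LDSG transitions from successive FLAGS reads (host-driven mode)."""

    def __init__(self):
        self.in_learning = False

    def update(self, flags_word):
        """Feed in the latest FLAGS value; returns (entered, exited)."""
        ldsg = bool(flags_word & LDSG_BIT)
        entered = ldsg and not self.in_learning
        exited = (not ldsg) and self.in_learning
        self.in_learning = ldsg
        if entered:
            pass  # host should pause float charging here
        if exited:
            pass  # host may re-enable float charging here
        return entered, exited
```

    Each ALERT interrupt would trigger one `update()` call with the freshly read FLAGS word.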

    As for temperature monitoring, it is recommended to place the thermistor on the cell. The gauge also has internal temperature measurement, but only one source, internal or external, can be used at a time. With respect to the external thermistor, if the Semitec 103AT or a similar device is used, the temperature model already included in the device will match it. If another NTC thermistor is used, the temperature model may need to be updated.

    Asserting VEN_EN disables the lower leg of the internal divider on the BAT pin and necessitates the use of an external voltage divider. The external divider, when configured, allows the end user to scale to pack voltages of their choice, instead of being limited to 5.5 V (1S for Li-ion, 3S for NiMH). When this is done, you have a direct path to the ADC on the BAT pin (after the external divider). With VEN_EN asserted, the BAT pin absolute range is -0.3 V to VREG25 + 0.3 V, with an operational range of 0-1 V. It is recommended to set your maximum voltage to ~900 mV at the output of the divider and to keep the bottom leg of the divider between 15k and 25k. The EVM uses 300k for the top and 16.5k for the bottom. To improve the range for your solution, you may wish to select a 232k resistor for the top of the divider and a 16.5k resistor for the bottom (sized so that 15 V maps to ~1 V and 13.55 V maps to ~900 mV). Please do not exceed 1 V on BAT in this mode: error increases non-linearly from 1 to 1.2 V, at which point the reading saturates, and exceeding 2.8 V will damage the device.
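    You can sanity-check the suggested 232k/16.5k divider against the 900 mV target and the 1 V limit with simple ratio arithmetic:

```python
def divider_out(v_in, r_top, r_bottom):
    """Output of a resistive divider: v_in * R_bottom / (R_top + R_bottom)."""
    return v_in * r_bottom / (r_top + r_bottom)

# Suggested values for this 13.6 V float-charge application
print(divider_out(13.55, 232e3, 16.5e3))  # ~0.900 V at the float voltage
print(divider_out(15.0, 232e3, 16.5e3))   # ~0.996 V, still under the 1 V limit
```

    This also confirms the 16.5k bottom leg sits inside the recommended 15k-25k window.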

    With respect to I2C communications, the bq34110 device implements a Compensated End of Discharge Voltage (CEDV) gauging algorithm, which accurately predicts the battery capacity and other operational characteristics of a rechargeable cell or pack. It can be interrogated by a system processor through the I2C communication interface to provide cell information, such as time-to-empty (TTE), state-of-charge (SOC), and state-of-health (SOH). Changes in flag values associated with configurable thresholds or settings within the algorithm can also be provided to the host as an interrupt using one of the two ALERT pins on this device. The data from the gas gauge function is provided in units of mAh, and the device is capable of gauging a maximum capacity of 32 Ah. However, capacities in excess of 32 Ah are possible by using scaling techniques.
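    One way such scaling is commonly handled is entirely on the host side: the gauge reports capacity counts, and the host applies a fixed multiplier when the true pack capacity exceeds the 32 Ah native range. The 2 mAh-per-count factor below is a hypothetical illustration, not a documented device setting; the device-side configuration for scaling is described in the TRM.

```python
SCALE_MAH_PER_COUNT = 2  # hypothetical host-side scale factor

def to_real_mah(raw_counts, scale=SCALE_MAH_PER_COUNT):
    """Convert a raw capacity reading (counts) into real mAh under scaling.

    With a 2x scale, the 16-bit count range covers roughly 65 Ah
    instead of the native ~32 Ah.
    """
    return raw_counts * scale

# e.g. a raw reading of 20000 counts would represent 40 Ah with a 2x scale
print(to_real_mah(20000))
```

    The same factor must of course be applied consistently to every capacity-denominated quantity the host reads back (remaining capacity, full charge capacity, and so on).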

    Sincerely,
    Bryan Kahler