
BQ2057T: Understanding application and safety for 2S Lithium Charging

Part Number: BQ2057T
Other Parts Discussed in Thread: BQ24133, BQ2057

Hello everyone,

I'm working on a product that includes a Li-Ion battery pack, and I'm trying to understand the best way to charge it safely while still being able to use the device.

First I'll explain the product's functional portions so you can better understand the application, then we'll move on to how it relates to the BQ2057T.

_____________________________________________________________________________________

The product can be split into three main functional blocks: a wall-plug charger, the battery, and the load itself.

Battery:

The battery, as mentioned, is a 2S4P battery pack with an included battery management circuit (for overvoltage and short-circuit protection, typically included by battery pack assembly houses). It is to be charged to 8.2V (it could be 8.4V, but we're aiming for 8.2V to extend the life of the battery over cycles), with a maximum charge current of 1.7A.

8.2V Max Voltage

1.7A Charge Current

8500 mAh Capacity

2S4P Configuration

 


Load:

Our load here is a heating element and its control circuit, involving a microcontroller, LCD screen, and some sensors. 

The control circuit is fed a clean 5V through a small linear voltage regulator (LD1117S50CTR) and consumes 200mA.

When the control circuit switches the heating element on, it does so by simply activating a small electromechanical relay. The heating element consumes up to 5A at first, settles to 3.5A nominal within 30 seconds, and eventually decays to 2.9A (it's a thermoelectric module, hence the ever-changing current draw). The heating element is tied directly to the battery voltage, without going through any regulator.

Consumption: 200mA Standby, 5A ON spike, 3.5A nominal ON.

Duration: Typically until the battery is depleted

Wall Charger:

This part's specs have not been chosen yet, because they depend on the nature of the Li-Ion charging IC. I'm imagining a standard AC-to-DC wall plug that ends in a DC barrel jack. Its maximum current and nominal voltage are things I need your help deciding on.

 

 

_____________________________________________________________________________________

The goal here is to charge the battery safely, but also be able to use the device while the battery is charging. While charging, the Load should be sourcing its power from the wall plug directly, not through the charging battery.

At the moment, I charge the batteries in the prototypes by detaching them from the device and attaching them to a benchtop DC power supply set to 8.2V with a current limit of 1.7A. This results in a CC charge until 8.2V is reached, then a CV charge until I manually switch off the supply. This works in a pinch, but obviously not for a consumer product.

Additionally, since the Load consumes 3.5A nominally and the battery should not exceed 1.7A of charge current, I cannot use a wall plug that is simply limited to 8.2V and 1.7A (mimicking my DC power supply). Also, that last CV portion of charging isn't healthy for the battery unless it is cut off once the current has tapered low enough, which a consumer naturally wouldn't do manually when charging the device overnight.
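
To put rough numbers on that constraint, here is a quick sum of the figures above, treating 3.5A as the sustained load and ignoring the brief 5A turn-on spike (my own simplification):

```python
# Rough sizing of what the wall adapter must supply if the device is allowed
# to run while the battery charges (figures from the spec list above; the
# brief 5 A turn-on spike is ignored here).
I_CHARGE    = 1.7   # A, maximum battery charge current
I_LOAD_ON   = 3.5   # A, nominal heating-element draw
I_LOAD_IDLE = 0.2   # A, control-circuit draw

I_ADAPTER_MIN = I_CHARGE + I_LOAD_ON + I_LOAD_IDLE
print(f"Adapter must source roughly {I_ADAPTER_MIN:.1f} A sustained")
# ~5.4 A, which is why a single 8.2 V / 1.7 A-limited supply cannot feed the
# load and charge the battery at the same time.
```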

The obvious answer is a Li-Ion charging IC, like the BQ2057T.

However, I need help selecting the appropriate external components to make this chip work the way I want it to. 

I have looked at the Microchip MCP73844-8.2 IC, and found this BQ2057T as I was looking for a TI equivalent to that Microchip component.

My background is in mechanics, not electronics, and while I built the microcontroller's circuit myself, I want to be extra careful when dealing with lithium charging, which is why I'm turning to you all.

I've read the spec sheet. Figure 4 describes an application that is pretty similar to what I need. 

If any of my questions are obvious, I apologize; I'm still learning electronics.

I need help configuring and selecting components to build a 1.7A charger.

How do I select Rsns? I see that equation (2) is a simple Ohm's law statement, but I only know Io = 1.7A. What is Vsns meant to be? Is it held constant by the chip itself? I see a range of values presented in the table in the datasheet. Am I meant to select an Rsns such that Vsns falls within those values?
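
For concreteness, here is how I am currently reading equation (2). The ~100 mV sense voltage is an assumption on my part, taken from what I think I'm seeing in the table, so please correct me if I've misread it:

```python
# My reading of equation (2): Rsns sets the fast-charge current via Ohm's law.
# ASSUMPTION: the chip regulates roughly 100 mV across Rsns during constant-
# current charging (please check the V(SNS) row of the datasheet table).
V_SNS = 0.100   # V, assumed sense-voltage regulation threshold
I_CHG = 1.7     # A, desired fast-charge current

R_SNS = V_SNS / I_CHG   # Ohm's law, rearranged from equation (2)
P_SNS = V_SNS * I_CHG   # dissipation in the sense resistor at full current
print(f"Rsns = {R_SNS*1000:.0f} mOhm, dissipating about {P_SNS:.2f} W")
# ~59 mOhm and ~0.17 W, suggesting something like a 60 mOhm, 0.5 W resistor.
```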

Low-side sensing or high-side sensing? I see that Rsns can be configured in two ways. Which is best for this application? What are the pros and cons?

Can I disable or bypass temperature sensing? Part of why we chose 1.7A as the battery's charging limit is that our use case does not require fast charging, so we could avoid having to worry about the temperature of the battery. The Microchip component I mentioned previously allows the temperature sensor to be disabled by setting up a specific voltage divider; is there a similar technique for this chip?

What is PACK(-) connected to? In the diagram, I see two connections for PACK- (the negative terminal of the battery). One goes to ground, but the other goes through what looks like a variable-resistor symbol to TEMP, and then only passes through Rt1 before connecting back to Vcc. Am I missing something? What is that variable resistor doing, and why is PACK- tied to VCC through Rt1?

Why does DC+ connect straight to the STAT pin through R4 and an LED? Doesn't that make the LED on the right side of the diagram always on? What is the point of this connection, and therefore of R4?

R2 serves as a pull-up for CC, correct? As far as I understand, if Q1 is a P-channel MOSFET, then it is OPEN (not conducting) when its gate is HIGH, and it is CLOSED (conducting) when the gate gets pulled LOW. Is that correct? This would imply that the CC pin triggers charging by going LOW, right?

What does R5 do? It's just in parallel with the status LEDs. Is it necessary? R3 makes sense because it works with the LED, but what is R5 doing?

What should my DC+ voltage be? Since there's a diode D1 and a MOSFET Q1, I'm imagining there's a fair voltage drop by the time I get to BATTERY+. The IC itself regulates pin CC with feedback from VBAT to ensure that the battery voltage is held at 8.2V. What should I be feeding this system with? 12V?
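
To make the headroom question concrete, here is the rough arithmetic I'm picturing. The D1 and Q1 numbers are placeholder assumptions on my part, since those parts aren't chosen yet:

```python
# Rough minimum-input estimate for the linear charging path.
# ASSUMPTIONS: the blocking-diode drop and pass-FET on-resistance below are
# placeholders, not datasheet values; substitute the real parts' numbers.
V_BAT_MAX = 8.2    # V, charge regulation voltage
V_D1      = 0.4    # V, assumed forward drop of a Schottky blocking diode
R_Q1_ON   = 0.05   # Ohm, assumed on-resistance of the pass P-FET when fully on
R_SNS     = 0.059  # Ohm, sense resistor from the earlier Rsns estimate
I_CHG     = 1.7    # A, fast-charge current

V_IN_MIN = V_BAT_MAX + V_D1 + I_CHG * (R_Q1_ON + R_SNS)
print(f"DC+ needs to stay above roughly {V_IN_MIN:.1f} V")
# ~8.8 V here, so 10 V or 12 V would leave margin; but with a linear charger,
# every extra volt of headroom becomes heat in the pass element at 1.7 A.
```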

How can I isolate the battery from my load while charging? The behavior I want is that IF the wall plug is plugged into the device, THEN no connection should exist between the battery and the load. I was thinking of using DC+ to drive a changeover (SPDT) relay, switching the Load's Vin from the battery (connected to the normally-closed contact) to the charger itself (DC+ connected to the normally-open contact). Would that cause any problems with the charging circuit? How would you accomplish this?

How do I select Q1? What do I need to consider given my battery input? Given my specs, is there a particular MOSFET you would use?

I really appreciate the help; I just want to make sure I'm going about this application safely.

  • Hi Uzair,

    To handle this level of current, a switch-mode charger is usually used.
    Take a look at bq24133. It has a direct VBUS-to-SYSTEM path, so you enjoy switch-less high efficiency at 3-5A of current. Pick the FETs on the input side (reverse-blocking and OVP FETs) according to your application's voltage and current ratings.
    This charger handles your system load current and battery charging current separately through the power path (BATFET). You can set the input current and charging current regulation points with external resistors.

    Thanks,
    Will
  • Will Zhou said:


    To handle this level of current, a switch-mode charger is usually used.
    Take a look at bq24133. It has a direct VBUS-to-SYSTEM path, so you enjoy switch-less high efficiency at 3-5A of current.

    Thanks for the suggestion, but at 1.7A charge current, I'm still within spec for the BQ2057T I originally posted about, as it lists a max of 2A. The load itself draws 3.5A nominally, but those 3.5A should never have to come from the charging circuit; rather, I would use a signal from the charging circuit to create a parallel pathway to the load, directly from the wall, bypassing the battery entirely and keeping it isolated while charging. Thus, the charging circuit will only ever see 1.7A maximum under any circumstances. So, is switch mode really needed?

    The application circuit for the BQ24133 appears significantly more complex than the BQ2057T's, which is the main reason I am hesitant about the part you suggested. Are the additional features necessary for the application I described, especially given that I am still under the maximum current of the original chip?

    I'm not too concerned about energy efficiency. After all, the energy source for charging will be coming from the wall outlet, and I have no reason to minimize wasted energy (from the wall) in this application. Cost is a primary concern, as this is going into a mass-produced consumer product. PCB layout simplicity and component costs are key points for me. 

  • Uzair,

    The charger's efficiency affects the heat dissipation. The bq2057 is a linear charger; you can think of it like an LDO. Heat loss is determined by the difference between the input and output voltages and the charging current: power dissipated is roughly (Vin - Vbat) * Icharge. This can be significant if you have a high Vin and charge the battery when it is at a low voltage. If you are okay with the heat dissipation, and would like to build the bypass power path with an external circuit (protection FETs plus current and voltage limit control), bq2057 may work.
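
    To put an illustrative number on that estimate, here is the (Vin - Vbat) * Icharge calculation with assumed values (a 10V adapter and a deeply discharged 2S pack at about 3.0V per cell; substitute your actual adapter voltage and minimum pack voltage):

```python
# Illustrative linear-charger dissipation estimate, (Vin - Vbat) * Icharge.
# ASSUMPTIONS: 10 V adapter and a deeply discharged 2S pack (~3.0 V/cell);
# these are placeholder numbers, not measured or datasheet values.
V_IN  = 10.0   # V, assumed adapter voltage
V_BAT = 6.0    # V, assumed pack voltage at the start of charge
I_CHG = 1.7    # A, fast-charge current

P_PASS = (V_IN - V_BAT) * I_CHG   # heat in the external pass FET (Q1)
print(f"Pass-element dissipation is roughly {P_PASS:.1f} W at the start of charge")
# ~6.8 W worst case, tapering off as the battery voltage rises toward 8.2 V.
```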

    bq24133 provides control of both charging and system power, so you only need to add the input protection FETs.

    Thanks,

    Will