
BQ27520-G4: Fuel Gauge not detecting current properly

Part Number: BQ27520-G4

Hi,

I have a board design that uses a bq27520 as the fuel gauge. I have run through chemistry ID selection, learning cycles, calibration, and golden image generation.

I am running into issues when trying to measure the current drawn by the device. I first program firmware onto the board's main MCU that shuts down all peripherals, enables the bq27520, and puts the MCU into its low-power state. I then connect to the bq27520 using the EV2300 programmer and TI Battery Management Studio (bqStudio). I program the fuel gauge with my "golden binary," issue a reset, enable Impedance Track, and send a battery-insert command. At that point I can see the battery voltage, an approximate state of charge (%), and the measured current.

I then run the calibration procedure on the new device:

1. CC Offset calibration.
2. Board Offset calibration with the sense resistor manually shorted.
3. Current calibration: I tie a load to the 3 V rail of my system, pull 150 mA from it, measure the actual current drawn from the battery, and run Calibrate Current with that value.
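For reference, the voltage and current values I am reading correspond to the gauge's standard Voltage() and AverageCurrent() commands. Here is a minimal sketch of polling them over I2C, assuming a Linux host with an SMBus adapter rather than the EV2300; the 0x55 address and the 0x08/0x14 command offsets are my reading of the datasheet, so double-check them against your TRM:

```python
from smbus2 import SMBus

BQ27520_ADDR = 0x55   # 7-bit I2C address (0xAA/0xAB in 8-bit notation)
VOLTAGE = 0x08        # Voltage() standard command, unsigned, mV
AVG_CURRENT = 0x14    # AverageCurrent() standard command, signed, mA

def read_signed_word(bus, cmd):
    """Read a 16-bit little-endian register and sign-extend it."""
    raw = bus.read_word_data(BQ27520_ADDR, cmd)
    return raw - 0x10000 if raw & 0x8000 else raw

with SMBus(1) as bus:  # bus number depends on the host adapter
    mv = bus.read_word_data(BQ27520_ADDR, VOLTAGE)
    ma = read_signed_word(bus, AVG_CURRENT)
    # Negative current = discharge, positive = charge
    print(f"Voltage: {mv} mV, AverageCurrent: {ma} mA")
```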

Using this procedure, I have successfully calibrated 3 devices and failed with 11.

What I mean by failing is this: when I connect to a device after flashing the "golden binary" and before individual device calibration, the resting current reported by the fuel gauge is approximately +30 mA. When I measure the current myself, the actual draw is approximately -1 mA. When I apply the load, the reading may drop a few mA, but the device never detects a discharge.
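As a sanity check on what that +30 mA phantom reading implies at the sense inputs (simple Ohm's-law arithmetic, nothing gauge-specific):

```python
R_SENSE_OHM = 0.050       # 50 mOhm sense resistor on this board
FALSE_READING_MA = 30.0   # resting current reported by the failing gauges

# The differential offset at SRP/SRN that would explain the reading:
offset_mv = FALSE_READING_MA * R_SENSE_OHM   # mA * Ohm = mV
print(f"Implied SRP/SRN offset: {offset_mv:.2f} mV")  # -> 1.50 mV
```

So the failing units behave as if there were a ~1.5 mV offset across the sense inputs that the offset calibrations are not removing.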

With the devices that work, after flashing the golden binary they immediately and correctly report ~0 mA. When a load is applied, they update properly and show the negative current draw. These devices can be calibrated accurately.

My question is: why are so many devices showing this current measurement issue? I suspect something in the hardware setup (because a few of them do work), but I have not yet been able to identify any issues. The schematic closely follows the reference design, except for a 50 mΩ sense resistor instead of the 10 mΩ in the reference design. Even at a maximum current of 200 mA, I don't think we'd reach the maximum differential input on that pin pair: 50 mΩ × 200 mA = 10 mV, and I believe the input maximum is ±125 mV.
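The worked numbers behind that margin claim (the ±125 mV full-scale figure is my reading of the datasheet, so treat it as an assumption):

```python
R_SENSE_OHM = 0.050   # 50 mOhm sense resistor
I_MAX_A = 0.200       # 200 mA worst-case load
V_FS_MV = 125.0       # assumed +/-125 mV full-scale differential input

v_sense_mv = R_SENSE_OHM * I_MAX_A * 1000   # -> 10.0 mV
print(f"Sense voltage at max load: {v_sense_mv:.1f} mV "
      f"({100 * v_sense_mv / V_FS_MV:.0f}% of full scale)")
```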

Thanks for the help,

Eddie