BQ78350-R1: BAT input range

Part Number: BQ78350-R1

Hello;

I note the datasheet lists an Abs. Max on BAT of 1.0 V. What is the recommended nominal or minimum voltage I should scale to from a 12 V nominal, 15.5 V max cell stack? What is the 78350's ADC resolution? Since the 78350 uses this voltage for its gauging algorithm, I assume BAT should not be below some minimum value?

Thanks

  • Hi Jeffrey,

    Your scaling should be done such that the maximum stack voltage produces less than 1 V at the BAT pin, so the usable range on BAT is 0 to 1 V.

    Please see the TRM and the app note below for additional details.

    http://www.ti.com/lit/an/slua726a/slua726a.pdf

    thanks

    Onyx

  • I understand BAT must be below 1.0V...that's why I said "I note the datasheet lists Abs. Max on BAT of 1.0V"

    My questions again are:

    1. Since the 78350 uses BAT for its gauging algorithm, I assume BAT should not be below some minimum value to achieve accuracy. What does TI recommend this minimum voltage should be? Example: Vin(min) = 7 V, Vin(max) = 16 V, so the divider must be at least 1/16, and at Vin = 7 V, BAT = ~0.44 V. But what if I set the divider to 1/32? That would put BAT at 0.5 V max and ~0.22 V min. Would 0.22 V min cause any problems with the gauging accuracy? (The arithmetic is sketched at the end of this post.)

    2. What is the 78350's ADC resolution that measures BAT? 
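
    A quick sketch of the scaling arithmetic in point 1 above. This is a hedged illustration only: the 7 V / 16 V limits and the 1/16 and 1/32 divider ratios are the example values from this post, and nothing here is a TI recommendation.

    ```python
    # Scaling arithmetic for the BAT pin, using the example values above.
    # The 7 V / 16 V pack limits and the 1/16 and 1/32 dividers come from
    # the post; this is plain arithmetic, not a TI recommendation.

    V_PACK_MIN = 7.0    # V, example minimum stack voltage
    V_PACK_MAX = 16.0   # V, example maximum stack voltage
    BAT_LIMIT = 1.0     # V, datasheet limit on the BAT pin

    for divider in (16, 32):
        bat_min = V_PACK_MIN / divider
        bat_max = V_PACK_MAX / divider
        print(f"1/{divider} divider: BAT spans {bat_min:.2f} V to {bat_max:.2f} V "
              f"(limit {BAT_LIMIT:.1f} V)")

    # Expected output:
    #   1/16 divider: BAT spans 0.44 V to 1.00 V (limit 1.0 V)  <- no margin at the top
    #   1/32 divider: BAT spans 0.22 V to 0.50 V (limit 1.0 V)  <- half the range unused
    ```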

  • Hi Jeffrey,

    By default, the BQ78350-R1 uses the minimum individual cell voltage for EDV detection. There is a bit that can set the device to use the average cell voltage (calculated from the BAT pin), but that is not the default setting. The gauging accuracy will always be better when using individual cell voltages because you will be using the BQ769x0 voltage ADC. (A small sketch of the difference between the two sources is at the end of this reply.)

    The ADC resolution for the BAT pin is shown in Table 7.7 of the BQ78350-R1 datasheet.

    Best regards,

    Matt
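
    To illustrate the difference between the two EDV sources described above, here is a minimal sketch. The cell voltages and EDV threshold are made-up example values, not data from any device or a TI default.

    ```python
    # Minimum cell voltage vs. average cell voltage as the EDV comparison
    # source.  All numbers below are made-up example values.

    cell_voltages = [3.05, 3.28, 3.30, 3.29]   # V, hypothetical per-cell readings
    edv_threshold = 3.10                        # V, hypothetical EDV level

    min_cell = min(cell_voltages)                        # default source (per the reply above)
    avg_cell = sum(cell_voltages) / len(cell_voltages)   # BAT-pin-style average

    print(f"min cell = {min_cell:.2f} V -> EDV reached: {min_cell <= edv_threshold}")
    print(f"avg cell = {avg_cell:.2f} V -> EDV reached: {avg_cell <= edv_threshold}")
    # A weak cell trips the threshold on the min-cell source well before the
    # average does, which is one reason the comparison source matters.
    ```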

  • Thanks Matt;  

    A pox upon me for misunderstanding this stuff; it's just that I'm probably misinterpreting the TRM and usage guide.

    I assumed the opposite after reading the following in SLUA924 (November 2018), Using the bq78350-R1, page 10:

    "The voltage used for CEDV is a single-cell based reference, which can either use the CellVoltage1()…CellVoltage15() data or the ExtAveCellVoltage() data. To provide the best accuracy across a wide operating temperature range, use the default ExtAveCellVoltage(). Take care to reduce the errors in this external circuit by using low tolerance and low-temperature drift components. The key components in this circuit Q8, R47, and R48 are shown in Figure 7".

    So, should I disregard this statement as incorrect? If so, can I safely delete this external scaling circuit and only use the CellVoltagex values?

    Also, I don't see a Table 7 in either the TRM or the datasheet??? (And I'm wearing my reading glasses ;-D)

    Hmm... I didn't know the history of that comment in the app note, so I asked around. It appears to come from the fact that the BQ769x0 ADC accuracy decreases over a very wide temperature range (+/-10 mV accuracy at room temp vs. +/-40 mV over -40°C to 85°C). Over a wide temperature range, it may be that the BQ78350-R1 ADC on the BAT pin is more reliable for gauging. Thanks for pointing this out. (A rough comparison of the two paths is sketched at the end of this reply.)

    The ADC spec for the BQ78350-R1 is on Page 5-6 of the datasheet (Section 7.7 Electrical Characteristics: ADC). The ADC spec for the BQ769x0 is on page 13 of the BQ769x0 datasheet. 

    Matt
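
    A rough, back-of-the-envelope sketch of that tradeoff. The +/-10 mV and +/-40 mV figures are the BQ769x0 numbers quoted above; the BAT-path numbers (ADC error, resistor tolerance, and temperature drift) are assumed placeholders, included only to show how the external divider components enter the error budget.

    ```python
    # Error-budget comparison of the two gauging-voltage sources.
    # BQ769x0 figures are from the reply above; every BAT-path figure is an
    # ASSUMED placeholder, not a datasheet value.

    afe_err_room = 0.010    # V per cell, BQ769x0 at room temperature (from the post)
    afe_err_temp = 0.040    # V per cell, BQ769x0 over -40 C to 85 C (from the post)

    bat_adc_err = 0.003     # V per cell, placeholder BQ78350-R1 BAT ADC contribution
    r_tolerance = 0.001     # 0.1 % resistor tolerance (placeholder)
    r_drift     = 0.0005    # extra ratio shift from tempco over temperature (placeholder)
    v_cell_nom  = 3.2       # V, nominal cell voltage used to turn ratio error into volts

    divider_err  = (r_tolerance + r_drift) * v_cell_nom   # ratio error referred to one cell
    bat_path_err = bat_adc_err + divider_err

    print(f"BQ769x0 per-cell error: {afe_err_room*1e3:.0f} mV (room), "
          f"{afe_err_temp*1e3:.0f} mV (over temperature)")
    print(f"BAT-pin path (assumed): {bat_path_err*1e3:.1f} mV per cell")
    # With low-tolerance, low-drift divider parts, the BAT path can stay well
    # under the wide-temperature AFE figure, which matches the app note advice.
    ```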

    Ok, thanks Matt... I was staring at it and just didn't see it. 16-bit and a 1.22 V reference... but it's odd that BAT is limited to 1 V. This is why I asked about the recommended minimum input voltage, since the lower it is, the lower the resolution (i.e., using only half the range drops a bit, etc.).

    So, given the client's application will see temperature extremes, it sounds like they'll need to use this BAT voltage for gauging. (The resolution arithmetic is sketched below.)
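
    A quick sketch of the resolution point, using the 16-bit / 1.22 V figures mentioned above, the 16 V stack maximum, and the two divider ratios from earlier in the thread. Treating 1.22 V as the usable full scale is an assumption here; the intent is only to show how scaling down trades away resolution.

    ```python
    import math

    FULL_SCALE = 1.22    # V, ADC reference / assumed full scale at the BAT pin
    BITS = 16            # converter resolution mentioned above
    V_PACK_MAX = 16.0    # V, example maximum stack voltage from the thread

    lsb_pin = FULL_SCALE / 2**BITS   # volts per code at the BAT pin (~18.6 uV)

    for divider in (16, 32):
        lsb_stack = lsb_pin * divider            # volts per code referred to the stack
        bat_max = V_PACK_MAX / divider           # highest BAT-pin voltage actually seen
        eff_bits = math.log2(bat_max / lsb_pin)  # bits of the range actually used
        print(f"1/{divider}: {lsb_stack*1e3:.2f} mV/LSB at the stack, "
              f"~{eff_bits:.1f} bits of the converter range used")

    # Roughly: the 1/32 divider doubles the stack-referred LSB and gives up
    # about one bit of the converter's range compared with 1/16.
    ```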