
bq77PL900 Vout calibration and measurement

Other Parts Discussed in Thread: BQ77PL900

I am trying to calibrate VOUT by changing CAL2-0 and following steps 1-6 listed in the datasheet.

In each step, I read VOUT with an MCU A/D input.

I found that it needs some delay time before the actual VOUT value can be read in each step.

I think the EV board has a 0.1uF capacitor on the VOUT pin, which creates the need for the delay.

Any suggestions for the delay time needed to read VOUT correctly?
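
For reference, the read sequence in each step looks roughly like this (a simplified sketch; the register-write and ADC routines are placeholders for my own drivers, and the delay value is the part I am asking about):

    #include <stdint.h>

    /* Placeholders for my own BQ77PL900 register-write and MCU ADC drivers. */
    extern void     bq_write_cal_bits(uint8_t cal2_0);  /* set CAL2-0                           */
    extern uint16_t mcu_adc_read_vout(void);            /* one A/D conversion on the VOUT input */
    extern void     delay_us(uint32_t us);

    static uint32_t settle_delay_us = 200;              /* <-- the value I am unsure about */

    /* One calibration step: change CAL2-0, wait for VOUT to settle, read it. */
    uint16_t read_vout_for_step(uint8_t cal2_0)
    {
        bq_write_cal_bits(cal2_0);
        delay_us(settle_delay_us);    /* needed because of the 0.1uF on the VOUT pin? */
        return mcu_adc_read_vout();
    }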

Thanks !

  • The cell monitor output has about a 600 ohm output impedance; with the 0.1uF cap it will have about a 60us time constant. Any filtering and the ADC input will likely add to the load and lengthen the time constant. However, as shown in the datasheet, it is a sampled value: a selected cell will be sampled about every 1.25ms. It could take several time constants for the analog signal to settle, so ~3 ms may be a reasonable delay.

    Also note that the cell monitor sampling will run during cell balancing. The cell monitor should be turned off during balancing to avoid measuring the balancing voltage rather than the cell voltage, and balancing should be turned off during cell measurement.
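
    As a rough sketch of that sequence on the host MCU (the function names below are placeholders for the user's own register-access and ADC code, not a TI-supplied API):

        #include <stdint.h>

        /* Placeholders for the user's own BQ77PL900 register and MCU ADC drivers. */
        extern void     bq_set_balancing(uint8_t enable);  /* cell balancing control           */
        extern void     bq_select_cell(uint8_t cell);      /* route cell N to the VOUT monitor */
        extern uint16_t mcu_adc_read_vout(void);            /* one A/D conversion on VOUT       */
        extern void     delay_ms(uint32_t ms);

        /* Measure one cell with balancing off and a settling delay before the A/D read.
           600 ohm * 0.1uF is ~60us per time constant, and the monitor output only
           updates about every 1.25ms, so ~3ms is a conservative settle time. */
        uint16_t measure_cell_counts(uint8_t cell)
        {
            bq_set_balancing(0);        /* stop balancing so it is not what gets measured */
            bq_select_cell(cell);
            delay_ms(3);
            uint16_t counts = mcu_adc_read_vout();
            bq_set_balancing(1);        /* resume balancing afterwards, if it was running */
            return counts;
        }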

  • Hello WM -

    I am using this chip also, and I have a few questions (I am still struggling to get the accuracy I desire for cell V measurements).

    From what you have said above, is it fair to say that if I request the bq77pl900 to present some cell's voltage, and then wait for a short settle period, say 200 microseconds, then the signal I will be presented with on VOUT is (most likely) from a single sample of the actual cell?

    Do I need to leave the signal selected on VOUT for at least 1.25 milliseconds to be assured of a signal constructed from two samplings of the actual cell? Or am I misunderstanding?

    Just FYI, I am feeding VOUT to a 14-bit ADC, and I see a lot more variance in the raw ADC counts than I'd like - the equivalent of many millivolts (sometimes 10 or even 20). I was hoping to get at least millivolt accuracy. This is measuring a highly precise fixed signal which is simulating a battery cell. It's possible I am getting noise somewhere into my ADC input, and I am still investigating that. I don't cell-balance while measuring, nor measure while balancing.

    Is it reasonable to expect millivolt (or better) accuracy with a good hardware design?

    cheers,

    Brian

     

  • Sorry, my mistake - 50 milliseconds is the sample period, not 1.25 milliseconds.

    Where did the 1.25 millisecond figure come from? The datasheet says about 120 microseconds between cells (within a scan), and 50 milliseconds between scans. So I'm confused :-)

     

  • There are two measurement systems: the safety system, which runs on the 50ms sample period and is always on, and the cell monitor output (Vout) system, which is user selectable. The user-selectable cell monitor system samples on a ~1.25ms interval.

    Gas gauges which use a similar AFE architecture use filtering for the cell voltages. When I connect a multimeter to Vout while monitoring a cell, it shows variation in the tenths of mV; meter filtering stabilizes this further. I expect you will need good filtering or averaging even with quiet supplies and A/D reference.
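
    For the averaging, even a simple single-pole IIR (exponential) filter on the raw A/D counts can help; this is only a generic sketch, nothing specific to the BQ77PL900:

        #include <stdint.h>

        #define FILT_SHIFT 4   /* filter weight 1/16: larger = smoother but slower */

        /* Exponential moving average in fixed point. acc holds the filtered value
           scaled by 2^FILT_SHIFT; seed it with (first_reading << FILT_SHIFT) so the
           filter does not have to climb up from zero. */
        static int32_t acc;

        uint16_t filter_vout(uint16_t raw_counts)
        {
            acc += (int32_t)raw_counts - (acc >> FILT_SHIFT);
            return (uint16_t)(acc >> FILT_SHIFT);
        }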

     

  • Thanks for that, WM. Do you suggest a settle time of 1.25 ms for the VOUT signal? I'm getting decent results with only a 200 us settle delay. I collect many samples, sort them, throw away some of the high and low values, and average the rest. Currently I collect 30 readings and average the 20 in the middle; more for calibration measurements.
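
    In code, that trimmed mean is roughly the following (a sketch of what I described, not production code):

        #include <stdint.h>
        #include <stdlib.h>

        #define N_SAMPLES 30
        #define N_TRIM     5   /* drop the 5 lowest and 5 highest, average the middle 20 */

        static int cmp_u16(const void *a, const void *b)
        {
            return (int)*(const uint16_t *)a - (int)*(const uint16_t *)b;
        }

        /* samples[] holds N_SAMPLES raw A/D readings of VOUT for one cell. */
        uint16_t trimmed_mean(uint16_t samples[N_SAMPLES])
        {
            uint32_t sum = 0;

            qsort(samples, N_SAMPLES, sizeof samples[0], cmp_u16);
            for (int i = N_TRIM; i < N_SAMPLES - N_TRIM; i++)
                sum += samples[i];
            return (uint16_t)(sum / (N_SAMPLES - 2 * N_TRIM));
        }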

    We have just implemented an external, per-cell calibration scheme, which is improving our repeatability to within plus or minus a couple of mV, not too bad. We present a known reference voltage (using a dummy cell) in the neighbourhood of 2.5V and measure the correction needed to make the VOUT measurement match, then repeat with a reference near 4.1V, and store the data to use in a linear interpolation that translates what we measure at VOUT to what's actually present. Still testing, but so far it looks good.
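
    The correction itself is just a two-point linear fit per cell, roughly like this (the struct and names are illustrative; the two points are captured with the dummy cell during calibration):

        #include <stdint.h>

        /* Per-cell calibration points, all in millivolts: the known reference voltage
           and what the VOUT path measured at that point. */
        typedef struct {
            int32_t ref_lo_mv;    /* reference near 2500 mV         */
            int32_t meas_lo_mv;   /* VOUT-derived reading at ref_lo */
            int32_t ref_hi_mv;    /* reference near 4100 mV         */
            int32_t meas_hi_mv;   /* VOUT-derived reading at ref_hi */
        } cell_cal_t;

        /* Translate a raw VOUT-derived reading into an estimate of the true cell
           voltage by linear interpolation between the two calibration points. */
        int32_t corrected_mv(const cell_cal_t *cal, int32_t meas_mv)
        {
            int32_t ref_span  = cal->ref_hi_mv  - cal->ref_lo_mv;
            int32_t meas_span = cal->meas_hi_mv - cal->meas_lo_mv;

            return cal->ref_lo_mv + (meas_mv - cal->meas_lo_mv) * ref_span / meas_span;
        }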

    What's surprising to me is how much correction is required. My raw measurements at VOUT vary between 4.17 and 4.25 V for reference values near 4.09 V. This variance is between cells; the 4.17 cell always measures around 4.17.

    Seems like a lot of correction to me - about 160 mV for my "worst" cell (4.25), and about 80 mV for my "best" (4.17). What do you think?

    regards,

    Brian