Hi,
The gauge documentation suggests selecting a 50 ppm/°C sense resistor. I assume this is to keep the resistor's temperature response small compared to the gauge's other tolerances over temperature. Please verify whether the statements below are correct. Thanks.
- Mark
1. The sense resistor tempco impacts gauge accuracy at temperatures away from 25°C while the gauge is coulomb counting between rested voltage measurements.
2. Let’s assume 0-50°C, i.e. ±25°C around room temperature (typical for a consumer product). At 100 ppm/°C: 25 × 100e-6 = ±0.25%. At 500 ppm/°C: 25 × 500e-6 = ±1.25%. Keeping the resistor tempco low minimizes its impact on accuracy over temperature (see the quick sketch after this list).
3. 500 ppm/°C (±1.25%) might give us something to factor into error calculations, but I think 100 ppm/°C (±0.25%) probably falls into the “don’t care” range.
4. The resistor tolerance directly impacts the accuracy of the coulomb counter in the IT gauge, which in turn affects things like time-to-empty predictions made between rested OCV measurements.
5. For bounded temperature ranges, higher-tempco sense resistors may be acceptable, and an analysis like the one above can show the expected impact on the gauge's predictions.
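
For reference, here is a quick back-of-envelope sketch in Python of the arithmetic in points 2 and 4 (my own sketch, assuming a simple linear tempco model; the function name is illustrative, not from any TI tool):

# Worst-case fractional resistance error from tempco over a temperature range.
# Assumed linear model: R(T) = R25 * (1 + tempco * (T - 25)).
# Since the gauge computes current as I = Vsense / R_assumed, a fractional
# error in R maps 1:1 into coulomb-count (and thus capacity) error.

def worst_case_tempco_error(tempco_ppm_per_c, t_min_c, t_max_c, t_ref_c=25.0):
    """Largest fractional resistance error anywhere in [t_min_c, t_max_c]."""
    max_delta_t = max(abs(t_min_c - t_ref_c), abs(t_max_c - t_ref_c))
    return tempco_ppm_per_c * 1e-6 * max_delta_t

for tempco in (50, 100, 500):
    err = worst_case_tempco_error(tempco, 0, 50)
    print(f"{tempco:>3} ppm/°C over 0-50°C -> up to ±{err * 100:.3f}% coulomb-count error")

This reproduces the numbers above: 100 ppm/°C gives ±0.25% and 500 ppm/°C gives ±1.25% at the temperature extremes, while the recommended 50 ppm/°C part comes in at ±0.125%.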