Part Number: TPS546D24S
Tool/software:
Hello TI Team,
I am using the TPS546D24S in a single-phase, single-rail configuration with 12 V input and 0–15 A output.
I am reading telemetry values through PMBus (READ_VOUT, READ_IOUT, READ_TEMPERATURE_1).
At zero load, the device consistently reports –1.6 A to –2.0 A on READ_IOUT.
Once I perform an IOUT_CAL_OFFSET (39h) adjustment and store it to NVM, the readings become accurate and stable.
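For context, this is roughly how I decode the telemetry and build the offset word on my side. It is a minimal Python sketch assuming READ_IOUT and IOUT_CAL_OFFSET use the PMBus LINEAR11 format (5-bit signed exponent, 11-bit signed mantissa); the fixed exponent of −4 in the encoder is an assumption about my device's configuration, not something from the datasheet:

```python
def decode_linear11(word):
    """Decode a PMBus LINEAR11 word into a real value.

    Upper 5 bits: two's-complement exponent; lower 11 bits:
    two's-complement mantissa; value = mantissa * 2**exponent.
    """
    exp = (word >> 11) & 0x1F
    if exp > 15:          # sign-extend the 5-bit exponent
        exp -= 32
    mant = word & 0x7FF
    if mant > 1023:       # sign-extend the 11-bit mantissa
        mant -= 2048
    return mant * 2.0 ** exp

def encode_linear11(value, exp=-4):
    """Encode a value into LINEAR11 with a fixed exponent.

    exp = -4 gives 1/16 A resolution (an assumed setting).
    """
    mant = round(value / 2.0 ** exp)
    mant = max(-1024, min(1023, mant))           # clamp to 11-bit range
    return ((exp & 0x1F) << 11) | (mant & 0x7FF)

# Example: a raw READ_IOUT word of 0xE7E6 decodes to -1.625 A,
# so I would write +1.625 A into IOUT_CAL_OFFSET (0x39) to cancel it.
raw = 0xE7E6
offset_word = encode_linear11(-decode_linear11(raw))
```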
However, I have a few questions:

- Why does the default offset show such a large negative current (–1.6 A to –2.0 A) before calibration?
- Is this expected or normal behavior for the TPS546D24S?
- Does this offset vary part to part, and should it always be calibrated during production?
- Is there any hardware or layout-related factor (e.g. inductor, sense routing) that could increase this zero-load offset?
According to the datasheet, READ_IOUT accuracy is typically ±2.5 A to ±4 A depending on load and temperature (pp. 68–69), but I would like to confirm whether this zero-load bias is a normal device characteristic or indicates a setup issue on my board.
Thank you for your help and clarification!
Best regards,
Salman
