
TPS543B20: SERDES 0.85V Voltage Issue

Part Number: TPS543B20

Hello Team,

I've incorporated the TPS543B20RVFT into my design to supply power to the FPGA. The output voltage from the IC should be 0.85V, but I'm measuring 0.9V initially, and after programming the FPGA it rises to 1V. What could be causing this deviation?

Thanks,

Shifali N

  •  

    It's hard to know what the issue might be without a schematic to look at.  Can you share the schematic?

    The output voltage for the TPS543B20 is set by the resistor on the VSEL pin and the resistors from Vout to RSP and from RSP to RSN.

    If the output voltage is increasing once the FPGA is turned on, I would look for a place in that path where the FPGA can sink current from the RSP node, which would increase the Vout-to-RSP voltage drop (see the rough sketch below).
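
    As a rough illustration of that mechanism, here is a small Python sketch of the remote-sense divider with an extra current sunk from the RSP node. The reference voltage, resistor values, and sink current are made-up placeholders, not values from this schematic; the only assumption is that the converter regulates V(RSP) - V(RSN) to the setpoint.

        # Hypothetical remote-sense divider: Vout --R_top-- RSP --R_bot-- RSN (~GND).
        # The regulator servos V(RSP) - V(RSN) to v_set, so any current sunk from
        # the RSP node has to be supplied through R_top, which raises Vout.
        def vout(v_set, r_top, r_bot, i_sink):
            # KCL at RSP with V(RSP) held at v_set:
            # (Vout - v_set) / r_top = v_set / r_bot + i_sink
            return v_set * (1 + r_top / r_bot) + i_sink * r_top

        v_set = 0.85              # V, setpoint selected via VSEL (placeholder)
        r_top, r_bot = 10.0, 1e9  # ohms; a huge r_bot approximates "no bottom resistor"
        print(vout(v_set, r_top, r_bot, 0.0))   # ~0.85 V with nothing loading RSP
        print(vout(v_set, r_top, r_bot, 5e-3))  # 5 mA sunk from RSP -> ~0.90 V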

  •  

    I am not seeing anything in the schematic or layout that I would expect to cause the output voltage to come up to 0.9V or rise further to 1.0V.

    It's generally not recommended to route the differential remote-sense pair under the switching node, even on layer 4, but that is unlikely to be the source of the issue. The .brd file does, however, appear to show a warning where RSP connects to the via on layer Sig 2.

    Would you be able to share the output voltage waveforms at 1μs/division and 100μs/division?

    Can you measure the voltage from RSP to RSN?   You can measure it across the top-side vias just left of Pins 1 and 2.

    If RSP - RSN is 0.85V and the output is 0.9V, check R521 and R518 to make sure they are 10Ω and not 10kΩ by mistake (see the quick calculation below).
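
    To see why the 10Ω-versus-10kΩ check matters, here is a quick back-of-the-envelope calculation. It assumes R521/R518 sit in series with the remote-sense path and that a small current flows through them (the 5µA below is a placeholder, not a measured value); the sense error simply scales with that resistance.

        # Voltage dropped across a series sense resistor by a small current.
        # With 10 ohms the error is negligible; with 10 kohms it is not.
        i_sense = 5e-6            # A, placeholder current in the sense path
        for r in (10.0, 10e3):    # intended value vs. 10 kohm fitted by mistake
            print(f"R = {r:8.0f} ohm -> error = {i_sense * r * 1000:.2f} mV")
        # R =       10 ohm -> error = 0.05 mV
        # R =    10000 ohm -> error = 50.00 mV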

  • Hi Peter,

    I've captured voltage waveforms at 1μs/division and 100μs/division, probing at point C231.

    Due to the micro vias at RSP and RSN, direct probing wasn't feasible, so I measured across C1216 instead. The multimeter reads 0.852V, while the oscilloscope (CRO) shows 920mV. Both R521 and R518 measure 10Ω. Do you think changing some components in the schematic could resolve this discrepancy?

    Thanks,

    Shifali

  •  

    If the multimeter is reporting 0.852V and the oscilloscope is reporting 0.920V, then the error is most likely in the oscilloscope measurement, specifically the voltage between the ground at the measurement point and the ground at the oscilloscope.

    An oscilloscope's input grounds are all shorted together, and they are connected to earth ground inside the scope. If there is a voltage difference between any of the ground clips on the oscilloscope probes and the scope's earth ground, that difference will appear across the ground wire between the scope and the probe. When using a 10:1 attenuation probe, the scope will then gain up that error by a factor of 10.

    A 7mV difference between the ground of the board being measured and the scope's input ground will therefore appear as a 70mV difference in the voltage the oscilloscope reports (a quick numeric check is at the end of this reply). Try using the multimeter to measure the voltage between the ground at the capacitor and the ground ring of the BNC connector and see if you measure a 7mV difference.

    If any other probes from the oscilloscope have ground clips connected to the board, remove them and repeat that measurement to make sure they are not the source of the error. If you still see a voltage difference between the PCB ground and the oscilloscope, it's likely a difference between the scope's earth ground and the board's ground.
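
    As a quick sanity check on the 7mV figure above, here is a small sketch of that measurement model. It assumes, as described above, that a DC offset between the probe's ground clip and the scope's earth ground gets multiplied by the 10:1 probe factor and added to the reading; the voltages are just the ones already quoted in this thread.

        # Apparent scope reading when a ground-lead offset is gained up by a
        # 10:1 probe, per the explanation above.
        def apparent_reading(v_true, v_ground_offset, probe_attenuation=10):
            return v_true + v_ground_offset * probe_attenuation

        v_true = 0.852   # V, multimeter reading across C1216
        offset = 0.007   # V, hypothetical board-ground to scope-ground offset
        print(apparent_reading(v_true, offset))  # ~0.922 V, close to the 920 mV on the scope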