Hello
I am facing the following issue:
My circuit uses a 47 Ω series resistor between the IN pin and the VS pin to supply the VS pin of the IC and the bypass capacitor C1 (cf. resistor R1 in fig. 27).
When the power supply feeding the IN pin is enabled (fast voltage rise), the series resistor initially sees the full supply voltage of 51 V, which corresponds to a dissipation of (51 V)²/47 Ω ≈ 55.3 W. This damages the resistor until it fails open-circuit.
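For reference, a quick sanity check of that worst-case number (just a back-of-envelope sketch; it assumes C1 starts fully discharged, so the full input voltage momentarily appears across R1):

```python
# Worst-case instantaneous dissipation in R1 at supply enable,
# assuming the bypass capacitor C1 starts fully discharged so the
# full input voltage appears across R1 for an instant.
V_IN = 51.0   # maximum supply voltage at the IN pin, in volts
R1 = 47.0     # series resistor value, in ohms

P_peak = V_IN**2 / R1   # P = V^2 / R
print(f"Peak dissipation in R1: {P_peak:.1f} W")  # ~55.3 W
```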
I could solve this issue by using a pulse-rated resistor, but I would prefer to keep it a generic 0603 resistor with a 0.1 W power rating.
Since the gate driver apparently requires at most 288 µA (table 6.6), I am thinking that the resistance of R1 can be as high as R = (30 V − 5 V)/288 µA ≈ 86.8 kΩ, where 30 V is my minimum supply voltage and 5 V is the minimum required supply voltage at the VS pin.
Furthermore, if the resistor is selected such that R > (51 V)²/0.1 W = 26 kΩ, the peak dissipation stays below the 0.1 W rating even with the full maximum supply voltage of 51 V across it, so it should be impossible to damage the resistor.
Therefore, I would assume that a resistor of e.g. 47 kΩ should work nicely.
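Putting the two constraints together (again a minimal sketch with the numbers above; 288 µA is the maximum VS supply current from table 6.6 and 5 V the minimum required VS-pin voltage):

```python
# Resistor sizing window for R1:
#  - upper bound: must still deliver 288 uA to the VS pin at the
#    minimum supply voltage while keeping VS at or above 5 V
#  - lower bound: peak dissipation V_max^2 / R must stay within the
#    0.1 W continuous rating of a generic 0603 resistor
V_MIN, V_MAX = 30.0, 51.0   # supply voltage range, in volts
VS_MIN = 5.0                # minimum required voltage at the VS pin
I_MAX = 288e-6              # maximum VS supply current (table 6.6)
P_RATED = 0.1               # 0603 resistor power rating, in watts

R_upper = (V_MIN - VS_MIN) / I_MAX   # ~86.8 kOhm
R_lower = V_MAX**2 / P_RATED         # ~26.0 kOhm
print(f"R1 window: {R_lower/1e3:.1f} kOhm .. {R_upper/1e3:.1f} kOhm")
print("47 kOhm inside window:", R_lower <= 47e3 <= R_upper)  # True
```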
However, the datasheet specifies a typical value of 100 Ω for R1, which is orders of magnitude lower.
Do you see any issues with using a resistor on the order of 47 kΩ for R1?
What is the recommended way to avoid overstressing the resistor R1 when the input voltage to the circuit is enabled?