I'm considering FETs with a large enough gate capacitance, and a high enough switching frequency, that switching losses become significant (exceeding the Rds_on conduction losses at lower currents).
I've tried to model the expected switching losses of the FET using a model in which the gate driver is assumed to have a certain output resistance.
The DRV8301 datasheet provides maximum source and sink current values for the gate driver, but no resistance values. According to the datasheet:
V_GX_NORM: min 9.5 V, max 11.5 V
Peak source current (measured with FET Vgs = 2 V): 1.7 A
Peak sink current (measured with FET Vgs = 8 V): 2.3 A
I've tried approximating the source and sink resistance values using the formulas:
Rdriver_source = (V_GX_NORM - 2 V) / 1.7 A = (10.5 V - 2 V) / 1.7 A = 5.0 Ohm
Rdriver_sink = 8 V / 2.3 A = 3.48 Ohm
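For reference, here is a short sketch of the calculation above, extended with a plateau-based switching-time estimate. The driver is modeled as an ideal voltage source in series with a resistance; the Q_gd, V_plateau, and operating-point numbers below are assumed placeholder values, not DRV8301 or FET datasheet figures.

```python
# Estimate gate-driver output resistance from the datasheet peak-current
# test points (driver modeled as voltage source + series resistance).
V_GX_NORM = 10.5        # V, midpoint of the 9.5 V .. 11.5 V spec
I_SRC_PEAK = 1.7        # A, peak source current at Vgs = 2 V
I_SNK_PEAK = 2.3        # A, peak sink current at Vgs = 8 V

r_source = (V_GX_NORM - 2.0) / I_SRC_PEAK   # drop across driver while sourcing
r_sink = 8.0 / I_SNK_PEAK                   # drop across driver while sinking
print(f"R_source ~ {r_source:.2f} ohm, R_sink ~ {r_sink:.2f} ohm")

# Example use in a switching-loss estimate (Miller-plateau approximation):
# during the plateau, Vgs is pinned near V_plateau, so the gate current is
# roughly (V_drive - V_plateau) / R_driver and t_sw ~ Q_gd / I_gate.
Q_GD = 15e-9            # C, ASSUMED gate-drain (Miller) charge of the FET
V_PLATEAU = 4.5         # V, ASSUMED Miller plateau voltage
V_BUS, I_LOAD, F_SW = 24.0, 10.0, 40e3   # ASSUMED operating point

t_on = Q_GD * r_source / (V_GX_NORM - V_PLATEAU)   # turn-on V-I overlap time
t_off = Q_GD * r_sink / V_PLATEAU                  # turn-off V-I overlap time
p_sw = 0.5 * V_BUS * I_LOAD * (t_on + t_off) * F_SW
print(f"t_on ~ {t_on*1e9:.1f} ns, t_off ~ {t_off*1e9:.1f} ns, "
      f"P_sw ~ {p_sw:.2f} W")
```

Note that these resistances are derived from peak-current test points, so they are lower bounds on the effective resistance over a full switching transition; the driver's output stage is nonlinear, and any external series gate resistor adds directly to R_driver.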
Are these reasonable approximations for the gate driver's output resistances, or should I use something else?