
DRV8301 gate drive charge pump regulator efficiency? Other options?


Hi,

I recently began looking into the DRV8301 for a 60V 70A brushless motor driver. One of the main design goals is to keep the efficiency very high at low output power, so when I first saw the device I figured it would use the buck regulator to reduce gate drive power usage. But that doesn't seem to be the way it's used.


So, two related questions:

1) How efficient is the charge pump regulator used for the gate drive?

2) And, if it doesn't compare well to the buck regulator's efficiency, could I set the buck regulator to 12V or 15V and use it to drive the gates, either directly via the GVDD pin or indirectly through PVDD1 and the charge pump regulator? (The goal with the second option, of course, is to get much better efficiency from the charge pump with a lower PVDD1, so if it doesn't, the idea is moot.) Obviously the PVDD1 pin will be fine at 12 or 15 volts instead of 60, but is there a problem with the boost voltage going up to 70 volts or so if PVDD1 is at 12V? And, if the charge pump regulator is back-driven at 15 volts, will it shut down politely and let the external regulator take over, or complain a lot?


One more secondary question - is my battery voltage too high to use safely with the DRV8301 and its buck regulator? Fully charged, the battery voltage could reach 62V. It sounds like I would at least need to put in more anti-surge protection, compared to a gate driver rated for 100V or more, but I am not sure which measures would be most appropriate.


Thanks!

  • Hi Jason,

    1) I believe the efficiency gains you would get trying to repurpose the buck regulator as the gate drive supply would be quite negligible. A charge pump is in essence a DC to DC converter already. I am unsure if we have any exact numbers on hand for this.

    2) The GVDD pin has an overvoltage warning, but the device can be damaged if its limit (13.2V) is exceeded. In general, we do not recommend driving GVDD externally.

    3) This would be extremely close to the device limits, and 62V would already exceed the recommended operating conditions. This does not take into account back-driving from the motor, which will pump up the supply voltage, or inductive flyback, which will spike the outputs. You would have to take a lot of care to prevent these events and also ensure the battery does not fully charge.

  • Hi Nicholas,

    Thanks for the quick reply.


    Correct me if I'm wrong, but (after reading up on switched-capacitor regulators) the single flying capacitor on the DRV8301 seems to indicate a maximum current ratio of 2:1, and thus an efficiency approaching 2 times that of a linear regulator.
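
    Spelling that out with round numbers (my own back-of-envelope, nothing from the datasheet): a linear regulator dropping 60 V to 10 V runs at 10/60 ≈ 17% efficiency, while an ideal 2:1 switched-capacitor stage draws only half the output current from the 60 V rail, so its ceiling is about 2 * 10/60 ≈ 33%, i.e. roughly twice the linear figure before switching losses.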

    So, for our planned MOSFETs with ~60nC total gate charge each at 10V, an initial PWM frequency target of 50kHz, and peak battery voltage of 60V (not 62), I get the following minimum values for gate drive power:

    Per MOSFET:

    10V at the gate: 60E-9 C (60 nC) * 50000 Hz * 10 V = 30 mW  (3 mA average current per gate)

    60V with linear regulator: 180 mW

    60V with switched capacitor regulator: 90 mW

    60V with 85% efficient buck regulator: 35 mW

                               FOC control, 6 gates switched:    6-step control, 2 gates switched:

    Linear regulator:                    1080 mW                             360 mW

    Switched capacitor:               >= 540 mW                              180 mW

    Buck regulator:                       210 mW                              70 mW
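
    For reference, here's a quick sketch that reproduces these totals under the stated assumptions (60 nC gate charge, 50 kHz PWM, 10 V gate drive, 60 V supply, 85% buck efficiency, ideal 2:1 switched-capacitor stage):

        # Back-of-envelope gate drive power (assumed values, not datasheet figures)
        Q_G = 60e-9       # total gate charge per MOSFET [C]
        F_PWM = 50e3      # PWM frequency [Hz]
        V_GATE = 10.0     # gate drive voltage [V]
        V_BATT = 60.0     # supply voltage [V]
        EFF_BUCK = 0.85   # assumed buck regulator efficiency

        i_gate = Q_G * F_PWM              # 3 mA average gate current per MOSFET
        p_gate = i_gate * V_GATE          # 30 mW delivered at the gate

        # Power drawn from the 60 V rail per MOSFET, by regulator type
        p_linear = i_gate * V_BATT        # 180 mW: full 60 V dropped linearly
        p_sc = p_linear / 2               # 90 mW: ideal 2:1 switched-capacitor stage
        p_buck = p_gate / EFF_BUCK        # ~35 mW

        for name, p in [("Linear", p_linear), ("Switched cap", p_sc), ("Buck", p_buck)]:
            print(f"{name:13s} FOC (6 gates): {6 * p * 1e3:5.0f} mW   6-step (2 gates): {2 * p * 1e3:4.0f} mW")

    (The buck totals come out a couple of mW higher here than in the table, since the table rounds 35.3 mW per gate down to 35 mW.)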

    If all the gates are switched, the gate drive power looks quite high for our power budget, would heat up the DRV8301 quite a bit, and wouldn't allow going to a PWM frequency of 100 kHz if we needed to (due to the current and/or thermal limits of the DRV8301). For the 6-step control option, the added efficiency of the buck regulator is not so significant, and the switched-capacitor arrangement is certainly an improvement over a straight linear regulator. But we don't know yet how well 6-step commutation and a low PWM frequency will work; we might end up needing FOC and 100 kHz.

    I realize that putting 12V into the GVDD pin is not an approved use of it, though I was curious whether it would work. But what is the answer regarding a pre-regulated 12V gate drive supply on the PVDD1 pin, with 60V on PVDD2, the MOSFETs, the boost capacitor, etc.?

    thanks,

    Jason

  • Hi Jason,

    Even in your worst case scenario (540mW) you are looking at a junction increase of ~15C from ambient (theta JA is 30C/W for this package). This is not an issue in most applications. There will be some other heat sources on the device but since it is a pre-driver these will be minimal. Theta JA is a very "optimistic" rating and most PCBs perform worse than the JEDEC test board, but it gives us an idea of the ballpark we are in.
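
    As a rough check on that figure: the junction rise works out to about P * theta JA = 0.54 W * 30 C/W ≈ 16 C above ambient, before any derating for a real board layout.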

    Generally the extra cost of utilizing a buck converter to supply the gate drive would rule it out compared to a charge pump, even if you are gaining some efficiency.

    In most systems we see designers focus on optimizing the switching half-bridges themselves, as that is where most of the system efficiency is lost. A 4.2kW inverter will dominate your system's efficiency ratings.
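
    As a rough illustration (hypothetical numbers, not from your design): with 5 mohm MOSFETs conducting 70 A, the conduction loss in a single FET is already on the order of I^2 * RDS(on) = 70^2 * 0.005 ≈ 24.5 W, so even the 540 mW worst-case gate drive figure is small next to the bridge losses you will be optimizing.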

  • In response to your last question...

    The gate driver and buck regulator are separate devices inside the IC; it is a multi-chip module. PVDD1 supplies the gate drive and PVDD2 supplies the buck regulator.

    PVDD1 and PVDD2 can be separate voltage supplies, but PVDD1 should be the same supply as the power MOSFETs. The VDS overcurrent detection is implemented through PVDD and SH_X. There are also several paths that can back-feed through the device if SH_X (the switch node) is greater than PVDD1.