Buck-Boost and Selection of Rectifying Supply

Being an embedded rather than a power systems engineer, I have recently had a chance to explore TI's excellent Web Bench Power Architect tools.

However, there is one question for which I have not been able to find a suitable answer, either by searching the web or in the power electronics texts I have looked at so far:

Ideally, I am looking to purchase a single off-the-shelf rectifying AC/DC power supply to meet a particular project's needs. The project itself consists of several components whose requirements would then be met by a combination of TI power products/buck-boost circuits.

An example project outline might look something like this:

* Motor 1 - 24-75 VDC (typical current 1-4 A, 10 A max)
* Motor 2 - 24-75 VDC (typical current 1-4 A, 10 A max)
* Motor 3 - 3 VDC (2 A)
* Peripheral 1 - 35 VDC (3 A)
* Peripheral 2 - 35 VDC (3 A)
* Electronics - 5 VDC (500 mA)
My question, then, is whether anyone could offer advice or point me to the relevant theory/application material for selecting an appropriate output voltage for the rectifying supply. Obviously the main requirement is that the supply covers the circuit's current needs with some margin. Beyond that, different output voltages will require different buck-boost circuits of equivalent function but varying layout size, cost, and BOM. In addition, certain components have much narrower, stricter max/min supply voltage (VSS) band requirements than others.
But other than component size, complexity, and cost, what is the 'right' way to think about selecting a supply output voltage? Buck? Boost? Both? Does it matter?
I appreciate any assistance/thoughts you may be able to provide.
Best,
- Anthony
  • Hi Anthony,

    Not sure that I fully understand your question, but I will attempt an answer.

    The more voltage translation your switched-mode converter is doing, the poorer its efficiency will be: a buck converter will be more efficient when converting 70V to 65V than when converting 70V to 20V. If your objective is to minimize overall power loss, you should ensure the DC/DC converters supplying your highest-power outputs operate with minimum voltage translation.
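
    To make that concrete, here is a minimal sketch (purely illustrative) of why a deeper step-down hurts. It uses a very crude synchronous-buck loss model; the on-resistances, switching-charge term, and switching frequency are assumed placeholder values rather than figures for any specific part, and other real losses (gate drive, inductor, controller) are ignored.

```python
# Crude synchronous-buck loss model, illustrative numbers only.
def buck_efficiency(v_in, v_out, i_out, r_hs=0.05, r_ls=0.05, f_sw=300e3, q_sw=20e-9):
    """Rough efficiency estimate: conduction loss plus a V*Q*f switching term."""
    d = v_out / v_in                                   # ideal duty cycle
    p_out = v_out * i_out
    p_cond = i_out**2 * (d * r_hs + (1 - d) * r_ls)    # MOSFET conduction loss
    p_sw = 2 * v_in * q_sw * f_sw                      # rough switching loss
    return p_out / (p_out + p_cond + p_sw)

for v_out in (65.0, 35.0, 20.0):
    eff = buck_efficiency(70.0, v_out, 4.0)
    print(f"70 V -> {v_out:4.1f} V at 4 A: ~{eff * 100:.1f}% (crude estimate)")
```

    The absolute numbers mean little; the point is that in this simplified model the converter's losses stay roughly constant for a given output current while the delivered power shrinks with the output voltage, so the deep step-down case always looks worse.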

    A converter with an isolating transformer will tend to have relatively poor efficiency compared to a non-isolated DC/DC converter, but on the other hand the transformer allows it to do fixed voltage translation more efficiently.

    For best overall efficiency, I would suggest that your isolated stage should produce around 75V (your highest voltage requirement and main power consumer) and that this should be down-converted using non-isolated buck DC/DC stages for all your other outputs.
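
    To put rough numbers on that, here is a quick power-budget sketch for the loads listed in the question, assuming the architecture above (isolated 75V bus, non-isolated bucks for the lower rails). The 90% and 95% efficiencies are assumed placeholders, and each load is taken pessimistically as its worst-case voltage times its maximum current.

```python
# Rough power budget for the loads listed in the question. The efficiency
# figures are assumed placeholders (isolated front end ~90%, downstream
# bucks ~95%); substitute real numbers from the converters actually chosen.
loads = [
    # (name,          volts, typical_amps, max_amps)
    ("Motor 1",        75.0, 4.0, 10.0),   # taken at the top of its 24-75 V range
    ("Motor 2",        75.0, 4.0, 10.0),
    ("Motor 3",         3.0, 2.0,  2.0),
    ("Peripheral 1",   35.0, 3.0,  3.0),
    ("Peripheral 2",   35.0, 3.0,  3.0),
    ("Electronics",     5.0, 0.5,  0.5),
]

EFF_ISOLATED = 0.90   # assumed efficiency of the isolated 75 V stage
EFF_BUCK     = 0.95   # assumed efficiency of each downstream buck

typ_w = max_w = 0.0
for name, volts, i_typ, i_max in loads:
    stage_eff = 1.0 if volts >= 75.0 else EFF_BUCK   # lower rails pass through a buck
    typ_w += volts * i_typ / stage_eff
    max_w += volts * i_max / stage_eff

print(f"Load on the 75 V bus:    typical ~{typ_w:.0f} W, worst case ~{max_w:.0f} W")
print(f"Input to isolated stage: typical ~{typ_w / EFF_ISOLATED:.0f} W, "
      f"worst case ~{max_w / EFF_ISOLATED:.0f} W")
```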

    Clearly this is a theoretical ideal. There are many other considerations, such as safety, the need for isolation between power outputs, and available parts. In reality you will end up with some compromise.

    An even better approach, if this is an option, would be to use an isolated converter with multiple transformer taps to supply the reduced voltage levels required. For example, your isolated converter could produce 75V, 35V and, say, 6V from multiple taps. In this case you need to think about the cross-regulation conditions: if the 75V is fully loaded (20A) and the 35V is unloaded, how much will the 75V output fall and the 35V output rise? (Your regulation loop has only one demand signal, so it must look at either the 75V or the 35V, or an average of the two.) If cross-regulation is an issue, you may need to produce slightly higher voltage taps and then down-convert the outputs using a DC/DC converter. This will still provide better efficiency because most of the voltage translation is being done in the transformer.
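
    As a toy illustration of the cross-regulation point (assuming the loop regulates only the 75V output, and using made-up winding/rectifier resistances and load currents), the sketch below shows the unloaded 35V tap rising when the 75V output is fully loaded:

```python
# Toy cross-regulation model. The feedback loop is assumed to watch only the
# 75 V output, so the transformer EMF rises to hold that output at 75 V under
# load; the other taps follow the turns ratio minus their own IR drop.
# All resistances and currents are assumed, illustrative values.
V_SET = 75.0
taps = {
    # name: (nominal_volts, effective_output_resistance_ohms, load_amps)
    "75V": (75.0, 0.05, 20.0),   # fully loaded
    "35V": (35.0, 0.10,  0.0),   # unloaded
    "6V":  ( 6.0, 0.02,  0.5),
}

v_75_nom, r_75, i_75 = taps["75V"]
emf_75 = V_SET + r_75 * i_75            # EMF needed to hold the 75 V output at 75 V

for name, (v_nom, r_out, i_load) in taps.items():
    emf = emf_75 * (v_nom / v_75_nom)   # other windings scale by turns ratio
    v_delivered = emf - r_out * i_load
    print(f"{name:>3}: ~{v_delivered:5.2f} V delivered "
          f"({v_delivered - v_nom:+.2f} V from nominal)")
```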

    Hope this helps?

    Joe Leisten

  • Dear Joe,

    Thank you for your reply; I do think you get the gist of my question. My experience has primarily been in 'embedded' design, so I am familiar with putting together simple step-down solutions with TI ICs (e.g. 5V to 3.3V or 1.8V), but all with small voltage changes and minimal current.

    For my present project I have been asked to take over not only the embedded portion but also the motor and peripheral control power supplies, so I want to make sure I understand all the issues as well as I can before recommending how to proceed.

    A further example in this vein: for embedded work one might simply have a common layer with a divided power plane and a shared ground on the PCB (e.g. for the 5V, 3.3V, and 1.8V domains). But now I am starting to wonder whether all sorts of strange EMI (or worse) might occur if you have, say, a 30V / 2A plane sitting next to a 5V / 500mA one, and whether the different power-conversion stages ought to go on their own separate breakout boards rather than on just a single PCB as they would prefer.

    One thing that has surprised me is how little information I have been able to find via web search on this topic (i.e. proper PSU selection and the design considerations for powering multiple components with varying voltage/current requirements within a single system). Of the textbooks I have examined, most discuss the theory and design of a particular kind of circuit (e.g. the design of a buck converter), with less of this holistic view: how do I decide my input, whether to buck or boost, and what effect a sudden power draw or drop in one part of the circuit will have on the other components?

    I am not sure whether you are aware of a textbook or other resource that addresses these broader 'system-level' design questions and considerations, but a suggestion would be very useful to me.

    Sincerely,

    - Anthony

  • Hi Anthony,
    Unfortunately I am not aware of any textbook that covers the topics you require. Most of the available texts look in detail at the analysis, design and control of particular power converter topologies.
    With multiple-output power supplies, the normal approach is to source each rail from a separate, isolated transformer winding. The grounds of these separate, isolated supplies are connected together only at the point where power is consumed. This ensures that ground voltage drops within the power supply do not disturb the delivered voltage levels. This approach is effective even when all power rails are generated by the same transformer and on the same PCB.
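
    As a back-of-envelope feel for why that single-point grounding matters, here is a small sketch (the trace resistance and currents are assumed values) of how a heavy load returning through a shared ground trace shows up as an error on a low-voltage rail:

```python
# Ground-drop error on a shared return path, illustrative values only.
R_SHARED_RETURN = 0.010   # ohms of shared ground trace
I_HEAVY_LOAD    = 10.0    # amps returned by a motor through that trace
V_LOGIC_RAIL    = 5.0     # sensitive low-voltage rail sharing the return

v_ground_shift = I_HEAVY_LOAD * R_SHARED_RETURN          # 100 mV here
error_pct = 100.0 * v_ground_shift / V_LOGIC_RAIL

print(f"Ground shift: {v_ground_shift * 1000:.0f} mV "
      f"-> {error_pct:.1f}% error seen on the {V_LOGIC_RAIL:.0f} V rail")
```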
    There are many regulatory requirements for an off-line power supply. The line supply suffers from dips, drop-outs, surges, lightning strikes, etc., and your off-line power stage must be designed to cope with all of these safely. Radio-frequency injection into the line supply (EMI) is also restricted in most applications. EN standards covering most of these issues are available, detailing how the supply should be tested for compliance.
    I hope this helps.
    Joe Leisten