
ADC128S102QML-SP: Is VA/VD power sequencing required during power-down?

Part Number: ADC128S102QML-SP
Other Parts Discussed in Thread: ADC128S102EVM, LM4120

Hello - I am planning to use this component in a space application and would like to better understand the recommended design guidance for protecting the device during power-down.

I see in the datasheet that VD must be less than VA + 300mV, up to a maximum of 6.5V.

To power the ADC I have two separate 3.3V LDOs, with VD tuned to be nominally slightly below VA. Additionally, these LDOs are sequenced during power-on such that VA always powers up first.

During power-down the design does not currently provide for any sequencing or other control to prevent VD from exceeding VA. What is the recommended design guidance in this case?

My initial assumption was that power sequencing or some other control is not necessary during power-down because there is no active power source to drive damaging current into the ADC; only the output capacitance of the LDOs (about 15uF each) remains.

Would the TI engineering team provide further rationale on whether the above is appropriate?

Thank you,

Alex

  • Hello Alex,

    We have an FAQ on this subject, and yes, the power-down sequence must also abide by the requirements for VD to be less than VA. https://e2e.ti.com/support/data-converters-group/data-converters/f/data-converters-forum/988726/faq-adc128s102qml-sp-power-down-sequence

    If the VD voltage is above the VA voltage by more than a diode drop (0.3V), an internal diode-like structure will conduct, and depending on what else is connected to VA this could result in large current flows that will damage the product.  Note that a 15uF capacitor can easily source transient currents much greater than the 10mA we often specify for similar internal structures that are not intended to conduct current during normal operation.
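
    For a rough sense of scale, here is a back-of-the-envelope sketch in Python of the current such a capacitor can deliver; the 0.5V excursion and 10us transition time below are assumed purely for illustration and are not datasheet values.

        # Rough estimate of the transient current a 15uF LDO output capacitor
        # could push through the internal VD->VA structure while its voltage
        # changes. I = C * dV/dt; the 0.5V step and 10us transition time are
        # assumptions chosen only to illustrate the order of magnitude.
        C_out = 15e-6   # F, LDO output capacitance mentioned above
        dV    = 0.5     # V, assumed voltage change during the power-down transient
        dt    = 10e-6   # s, assumed transition time

        i_peak = C_out * dV / dt
        print(f"Approximate transient current: {i_peak * 1e3:.0f} mA")  # ~750 mA, far above 10mA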

    Recommendations in order of effectiveness would be:

    • 1.)  Arrange for some supply sequencing such that the VD voltage ramps down before the VA voltage
      • This could be coordinated by controlling the LDO enable signals to disable the VD supply first, and/or
      • by adding circuitry that discharges the VD node to GND more quickly, forcing it to fall faster than VA (a rough sizing sketch follows this list).
    • 2.)  Provide a method to limit the current that can flow to <10mA during any condition where VD > (VA + 0.3V).
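
    For option 1b, here is a minimal sizing sketch in Python comparing how quickly each rail discharges. All component values and load currents are assumptions for illustration, not recommendations.

        # Option 1b sketch: size a bleed path on VD so it discharges faster than VA.
        # All values below are assumptions for illustration only.
        import math

        C_vd, C_va = 15e-6, 15e-6   # F, LDO output capacitances from the original post
        R_bleed    = 100.0          # ohm, assumed discharge resistor switched onto VD at power-down
        I_va_load  = 5e-3           # A, assumed steady analog load on VA during power-down

        tau_vd  = R_bleed * C_vd                  # VD decays exponentially with tau = R*C
        t_vd_90 = tau_vd * math.log(10)           # time for VD to fall by 90%
        t_va_90 = C_va * (0.9 * 3.3) / I_va_load  # VA falls roughly linearly under a constant-current load

        print(f"VD 90% discharge: {t_vd_90 * 1e3:.2f} ms, VA 90% discharge: {t_va_90 * 1e3:.2f} ms")
        # The goal is t_vd_90 well below t_va_90 so that VD never exceeds VA + 0.3V.
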
  • Thank you, Collin. I need to work this out on my end; your option 1b is likely the best choice for us. I'd like to leave this issue open for the moment in case I have any follow-up questions in the coming days. Alex

  • Hello Alex,

    Understood, we'll leave it open.  Just reply back and I'll get notified.  If you're willing, we can have a private discussion at that point, as I'd be very interested in learning as much as you can share about how you're using the ADC128S102-SP in your designs.

  • Hi Collin,

    I was looking at the ADC128S102CVAL evaluation board available from TI and thought the schematic was interesting: https://www.ti.com/lit/ug/snau093/snau093.pdf

    There appears to be an option to install opamp U3 to generate a VD supply relative to VA, but the documentation shows that U3 is not installed and VD may either be connected directly to VA or driven separately by the user.

    While my current design uses LDOs rather than opamps, I think the operating principle is not dramatically different. Admittedly, I may have more capacitance than on the eval board, but I can't say for certain because the C19/C21 values are not specified!

    What was the motivation behind not stuffing U3? Is it related to my question about power-down supply sequencing?

    Alex

  • Hi Alexander,

    To be honest, I'm not that familiar with the decisions made in the design of this eval board, since it was obsoleted by TI a few years back.  It's common for some of the documentation and boards to still be out on distributor websites, and we'll answer questions as best we can.

    With that said, after reviewing the user's guide and schematic, I agree with not populating U3 for this build. It would almost assuredly violate the VD <= VA condition during power-down, since the VD signal is intentionally delayed relative to the VA signal.

    A solution should be identified to turn on VA before VD, and turn off VD before VA.  The U3 circuitry accomplishes the first goal, but not the second.  Another reason to avoid the U3 solution is that, unlike LDOs, op-amps will often glitch to the power rail during startup, and this could also cause unexpected voltages on VA and VD.

  • Ah, I did not realize that the eval board I linked to was obsolete.

    Looking again, I found an active eval board, ADC128S102EVM (https://www.ti.com/lit/ug/snau167/snau167.pdf). It appears to be designed to be compatible with the LaunchPad Development Kit MSP430F5529LP (https://www.ti.com/lit/ug/slau533d/slau533d.pdf).

    The eval board user guide suggests that the ADC's 3.3VA is regulated by an LDO (LM4120) directly from the 5V USB input of the LaunchPad. The 3.3VD supply comes from the LaunchPad SMPS (TPS62237), which is also powered from the same 5V.

    I am unsure whether this design respects the VD < VA + 300mV constraint during power-down either. Given that the 3.3VD output capacitance is 20x larger than the 3.3VA output capacitance, if the digital and analog supply loads are presumed to be roughly equal, then a significant voltage imbalance could result during power-down.
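
    As a rough illustration of this concern (the capacitances and load currents below are assumed, not taken from either user's guide):

        # Back-of-the-envelope check: with roughly equal loads, the rail with
        # 20x the capacitance decays 20x more slowly, since dV/dt = I / C.
        # All values are assumptions for illustration only.
        C_va, C_vd = 1e-6, 20e-6   # F, assumed 3.3VA vs 3.3VD capacitance (20x ratio)
        I_va = I_vd = 10e-3        # A, assumed roughly equal analog and digital loads

        dvdt_va = I_va / C_va      # V/s decay rate of 3.3VA
        dvdt_vd = I_vd / C_vd      # V/s decay rate of 3.3VD

        t = 1.0 / dvdt_va          # time for 3.3VA to fall by 1.0V
        print(f"When 3.3VA has dropped 1.0V (after {t * 1e6:.0f} us), "
              f"3.3VD has only dropped {dvdt_vd * t * 1e3:.0f} mV, so VD exceeds VA + 300mV")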

    Is the rationale for the validity of the LaunchPad + Eval Board design that the digital supply load will outpace the analog supply load during power down, such that an adverse voltage imbalance will not occur? If this is an argument that can be made, I'd like to understand it so I can put forward a similar rationale on my end.

  • Hi Collin,

    I'm not sure whether you received my most recent question from 11 days ago, but I wanted to let you know that my current plan of action is to install a Schottky barrier diode between DVCC and AVCC (part number: Microsemi 1N6857UR-1). I may include a ferrite bead in series to attenuate any high-frequency noise that may leak through.
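
    A quick sanity check of the clamp idea is sketched below: during power-down the diode holds DVCC within one forward drop of AVCC, so the forward voltage at the expected discharge current needs to stay under the 300mV limit. The Vf-vs-current pairs are assumed typical small-Schottky figures, not values from the 1N6857UR-1 datasheet, and a series ferrite bead would add its DC resistance times the current to the clamp voltage.

        # Sanity check of the Schottky clamp: Vf at the expected discharge
        # current must stay below the 300mV limit. The Vf figures below are
        # assumed/typical values, not from the 1N6857UR-1 datasheet.
        vf_at_current = {      # {current (A): assumed forward voltage (V)}
            1e-3:   0.18,
            10e-3:  0.25,
            100e-3: 0.35,
        }

        limit = 0.30           # V, datasheet VD < VA + 300mV condition
        for i, vf in vf_at_current.items():
            verdict = "OK" if vf < limit else "exceeds 300mV -> needs review"
            print(f"I = {i * 1e3:>5.0f} mA: Vf ~ {vf:.2f} V -> {verdict}")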

    Thanks, Alex

  • Hi Alex,

    It may be best not to try to reverse-engineer some of the decisions made on the ADC128S102EVM, as it was created before our team was supporting these products.  Relying on the digital load current to pull down the DVDD node would have to be tested at a system level.  Power-supply sequencing is always the best option; however, we understand this is not always possible.

    The solution you mention should be tested before production, but it should work.  The diode shouldn't conduct while AVDD is ramping up ahead of DVDD, and it will discharge DVDD into the AVDD path during power-down.  Provided the AVDD path can properly sink the required DVDD current, the DVDD pin should remain within one forward drop of the external diode above AVDD.
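
    As a rough, assumption-only sketch of how much current that AVDD path might need to sink (the slew rate below is assumed, not measured):

        # Once the external diode conducts, DVCC tracks AVCC + Vf, so the DVCC
        # capacitance is discharged at the same rate AVCC falls.
        # Both values below are assumptions for illustration only.
        C_dvcc    = 15e-6    # F, DVCC-side capacitance from the original post
        dvdt_avcc = 1000.0   # V/s, assumed AVCC power-down slew (3.3V in ~3.3ms)

        i_sink = C_dvcc * dvdt_avcc
        print(f"Current the AVDD path must sink during power-down: ~{i_sink * 1e3:.0f} mA")  # ~15 mA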

  • Collin,

    It is disappointing to hear that there is no technical insight to be gained from TI's EVM. I appreciate your concurrence on the diode solution; I will build a development unit in order to verify this solution by test.

    Best, Alex