
ADS4149: Input Clock relationship to Output

Part Number: ADS4149

Hello,

We are using hardware that implements a 208 MHz LVDS DDR data interface between the ADS4149IRGZT and an FPGA. We have a version of the board whose only modification is an 80 ps phase shift on the ADC input clock. We have now had to adjust timing constraints in the FPGA and ADC register settings in order to get DDR data clocking into the FPGA correctly. Are there any known cases in which adjusting the delay on the input clock also changes the characteristics of the output clock, specifically a shift of the output data clock relative to the output data (not overall system latency)?
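
For a rough sense of scale, here is a minimal sketch of the per-bit DDR timing budget at this rate; the FPGA setup/hold figures are placeholder assumptions, not datasheet or measured values:

```python
# Rough DDR timing budget at 208 MHz (a sketch, not a sign-off analysis).
# The FPGA setup/hold figures are placeholder assumptions; real margins should
# use the ADS4149 datasheet numbers and the FPGA timing report.

clock_mhz = 208.0
clock_period_ps = 1e6 / clock_mhz        # ~4808 ps clock period
bit_window_ps = clock_period_ps / 2      # DDR: one bit per half period, ~2404 ps

fpga_setup_ps = 400.0                    # assumed capture-register setup requirement
fpga_hold_ps = 400.0                     # assumed capture-register hold requirement
skew_ps = 80.0                           # the added input-clock phase shift

ideal_margin_ps = (bit_window_ps - fpga_setup_ps - fpga_hold_ps) / 2

print(f"bit window               : {bit_window_ps:.0f} ps")
print(f"ideal margin per side    : {ideal_margin_ps:.0f} ps")
# Only relevant if the 80 ps actually shows up as clock-to-data skew at the FPGA:
print(f"tight-side margin - 80 ps: {ideal_margin_ps - skew_ps:.0f} ps")
```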

  • Hi Nelson,

    Are you asking whether the ADC latency is a function of the input clock phase? If so, the answer is no. The latency is fixed regardless of the input clock's phase. If this is not what you are asking, could you please rephrase the question?

    Thanks, Chase

  • Sure. Basically, in the old design we had a delay, let's call it X, on the input clock to the ADC. In the new design, we have a delay of X − 80 ps on the input clock line. Does this in any way impact the output data or output data clock timing relationships? I'd think not, but given the issue we have seen, we are left wondering... The FPGA interface is entirely synchronous to the output data clock, with no relationship to the input clock.

  • Hi Nelson,

    The input clock's phase (in a single-device system) is viewed as arbitrary and irrelevant from the standpoint of the ADC, so it would not have the effect you are describing.

    What is the duty cycle of the input clock with the 80 ps delay? Does it match the duty cycle of the original, undelayed input clock? Is it possible for you to measure the duty cycle of the output data clock as well and compare it for the two different input clocks? (A short sketch after this reply shows why duty cycle matters for DDR capture.) Also, if you remove the delay, does the system still work as before? One final request: is it possible to provide a block diagram of the system?

    Regards, Chase
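
    A quick way to see why duty cycle matters for DDR capture: any deviation of the output data clock from 50% moves the falling-edge sample point. A minimal sketch with assumed (not measured) duty-cycle values:

    ```python
    # Effect of output-clock duty-cycle error on the DDR falling-edge sample point.
    # The duty-cycle values below are assumptions for illustration, not measurements.

    clock_mhz = 208.0
    period_ps = 1e6 / clock_mhz                  # ~4808 ps

    for duty in (0.50, 0.48, 0.52):              # hypothetical measured duty cycles
        high_time_ps = duty * period_ps
        shift_ps = high_time_ps - period_ps / 2  # falling edge vs. ideal 50% position
        print(f"duty {duty:.0%}: falling edge shifted {shift_ps:+.0f} ps from ideal")
    ```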

  • Hey Chase,

    OK, thanks for confirming that first point. The duty cycle of the input clock is 50%; the input clocks come from the same source, so yes, they match between the old and new designs. Good idea on trying to make the delay the same in both: we can theoretically do this via the FPGA without hardware modification (see the sketch at the end of this reply). Another complication is that some of the hardware we are working with is already boxed up and hard to probe.

    I don't believe I can provide a block diagram at this time...

    On a related note, have there been any changes to the silicon design of this product between 2016 and 2022?
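
    Regarding the delay matching via the FPGA mentioned above, here is a minimal sketch of converting the 80 ps trim into programmable input-delay taps; the 78.125 ps/tap resolution is an assumed figure (typical of an FPGA input-delay primitive running from a 200 MHz reference), not something from the ADS4149 documentation:

    ```python
    # Convert a desired 80 ps trim into FPGA input-delay taps (sketch only).
    # tap_resolution_ps is an assumed value; check the FPGA documentation for
    # the real tap size at the reference-clock frequency actually used.

    desired_delay_ps = 80.0
    tap_resolution_ps = 78.125               # assumed: 1 / (32 * 2 * 200 MHz)

    taps = round(desired_delay_ps / tap_resolution_ps)
    residual_ps = desired_delay_ps - taps * tap_resolution_ps
    print(f"use {taps} tap(s); residual error {residual_ps:+.1f} ps")
    ```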

  • I am almost certain there have been no changes to the silicon since the RTM date in 2009; however, I will ask others and confirm with you.

  • Hi Nelson,

    I've reached out to our design team to comment on whether there has been any silicon change. Typically, if a change does occur, all customers will receive a notice stating that there has been a change and again, I am doubtful of silicon modifications in this timeframe. Is this an issue you are seeing on a single setup or is this across multiple devices/setups? Have you been able to re-validate the original 0-delay clock input setup?

    Thanks, Chase

  • Hi Chase,

    Thanks for your help. I don't have access to hardware to test further at this time, but I will update this thread when I get a chance to try other devices and the modified input-clock delay.

  • Okay, I will also update whenever I receive word back from design.

    Regards, Chase