TFP410-EP: IDCK input and DVI output clock jitter questions

Part Number: TFP410-EP
Other Parts Discussed in Thread: TFP410

The TFP410-EP datasheet lists a jitter tolerance on the IDCK input of 2 ns (typical), and a DVI output clock jitter max of 150-190 ps at 165 MHz, with a Note 4 saying the jitter is relative to IDCK.  I need help interpreting these specifications.

1. What does it mean when you say IDCK has only 2 ns of jitter tolerance?  The clock coming out of my FPGA measures about 80 ps of pk-pk jitter; does this mean I'm violating the 2 ns tolerance on the IDCK input?

2. Note 4 says that the output clock jitter of 150 ps is relative to the input clock, IDCK.  Can you explain what this means in a little more detail?  For example, if IDCK has 80 ps of jitter, will this be additive to the TFP410-EP jitter?  That is, will the DVI clock output jitter be 80 ps (IDCK) + 150 ps (TFP410) = 230 ps of jitter?  Or are you saying that the TFP410 DVI output clock jitter is limited to 150 ps and that the input jitter and output jitter are NOT additive?

3.  The DVI output clock jitter max is specified at 190 ps over the -55°C to 125°C temperature range at 165 MHz.  Do you have a jitter de-rating curve to show what the max jitter would be at 108 MHz over -40°C to 85°C?

Thank you for your help.

  • I am researching this and will get back to you shortly with some clarifications.
    Thanks,
    Wade
  • I have some more information.

    Unfortunately, this is a fairly old device, and its history is not easy to reconstruct, as most of the original team members are no longer available.

    1. What does it mean when you say IDCK has only 2 ns of jitter tolerance?  The clock coming out of my FPGA measures about 80 ps of pk-pk jitter; does this mean I'm violating the 2 ns tolerance on the IDCK input?

    The jitter characterization was performed with 2 ns of jitter applied, and operation was verified.  Technically, this should not be listed as a typical value; it should be the maximum allowed input jitter.  Your 80 ps is well within the tolerance.
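
    As a quick sanity check (a minimal sketch in Python; the 2 ns figure is treated as the maximum allowed pk-pk input jitter, per the clarification above):

        # Sketch: treat the datasheet's 2 ns figure as the maximum allowed
        # pk-pk jitter on IDCK and check a measured value against it.
        IDCK_JITTER_MAX_PS = 2000.0   # 2 ns, read as a max rather than a typical
        measured_pkpk_ps = 80.0       # FPGA clock jitter measured in the question

        margin_ps = IDCK_JITTER_MAX_PS - measured_pkpk_ps
        print(f"within tolerance: {measured_pkpk_ps <= IDCK_JITTER_MAX_PS}, "
              f"margin = {margin_ps:.0f} ps")
        # -> within tolerance: True, margin = 1920 ps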

    2. Note 4 says that the output clock jitter of 150 ps is relative to the input clock, IDCK.  Can you explain what this means in a little more detail?  For example, if IDCK has 80 ps of jitter, will this be additive to the TFP410-EP jitter?  That is, will the DVI clock output jitter be 80 ps (IDCK) + 150 ps (TFP410) = 230 ps of jitter?  Or are you saying that the TFP410 DVI output clock jitter is limited to 150 ps and that the input jitter and output jitter are NOT additive?

    This one is a little more difficult to decipher.  Practically speaking, there is a jitter transfer function from the incoming jitter to the outgoing jitter, so the two are not simply additive; the transfer varies with the frequency of the jitter.  Unfortunately, jitter transfer characterization was not performed for this generation of device.

    I believe the comment is intended to inform users that the IDCK is used to trigger the instrument capturing the jitter.  Thus the incoming jitter influences the measurement.  The documentation from the now-defunct DVI working group can be reviewed on the web archive.

    See: web.archive.org/.../DVI_TM_guide_REV1.pdf
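
    To illustrate what a frequency-dependent jitter transfer looks like in principle (a hypothetical first-order low-pass model; the 1 MHz corner frequency is an assumption for illustration only, not a TFP410-EP characteristic):

        import math

        # Hypothetical first-order jitter transfer function: slow jitter
        # (wander) passes through, fast jitter is attenuated.  The corner
        # frequency is invented for illustration; the real device was never
        # characterized for jitter transfer.
        F_CORNER_HZ = 1e6

        def jitter_gain(f_jitter_hz):
            """|H(f)| of a first-order low-pass jitter transfer model."""
            return 1.0 / math.sqrt(1.0 + (f_jitter_hz / F_CORNER_HZ) ** 2)

        for f in (1e4, 1e6, 1e8):   # slow wander ... fast jitter
            print(f"{f:9.0e} Hz jitter -> output/input gain = {jitter_gain(f):.3f}")

    The point is only that how much of the incoming jitter reaches the output depends on how fast that jitter is, which is why a single additive number cannot capture it.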

    3.  The DVI output clock jitter max is specified at 190 ps over the -55°C to 125°C temperature range at 165 MHz.  Do you have a jitter de-rating curve to show what the max jitter would be at 108 MHz over -40°C to 85°C?

    We do not have jitter de-rating information at different frequencies or temperature ranges.

    If this answers your question, please click "Verify it as the answer".
    Regards,
    Wade

  • Thank you for the information.

    I think I'm still confused about the output jitter spec of 150-190 ps.

    The information seems to suggest that as long as my input jitter on IDCK is less than 2 ns, the output jitter on the DVI clock will be less than 190 ps.

    Assuming I understand the DVI spec correctly, it says that the max jitter at the transmitter connector is 0.25 UI.  If my IDCK is 108 MHz, then Tpixel = 9.2593 ns, and 1 UI = 1/10 of Tpixel = 925.93 ps.  So 0.25 UI would be 231.48 ps.

    So if 2 ns of jitter on IDCK were directly additive to the DVI output clock jitter, there is no way the DVI output clock would be within the jitter tolerance of the DVI specification (231.48 ps max in this case).
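
    Here is that arithmetic in one place (a sketch; the 0.25 UI limit is my reading of the DVI spec, and the "additive" case assumes the full 2 ns IDCK allowance transfers straight through to the output):

        # Sketch of the DVI jitter-budget arithmetic above.
        IDCK_HZ = 108e6
        t_pixel_ps = 1e12 / IDCK_HZ    # 9259.26 ps pixel period
        ui_ps = t_pixel_ps / 10.0      # 10 TMDS bits per pixel clock -> 925.93 ps
        budget_ps = 0.25 * ui_ps       # 0.25 UI at the connector -> 231.48 ps

        tfp410_max_ps = 190.0          # datasheet max DVI output clock jitter
        additive_worst_ps = 2000.0 + 150.0   # if 2 ns of IDCK jitter were additive

        print(f"1 UI = {ui_ps:.2f} ps, budget (0.25 UI) = {budget_ps:.2f} ps")
        print(f"datasheet max ({tfp410_max_ps} ps) within budget: "
              f"{tfp410_max_ps <= budget_ps}")      # True
        print(f"additive case ({additive_worst_ps} ps) within budget: "
              f"{additive_worst_ps <= budget_ps}")  # False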

    So can I conclude that as long as IDCK jitter is less than 2 ns, the DVI output clock jitter from the TFP410-EP will be less than 190 ps?

  • I agree with your interpretation.

    Regards,

    Wade