Other Parts Discussed in Thread: TFP410
The TFP410-EP datasheet lists a jitter tolerance of 2 ns (typical) on the IDCK input, and a maximum DVI output clock jitter of 150-190 ps at 165 MHz, with Note 4 stating that the jitter is relative to IDCK. I need help interpreting these specifications.
1. What does it mean that IDCK has only 2 ns of jitter tolerance? The clock coming out of my FPGA measures about 80 ps of pk-pk jitter; does this mean I'm violating the 2 ns tolerance on the IDCK input?
2. Note 4 says that the 150 ps output clock jitter is relative to the input clock, IDCK. Can you explain what this means in a little more detail? For example, if IDCK has 80 ps of jitter, is this additive to the TFP410-EP jitter? That is, would the DVI output clock jitter be 80 ps (IDCK) + 150 ps (TFP410) = 230 ps? Or are you saying that the TFP410 DVI output clock jitter is limited to 150 ps, and that the input and output jitter are NOT additive?
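To make the two interpretations concrete, here is the arithmetic I have in mind. The numbers are my measured FPGA jitter and the datasheet's 165 MHz spec; the root-sum-square case is my own assumption (it would apply only if the two sources were uncorrelated random jitter), not something I'm reading from the datasheet:

```python
import math

# My numbers (illustration only)
idck_jitter_ps = 80.0     # measured FPGA clock pk-pk jitter into IDCK
tfp410_jitter_ps = 150.0  # datasheet DVI output clock jitter @ 165 MHz

# Interpretation A: input and device jitter add directly (worst case)
additive_ps = idck_jitter_ps + tfp410_jitter_ps   # 230 ps

# Interpretation B: output jitter is bounded at the 150 ps spec,
# independent of the input jitter
bounded_ps = tfp410_jitter_ps                     # 150 ps

# Side question: if the sources were uncorrelated random jitter,
# would a root-sum-square combination apply instead? (my assumption)
rss_ps = math.hypot(idck_jitter_ps, tfp410_jitter_ps)  # 170 ps

print(additive_ps, bounded_ps, rss_ps)
```

Which of these, if any, is the right way to read Note 4?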
3. The maximum DVI output clock jitter is specified as 190 ps over the -55°C to 125°C temperature range at 165 MHz. Do you have a jitter de-rating curve showing the maximum jitter at 108 MHz over -40°C to 85°C?
Thank you for your help.