Hi team,
I would like to ask about the worst-case differential input clock jitter tolerance, t_ijit (50 ps min).
1.
Does it mean that the display fails when the measured jitter exceeds 50 ps, for the condition in note (5), "Measured differentially at 50% crossing using ODCK output clock as trigger"?
Or does it mean that 50 ps of jitter is added when an ideal clock passes through the device? In the latter case, is the input jitter required to meet TMDS compliance?
In the former case, the customer thinks it is hard to meet the TMDS jitter requirement of 0.25 Tbit.
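For reference, here is a back-of-envelope sketch of what the 0.25 Tbit budget works out to in picoseconds. It assumes the usual TMDS relationship of 10 bits per pixel clock period (the 50 MHz example clock is taken from question 2 below); please confirm against the datasheet.

```python
# Sketch: TMDS clock jitter budget of 0.25 * Tbit, in picoseconds.
# Assumption: TMDS bit rate = 10 x pixel clock (8b/10b-style encoding).

def tmds_jitter_budget_ps(pixel_clock_mhz: float) -> float:
    """Return 0.25 * Tbit in picoseconds for a given pixel clock."""
    bit_rate_hz = pixel_clock_mhz * 1e6 * 10   # 10 bits per pixel period
    t_bit_ps = 1e12 / bit_rate_hz              # one bit period in ps
    return 0.25 * t_bit_ps

# Example: 50 MHz pixel clock -> Tbit = 2000 ps, budget = 500 ps
print(tmds_jitter_budget_ps(50.0))  # 500.0
```

If this arithmetic is right, 0.25 Tbit at 50 MHz is an order of magnitude larger than the 50 ps t_ijit figure, which is part of why the customer would like the two numbers clarified.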
2.
I am not certain, but the customer has heard that t_ijit becomes 133 ps with a 50 MHz clock. What other factors determine the t_ijit value?
3.
The HSYNC period is sometimes 1341 or 1347 pixels instead of the expected 1344. When this happens, the display shows nothing.
The customer suspects HSYNC jitter immunity is related to this issue. Do you think a large jitter could cause such a shift of HSYNC, or could other factors be involved?
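To put the HSYNC observation in perspective, the following sketch converts the pixel-count deviation into time. It assumes a 50 MHz pixel clock (borrowed from question 2 as an example) and the nominal 1344-pixel line length reported above.

```python
# Sketch: time equivalent of the observed HSYNC period deviation.
# Assumptions: 50 MHz pixel clock (example only), 1344-pixel nominal line.

def hsync_shift_ns(measured_pix: int, nominal_pix: int = 1344,
                   pixel_clock_mhz: float = 50.0) -> float:
    """Return the deviation from the nominal line length in nanoseconds."""
    t_pix_ns = 1e3 / pixel_clock_mhz   # one pixel period in ns (20 ns at 50 MHz)
    return (measured_pix - nominal_pix) * t_pix_ns

for pix in (1341, 1347):
    print(pix, hsync_shift_ns(pix))  # -60.0 ns and +60.0 ns
```

If these assumptions hold, a 3-pixel shift is roughly 60 ns, about three orders of magnitude larger than the 50 ps t_ijit figure, which may suggest a counting or synchronization effect rather than analog jitter alone; we would appreciate your view on this.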
Regards,
Hideki