Hello there,
I have read in the literature that jitter at the output of a high-speed communication driver is often modeled as a dual-Dirac distribution convolved with a Gaussian distribution, which is meant to represent the total jitter distribution at the transmitter output.
Does an actual signal really have such a jitter distribution? Is there a better distribution model?
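To make sure we are talking about the same model, here is a minimal sketch of it as I understand it (the DJ and RJ values are arbitrary, just for illustration). Convolving the two Dirac impulses with a Gaussian simply gives the average of two Gaussians centered at ±DJ/2:

```python
import numpy as np
from scipy.stats import norm

# Dual-Dirac + Gaussian jitter model: two Dirac impulses at +/- DJ/2,
# each convolved with a Gaussian of rms width RJ. The resulting PDF
# is the average of two Gaussians centered at -DJ/2 and +DJ/2.
DJ = 50e-12   # deterministic jitter (Dirac separation), 50 ps -- arbitrary value
RJ = 10e-12   # random jitter, 10 ps rms -- arbitrary value

t = np.linspace(-200e-12, 200e-12, 2001)  # time offset from the ideal edge
pdf = 0.5 * (norm.pdf(t, loc=-DJ / 2, scale=RJ)
             + norm.pdf(t, loc=+DJ / 2, scale=RJ))

print(f"PDF integrates to {np.trapz(pdf, t):.4f}")  # sanity check, ~1.0
```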
In addition, to specify timing requirements for an LVDS system interconnect, I wonder whether the statement "jitter is equally distributed on both sides of the ideal signal edge" is a good assumption. Here is an example with numbers: a transmitter is specified to produce 200 ps peak-to-peak jitter at its LVDS output (say, at a BER of 1e-12). If we use the dual-Dirac + Gaussian model, which is symmetric about the ideal edge, we would probably say that the transmitter produces 100 ps of jitter to the left of the edge and 100 ps to the right.
However, since I suspect that the jitter distribution of a real signal is not perfectly balanced, I am wondering how unbalanced it typically is, and therefore how much error we make by assuming a balanced distribution (if this has ever been quantified, in percent (%) or in ps).
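To illustrate the question, here is a small sketch of the arithmetic, assuming the standard dual-Dirac relation TJ(BER) = DJ + 2·Q·σ. The asymmetric Dirac positions d1 and d2 are made-up, purely hypothetical numbers:

```python
from scipy.stats import norm

BER = 1e-12
TJ_pp = 200e-12       # specified peak-to-peak total jitter at this BER
Q = norm.isf(BER)     # Q-factor, ~7.03 for BER = 1e-12

# Symmetric assumption: half the eye closure on each side of the ideal edge.
left = right = TJ_pp / 2   # 100 ps each, as in my example above

# If the model were asymmetric, e.g. the two Diracs at -d1 and +d2 with
# d1 != d2 (hypothetical values), the same 200 ps total would split as:
d1, d2 = 40e-12, 60e-12                  # arbitrary asymmetric DJ components
sigma = (TJ_pp - (d1 + d2)) / (2 * Q)    # back out the Gaussian sigma
left_asym = d1 + Q * sigma               # closure left of the ideal edge
right_asym = d2 + Q * sigma              # closure right of the ideal edge

print(f"symmetric split:  {left * 1e12:.0f} ps / {right * 1e12:.0f} ps")
print(f"asymmetric split: {left_asym * 1e12:.1f} ps / {right_asym * 1e12:.1f} ps")
```

In this made-up asymmetric case the split is roughly 90 ps / 110 ps instead of 100 ps / 100 ps; my question is essentially how large this kind of imbalance is on real silicon.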
Thank you in advance for responding.
Best regards,
Julien A.