
LMX2582: PLLatinum Sim "Optimize Jitter" strategy does not minimize jitter

Part Number: LMX2582
Other Parts Discussed in Thread: ADC12DJ3200EVM,

I am currently using the ADC12DJ3200EVM which uses the LMX2582 to generate the ADC clock. I want to use PLLatinum Sim to estimate the ADC clock jitter.

I would like to sample signals from DC to 5 GHz with an ENOB of 9 bits, so the RMS jitter target is tj < 1/(2*pi*Fin(max)*2^B) = 62.2 fs.
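As a quick sanity check on that budget, here is the aperture-jitter bound worked out numerically (a sketch using the 5 GHz maximum input and 9-bit ENOB target from above):

```python
import math

# Aperture-jitter bound: tj < 1 / (2*pi * Fin_max * 2^B)
fin_max = 5e9   # Hz, maximum input frequency (DC to 5 GHz)
enob = 9        # target effective number of bits

tj_max_s = 1.0 / (2 * math.pi * fin_max * 2 ** enob)
print(f"RMS jitter target: {tj_max_s * 1e15:.1f} fs")  # -> 62.2 fs
```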

PLLatinum Sim shows that the LMX2582 device can meet this jitter requirement, however, while using the tool I noticed some weird issues that I hope TI can comment on.

The following sequence of steps demonstrates the issue:

1. Start PLLatinum Sim and select the LMX2582 device

2. Set Feature Level to Advanced

3. Select the "Phase Noise" tab and turn on OSC noise by selecting "Use Metrics"

4. Change the VCO frequency to 5000MHz, so that the output clock changes to 2500MHz. This is the nominal sample rate I plan to use the ADC at.

5. Change the PFD frequency to 200MHz to turn on the input doubler, and change the MASH order to 1 to clear the feedback divider warning (red background).

6. Change Kpd to 5mA.

7. Select the "Filter Designer" tab, check the "Loop Bandwidth" Auto checkbox, change the "Auto Parameter Strategy" to "Optimize Jitter", and click "Calculate Loop Filter"

The calculated jitter is 6285 fs. This appears to be a GUI bug where the jitter estimate is wrong: under the "Phase Noise" tab, disabling and re-enabling the OSC recalculates the jitter to 109.1 fs. Disabling OSC drops the jitter to 108.9 fs, so the OSC does not contribute much jitter in this configuration. Repeating steps 1 through 7 but skipping step 2 gives a calculated jitter of 108.8 fs, and enabling OSC then increases it to 109 fs. This looks like a PLLatinum Sim bug.

My assumption was that "Optimize Jitter" would target minimum jitter; however, that is not the case. Starting from Kpd = 5 mA with Tj = 190 fs, increasing Kpd without recalculating the loop filter reduces the jitter: Kpd = 9.688 mA gives Tj = 74.95 fs, Kpd = 19.375 mA gives Tj = 58.14 fs, and Kpd = 24.219 mA gives Tj = 55.9 fs. The phase margin at these new Kpd settings is fine. If the loop filter is recalculated, the jitter increases slightly.
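For context on where these Tj numbers come from: the tool integrates the single-sideband phase noise over the integration bandwidth and converts the result to seconds. A minimal sketch of that conversion (the piecewise noise profile below is made up for illustration, not LMX2582 data):

```python
import math

def rms_jitter_s(offsets_hz, lf_dbc_hz, f_carrier_hz):
    """Trapezoidal integration of L(f) in dBc/Hz over offset frequency,
    converted to RMS jitter: tj = sqrt(2 * integral) / (2*pi*f_carrier)."""
    total = 0.0
    for i in range(len(offsets_hz) - 1):
        s1 = 10 ** (lf_dbc_hz[i] / 10)       # dBc/Hz -> linear density
        s2 = 10 ** (lf_dbc_hz[i + 1] / 10)
        total += 0.5 * (s1 + s2) * (offsets_hz[i + 1] - offsets_hz[i])
    return math.sqrt(2.0 * total) / (2 * math.pi * f_carrier_hz)

# Illustrative profile: flat in-band noise, then VCO roll-off (made-up numbers).
offsets = [1e3, 1e4, 1e5, 1e6, 1e7, 1e8]          # Hz offset from carrier
noise = [-110, -110, -112, -130, -150, -155]      # dBc/Hz
tj = rms_jitter_s(offsets, noise, 2.5e9)          # 2.5 GHz output clock
print(f"{tj * 1e15:.0f} fs")
```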

Why does the "Optimize Jitter" strategy not sweep the loop bandwidth to find the minimum jitter?



  • Hi All,

    Another thing I have done while investigating is to check whether the minimum value of C3 = 3.3 nF is somehow influencing the result.

    I manually entered the Loop bandwidth and Phase margin, and I changed Capacitor Step Value and Resistor Step Value to Ideal. Now when I click "Calculate Loop Filter", if the bandwidth is not exactly as specified, I can tell that the C3 = 3.3 nF limit is the reason. However, I can increase Kcp, click calculate, and then iteratively adjust the loop bandwidth and phase margin to reduce the jitter. For example: Kcp = 24.219 mA, loop bandwidth = 110 kHz, phase margin = 50 degrees, click "Calculate Loop Filter", and Tj = 61.31 fs.

    There are way too many combinations of settings to figure out the minimum jitter configuration manually.
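    What I really want is a brute-force sweep. As a sketch of the idea, here is the kind of grid search I would like the tool to run for me; the jitter function below is a made-up placeholder standing in for the simulator, not a real model:

```python
import itertools

def estimate_jitter_fs(loop_bw_khz, phase_margin_deg):
    # Placeholder jitter model (a bowl with its minimum near
    # 110 kHz / 50 degrees), standing in for the simulator's estimate.
    return (56.0
            + 0.002 * (loop_bw_khz - 110) ** 2
            + 0.05 * (phase_margin_deg - 50) ** 2)

# Sweep loop bandwidth 50-200 kHz and phase margin 35-65 degrees.
grid = itertools.product(range(50, 201, 10), range(35, 66, 5))
best_bw, best_pm = min(grid, key=lambda p: estimate_jitter_fs(*p))
print(best_bw, best_pm, estimate_jitter_fs(best_bw, best_pm))  # -> 110 50 56.0
```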

    I was able to get the jitter below 60 fs with the loop filter components used on the ADC EVM, but of course the real components on the board will have a different phase noise response, so I will need to measure the clock output on J20 to really find the best combination of settings.



  • Hi Dave,

    I reproduced the issue you're describing, and there is something going on with the reference input noise being incorrectly scaled during this calculation. I'm going to pass this on to the maintainer.

    "Optimize Jitter" as you've configured it (everything autoset, no filter optimizer running) finds the VCO/PLL crossover point, assuming no minimum VCO capacitance requirement, and uses it as the target bandwidth. Then, if a minimum capacitance is required, it is inserted regardless of how it affects the loop bandwidth (as long as the filter is stable). Finally, everything is rounded to achievable capacitor and resistor values. You'll note that if the minimum high-order cap is set to 0, you'll get a much lower C3 value and the bandwidth will be closer to optimal. That aside, depending on the integration bandwidth or the filter order, there may be better strategies for selecting a more optimized jitter.
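    To make the crossover idea concrete: with a flat in-band PLL noise floor and a VCO rolling off at -20 dB/decade, the crossover offset (and hence the target bandwidth) falls straight out. The numbers here are made up for illustration, not LMX2582 data:

```python
pll_floor_dbc = -110.0   # flat in-band PLL noise, dBc/Hz (assumed)
vco_1mhz_dbc = -130.0    # VCO noise at 1 MHz offset, dBc/Hz (assumed)
vco_slope = -20.0        # VCO noise slope, dB per decade

# Solve L_vco(f) = vco_1mhz_dbc + vco_slope*log10(f / 1 MHz) == pll_floor_dbc
decades = (pll_floor_dbc - vco_1mhz_dbc) / vco_slope
f_cross_hz = 1e6 * 10 ** decades
print(f"target loop bandwidth ~ {f_cross_hz / 1e3:.0f} kHz")  # -> 100 kHz
```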

    If you don't believe that selecting the PLL/VCO crossover point will yield the optimal jitter, you can run the filter optimizer instead:

    1. On the "Phase Noise" tab, set the integration bandwidth to the correct range for your application.

    2. On the "Filter Designer" tab, set the optimizer in the "Performance Summary" group to jitter - this transforms the group into the filter optimizer setup group.

    3. Make sure you also set one of the parameters to jitter, and set a maximum value for it. The optimizer will fail if no parameters are set or if the limit value for a parameter is zero (it looks for the lowest value regardless of the limit; the limit just needs to be set).

    4. Set the filter parameters time and the forced component values time (leave the latter at the leftmost position for the LMX2582 with no forced parameters), and set the max calculation time on the left, below the feature level settings.

    5. If loop bandwidth, phase margin, gamma, or the pole ratios (for higher order filters) are autoset, you can also set minimum and maximum sweep ranges for them.

    Now if you click "Calculate Loop Filter", the optimizer will test many different values and try to achieve the best outcome. Note that because the application runs the optimizer in the same thread as the UI, the application may hang temporarily; this is expected, so make sure to set the max calculation time to whatever you're willing to stomach in case the optimizer runs into the weeds. To get a basic feel for what it does, I recommend starting the filter parameter and forced component time sliders all the way to the left, programming some sensible limits for the loop bandwidth and phase margin, and setting the max calculation time no higher than 60 s (30 s works for me on an Intel i5-8000 series laptop).


    Derek Payne

  • Hi Derek,

    Thanks for confirming the OSC noise bug, and for the additional insights into PLLatinum Sim. I had already been playing with setting the C3 minimum to zero; I'll play with the additional settings you recommend. Are your comments documented somewhere? I'd be happy to read through a PLLatinum Sim user's guide, but I have not seen one. Banerjee's book is of course a very useful reference, though there are some equation errors:

    1. Eq. 38.8 on p. 340 is correct, but Eq. 38.18 on p. 342 is wrong (the squares on the time constants are missing)

    2. Eq. 38.9 on p. 340 is correct, but Eq. 38.19 on p. 343 is wrong (T1 and T2 are switched)



  • Hi Dave,

    There's no official user's guide for PLLatinum Sim; the best documentation is in the little "?" boxes scattered around the GUI, and I don't think they do a good job of showing users how to set up the GUI for more complex cases such as these. As far as I know, I've only ever described the process of using the value-substituting auto-optimizer on E2E. That said, you're far from the first person to ask about a user's guide for PLLatinum Sim. I've personally been asking Dean to write one, with the more complex procedures explained, for a few years... In any case, given how many people have asked, I may wind up being the change I want to see in the world sometime in the next four months.

    Good catch, I'll let Dean know about the math errors too.


    Derek Payne

  • Hi Derek,

    Thanks for the hints on how to use the GUI. I've got the hang of using the advanced optimizer to determine the minimum jitter.

    Returning to your comment above:

    "Optimize Jitter" as you've configured it, with everything autoset and no filter optimizer running, will find the VCO/PLL crossover point, assuming no minimum VCO capacitance requirement, to use as the target bandwidth.

    I think this makes the tool more confusing than necessary. Could you please ask the developer to apply the minimum VCO capacitance requirement when running the autoset routine? It is only once you change the GUI into "Advanced" feature level mode that you even see that there is a minimum VCO capacitance parameter. Assuming the "Advanced" page has a reasonable default value for the minimum VCO capacitance, it seems reasonable for the autoset routine to use it. If that were the default, I would not have been quite so confused.


  • Hi David,

    Sure, we will pass your feedback to the developer. Thank you for your suggestion.

  • Thanks Noel! Regards, Dave.