
High VOL and Phase Noise measurement issue on CDC3RL02

Other Parts Discussed in Thread: CDC3RL02

We have prototyped a circuit using the CDC3RL02 to convert a ±1 V pk, 10 MHz sine wave to a 1.8 V LVCMOS-compatible output.

We use the CDC3RL02 LDO output to level-shift the 10 MHz sine wave by ~1 V before driving it into the CDC3RL02 MCLK_In pin. We have placed a 1 µF bypass cap at each of the following pins: the LDO output and Vbatt. We are only using one of the two outputs.
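For reference, here is a quick sanity check of the levels we expect at MCLK_In (a rough Python sketch; the 1 V offset is our nominal level-shift target derived from the LDO output, not a measured value):

    v_pk     = 1.0   # sine-wave amplitude, V
    v_offset = 1.0   # nominal level shift derived from the LDO output, V
    v_min = v_offset - v_pk   # expected minimum voltage at MCLK_In
    v_max = v_offset + v_pk   # expected maximum voltage at MCLK_In
    print(f"MCLK_In swings from {v_min:.1f} V to {v_max:.1f} V")  # 0.0 V to 2.0 V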

Here are our observations and concerns:

1. The square-wave output, when observed with a 1 MΩ || 10 pF probe tip, shows a VOL of ~600 mV above GND.

2. When we try to measure the phase noise by feeding the converted square wave into the 50 Ω input of an Agilent E5052B, the clock output gets heavily attenuated, down to ±250 mV pk (a quick divider estimate follows below).
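If we treat the output as a simple resistive divider into the analyzer's termination (our assumption, not a datasheet figure), and assume the unloaded swing is the full 1.8 V LVCMOS level, the attenuation implies a fairly high effective source impedance:

    v_unloaded_pp = 1.8   # assumed full LVCMOS swing seen by the high-impedance probe, V
    v_loaded_pp   = 0.5   # observed swing into 50 ohms (+/- 250 mV pk), V
    r_load        = 50.0  # E5052B input termination, ohms

    # v_loaded = v_unloaded * r_load / (r_load + r_source), solved for r_source
    r_source = r_load * (v_unloaded_pp / v_loaded_pp - 1)
    print(f"Implied source impedance: ~{r_source:.0f} ohms")  # ~130 ohms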

The questions are:

1. Why does the square-wave clock output's VOL shift up by ~600 mV when measured with a high-impedance probe?

2. How were the phase-noise numbers shown in the datasheet measured? Can this device drive a 50 Ω load?

Thanks,

JP