I’ve been measuring the phase noise of the signal generated by the following setup, using an Agilent E4407B spectrum analyser in its Phase Noise measurement mode:
- R&S SMU200A vector signal generator, used as the 400 MHz sampling clock for ...
- ADI AD9956 DDS, which provides a 100 MHz reference signal to ...
- TI LMX2492EVM (TI LMX2492 PLL + Macom MAOC-009265 VCO), generating a 9.5625 GHz signal which is input to ...
- ADI HMC814LC3B x2 multiplier on a custom board, giving the 19.125 GHz output signal
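For reference, the only frequency-translation assumption I make below is the ideal 20·log10(f_out/f_in) phase-noise scaling (i.e. noise-free division/multiplication); a quick sketch of those factors for this chain:

```python
import math

def noise_gain_db(f_out, f_in):
    """Ideal phase-noise change (dB) when a carrier is translated from f_in to f_out."""
    return 20 * math.log10(f_out / f_in)

# 400 MHz sampling clock -> 100 MHz DDS output (ignoring the DDS residual noise)
print(f"400 MHz -> 100 MHz     : {noise_gain_db(100e6, 400e6):+.2f} dB")
# 100 MHz reference -> 9.5625 GHz PLL output (inside the loop bandwidth)
print(f"100 MHz -> 9.5625 GHz  : {noise_gain_db(9.5625e9, 100e6):+.2f} dB")
# 9.5625 GHz -> 19.125 GHz after the HMC814LC3B x2 multiplier
print(f"9.5625 GHz -> 19.125 GHz: {noise_gain_db(19.125e9, 9.5625e9):+.2f} dB")
```

This gives approximately -12.04 dB, +39.61 dB and +6.02 dB respectively, which are the scaling factors used in the calculations further down.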
I have also simulated the phase noise level in TI PLLatinum; the table below compares the measured and simulated results:
| Frequency Offset from Carrier (Hz) | Measured Phase Noise (dBc/Hz) | Simulated Phase Noise (dBc/Hz) | Disparity, Simulated - Measured (dB) |
| --- | --- | --- | --- |
| 1e3 | -65.8 | -75.2 | -9.4 |
| 1e4 | -82.0 | -84.5 | -2.5 |
| 1e5 | -86.2 | -91.6 | -5.4 |
| 1e6 | -92.1 | -94.3 | -2.2 |
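The disparity column is simply simulated minus measured at each offset, e.g.:

```python
measured  = {1e3: -65.8, 1e4: -82.0, 1e5: -86.2, 1e6: -92.1}   # dBc/Hz
simulated = {1e3: -75.2, 1e4: -84.5, 1e5: -91.6, 1e6: -94.3}   # dBc/Hz

for offset in measured:
    print(f"{offset:.0e} Hz: {simulated[offset] - measured[offset]:+.1f} dB")
```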
With the exception of the 1 kHz offset I got fairly good agreement; do you know why I’m seeing such a difference at 1 kHz?
For the simulation I took the R&S SMU200A phase noise level from its datasheet (https://cdn.rohde-schwarz.com/pws/dl_downloads/dl_common_library/dl_brochures_and_datasheets/pdf_1/SMU_dat-sw-en.pdf, page 13) and estimated the DDS output phase noise as shown below:
| Offset from Carrier (Hz) | DDS Residual Phase Noise at 100 MHz, estimated from datasheet (dBc/Hz) | Reference Phase Noise at 400 MHz (dBc/Hz) | Reference Phase Noise Scaled to 100 MHz (dBc/Hz) | Total Phase Noise at 100 MHz (dBc/Hz) |
| --- | --- | --- | --- | --- |
| 1e3 | -123 | -114 | -126.04 | -121.25 |
| 1e4 | -131 | -133 | -145.04 | -130.83 |
| 1e5 | -138 | -136 | -148.04 | -137.59 |
| 1e6 | -142 | -150 | -162.04 | -141.96 |
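The last two columns come from scaling the 400 MHz reference noise down to 100 MHz (divide by 4, i.e. 20·log10(1/4) ≈ -12.04 dB) and then power-summing that with the DDS residual noise, assuming the two contributions are uncorrelated. A short sketch reproducing the table values:

```python
import math

def db_to_lin(db):
    return 10 ** (db / 10)

def lin_to_db(lin):
    return 10 * math.log10(lin)

# Ideal scaling of the reference phase noise from 400 MHz down to 100 MHz
scale_db = 20 * math.log10(100e6 / 400e6)   # ~ -12.04 dB

# (offset Hz, DDS residual dBc/Hz, reference at 400 MHz dBc/Hz), values from the table above
rows = [(1e3, -123, -114), (1e4, -131, -133), (1e5, -138, -136), (1e6, -142, -150)]

for offset, dds, ref_400 in rows:
    ref_100 = ref_400 + scale_db                              # reference noise referred to 100 MHz
    total = lin_to_db(db_to_lin(dds) + db_to_lin(ref_100))    # power sum of uncorrelated sources
    print(f"{offset:.0e} Hz: ref@100MHz = {ref_100:.2f} dBc/Hz, total = {total:.2f} dBc/Hz")
```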
I took the phase noise of the Macom MAOC-009265 VCO from its datasheet; note that I didn’t change the loop filter from its default values. Also, I set up the simulation for a 9.5625 GHz output and then added 6 dB (20·log10(2)) to give the simulated phase noise levels at 19.125 GHz. I could send through the TI PLLatinum simulation file.
Note that when I measured the phase noise of the SMU200A output at 400 MHz or of the DDS output at 100 MHz directly, I didn’t get very sensible results, so I used the datasheet/calculated values above as the phase noise inputs to TI PLLatinum. I suspect the phase noise of the Agilent E4407B spectrum analyser itself may have been limiting those measurements:
- Measured phase noise of the SMU200A vector signal generator output at 400 MHz: 1 kHz = -81.01 dBc/Hz / 10 kHz = -100.55 dBc/Hz / 100 kHz = -123.80 dBc/Hz / 1 MHz = -126.80 dBc/Hz
- Measured phase noise of the 100 MHz output from the DDS: 1 kHz = -82.16 dBc/Hz / 10 kHz = -101.03 dBc/Hz / 100 kHz = -122.76 dBc/Hz / 1 MHz = -127.30 dBc/Hz
When I measured the phase noise of the TI LMX2492EVM 4.8 GHz output with the on-board XO used as the reference, I got: 1 kHz = -81.5 dBc/Hz / 10 kHz = -97.7 dBc/Hz / 100 kHz = -107.9 dBc/Hz / 1 MHz = -118.5 dBc/Hz.
Relative to the plot on page 7 of the TI LMX2492EVM user guide (copied below), the phase noise level is similar at the 10 kHz, 100 kHz and 1 MHz offsets but is almost 10 dB higher at 1 kHz, i.e. a similar difference to that observed with the DDS-PLL measurement above. Could it be that the Agilent E4407B simply isn’t measuring phase noise accurately at a 1 kHz offset?
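To illustrate why I suspect the analyser: if the E4407B’s own contribution (its SSB phase noise or noise floor at the measurement settings) is not well below the DUT’s phase noise at a given offset, the displayed value is roughly the power sum of the two, so it reads high. A rough sketch of that effect; the DUT value and analyser contributions below are placeholders for illustration only, not numbers taken from any datasheet:

```python
import math

def displayed_db(dut_db, analyser_db):
    """Displayed phase noise when the analyser's own (uncorrelated) contribution adds to the DUT's."""
    return 10 * math.log10(10 ** (dut_db / 10) + 10 ** (analyser_db / 10))

dut_true = -91.5                                   # hypothetical true DUT phase noise at 1 kHz (dBc/Hz)
for analyser in (-110.0, -100.0, -91.5, -85.0):    # hypothetical analyser contributions (dBc/Hz)
    shown = displayed_db(dut_true, analyser)
    print(f"analyser at {analyser:6.1f} dBc/Hz -> displayed {shown:6.1f} dBc/Hz "
          f"(error {shown - dut_true:+.1f} dB)")
```

Notably, all three of my 1 kHz measurements above (SMU200A, DDS, LMX2492EVM with XO) come out at roughly -81 to -82 dBc/Hz, which is what I would expect to see if the analyser were the limiting factor at that offset.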