Hi,
For our satellite modem receiver design, the requirements are:
SNR: 70 dB
IF (Fin): 70 MHz
We are using the 16-bit ADC16V130 in our design.
To meet the SNR of 70 dB, the clock jitter budget is calculated using this formula:
SNR = -20*log10(2*pi*Fin*t_jitter(RMS))
From the above formula, the total RMS jitter = 718 fs.
For the ADC16V130, the aperture jitter is 80 fs (datasheet).
Hence, external clock jitter = sqrt(total jitter^2 - aperture jitter^2) = 713 fs.
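To make the arithmetic easy to cross-check, here is a minimal Python sketch of the two steps above, using only the numbers already quoted in this post (log10 is the base-10 logarithm):

```python
import math

snr_db = 70.0        # required SNR (dB)
f_in = 70e6          # IF / analog input frequency (Hz)
t_aperture = 80e-15  # ADC16V130 aperture jitter from the datasheet (s)

# Jitter-limited SNR: SNR = -20*log10(2*pi*f_in*t_total),
# solved for the total jitter budget:
t_total = 1.0 / (2 * math.pi * f_in * 10 ** (snr_db / 20))

# Aperture jitter subtracts in root-sum-square from the total,
# leaving the allowance for the external clock:
t_clock = math.sqrt(t_total**2 - t_aperture**2)

print(f"total jitter budget  : {t_total * 1e15:.0f} fs")  # close to the 718 fs above
print(f"external clock budget: {t_clock * 1e15:.0f} fs")  # close to the 713 fs above
```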
Clock scenario in our design:
10 MHz TCXO reference --> LMK00308 (clock buffer, 10 MHz CMOS output) --> LMX2581 (fractional PLL, 392 MHz output) --> LMK1801 (divider/buffer, 98 MHz output) --> 98 MHz clock to ADC
As per the datasheets, the additive RMS jitter values are:
LMK1801: 50 fs (integration BW: 12 kHz to 20 MHz)
LMX2581: 100 fs (integration BW: 100 Hz to 10 MHz)
LMK00308: 51 fs (integration BW: 12 kHz to 20 MHz)
For the TCXO, based on the phase noise values specified:

Offset     Phase noise at 10.000 MHz (dBc/Hz)
10 Hz      -94
100 Hz     -118
1 kHz      -135
10 kHz     -147
100 kHz    -152
Based on these values, the RMS jitter comes out to 1.5 ps (integration BW: 10 Hz to 20 MHz).
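For reference, this is the kind of calculation behind that number: piecewise integration of L(f), treating each segment as a straight line on a log-log plot, then converting integrated phase to time jitter. Note that the datasheet stops at 100 kHz, so extending the last point as a flat floor up to 20 MHz is my assumption, and the result is quite sensitive to it (which may also explain differences between tools):

```python
import math

f_c = 10e6  # TCXO carrier frequency (Hz)

# (offset Hz, phase noise dBc/Hz) points from the TCXO datasheet
profile = [(10, -94), (100, -118), (1e3, -135), (1e4, -147), (1e5, -152)]

def rms_jitter(profile, f_lo, f_hi, f_c):
    """Integrate L(f) assuming straight-line segments on a log-log plot,
    with a flat noise floor beyond the last specified offset (assumption)."""
    pts = profile + [(f_hi, profile[-1][1])]  # flat-floor extrapolation
    area = 0.0  # integral of 10^(L(f)/10) df
    for (f1, l1), (f2, l2) in zip(pts, pts[1:]):
        lo, hi = max(f1, f_lo), min(f2, f_hi)
        if lo >= hi:
            continue
        slope = (l2 - l1) / math.log10(f2 / f1)   # dB per decade
        a = slope / 10.0                          # power-law exponent
        s1 = 10 ** (l1 / 10)                      # linear PSD at f1
        if abs(a + 1) < 1e-9:                     # exact 1/f segment
            area += s1 * f1 * math.log(hi / lo)
        else:
            area += s1 * f1 / (a + 1) * ((hi / f1) ** (a + 1) - (lo / f1) ** (a + 1))
    phi_rms = math.sqrt(2 * area)                 # rad (factor 2: SSB -> total)
    return phi_rms / (2 * math.pi * f_c)          # convert phase to time

print(f"{rms_jitter(profile, 10, 20e6, f_c) * 1e12:.2f} ps")
```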
Clock jitter at the ADC: TCXO + LMK00308 + LMX2581 + LMK1801 = 1.6 ps, but the requirement is only 713 fs.
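For uncorrelated contributors the usual convention is to combine in root-sum-square rather than by direct addition (a plain sum is the conservative, fully-correlated bound). A quick sketch with the values above, showing both:

```python
import math

# additive RMS jitter of each stage in the clock chain (s)
contributors = {
    "TCXO": 1.5e-12,
    "LMK00308": 51e-15,
    "LMX2581": 100e-15,
    "LMK1801": 50e-15,
}

# uncorrelated sources combine in root-sum-square
rss = math.sqrt(sum(t**2 for t in contributors.values()))
# a plain sum is the worst-case (fully correlated) bound
total = sum(contributors.values())

print(f"RSS total : {rss * 1e15:.0f} fs")    # ~1.5 ps, dominated by the TCXO
print(f"Plain sum : {total * 1e15:.0f} fs")  # ~1.7 ps
print(f"Budget    : 713 fs")
```

Either way, the total is dominated by the TCXO and far exceeds the 713 fs budget.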
Is this the correct procedure for calculating the clock jitter?
For the 98 MHz ADC clock, what integration bandwidth should be taken when calculating the RMS jitter from the phase noise plot seen on the R&S analyzer?
Is it 12 kHz to 20 MHz, or 10 Hz to 20 MHz?