Hi Team,
Here is my original question about the LM97593.
I am using the LM97593. I want to know whether there are any pitfalls with the device’s internal SFDR that would drive a change in the NCO frequency and/or the CIC and FIR filter configuration if I change the design from:
Currently:
- Input RF is 21.400 MHz
- Information bandwidth is 16 kHz
- ADC clock rate is 22.050 MHz (requires decimation)
- Internal NCO frequency is 650kHz
- CIC and FIR filters are programmed to support decimation and our specific sensitivity requirements with the above hardware configuration. Settings for these filters were arrived at through consultation with the National factory support engineer at the time (ca. 2008-2009), plus some trial-and-error optimization on our part.
To
Design change – we are looking at doing the following:
- Input RF is 55.000 MHz
- Same information bandwidth
- ADC clock rate is 55.65 MHz, but could also be 54.35 MHz (VERIFY)
- Internal NCO frequency is 650kHz
- CIC and FIR filter programming – TBD (based on consultation)
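As a sanity check on the numbers above (my own arithmetic, not from the thread), both candidate ADC clocks place the aliased input 650 kHz from DC, matching the existing 650 kHz NCO setting. The helper name `aliased_if` is mine:

```python
def aliased_if(f_in_hz, fs_hz):
    """Fold an input frequency into the first Nyquist zone [0, Fs/2]."""
    f = f_in_hz % fs_hz
    return fs_hz - f if f > fs_hz / 2 else f

# Current design: 21.4 MHz input, 22.05 MSPS clock
print(aliased_if(21.4e6, 22.05e6))   # 650000.0 -> matches the 650 kHz NCO

# Proposed design, 55 MHz input, both candidate ADC clocks
print(aliased_if(55.0e6, 55.65e6))   # 650000.0 (high-side clocking)
print(aliased_if(55.0e6, 54.35e6))   # 650000.0 (low-side clocking)
```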
I also want to know if I would be better off using low-side clocking of the ADC instead of high-side clocking. (I am currently using high-side clocking of the ADC.)
Below is the response I received to my post:
Jumping up to a higher sampling rate while maintaining an IF 650 kHz away from the sampling rate causes the aliased IF to be closer to DC (as a proportion of Fs) than in the original situation. This can make the original decimation filter settings inadequate for the new configuration.
For instance, when Fs = 22.05 MSPS and Fin = 21.4 MHz, the IF frequency = 0.65/22.05 = 0.029*Fs and the BW = 0.016/22.05 = 0.00073*Fs. In this case, one would design the NCO frequency to be 0.029*Fs and the decimation filters to cut off at ~0.0010*Fs. After you change to Fs = 55.65 MHz, the IF frequency = 0.65/55.65 = 0.012*Fs and the BW is 0.016/55.65 = 0.00029*Fs, which indicates that you should change both the NCO and the filters because the IF has proportionally shifted.
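The proportions quoted above can be reproduced directly (same numbers as the reply, rounded the same way):

```python
# Frequencies from the thread: old/new ADC clocks, 650 kHz IF, 16 kHz bandwidth
fs_old, fs_new = 22.05e6, 55.65e6
f_if, bw = 0.65e6, 16e3

print(f"old: IF = {f_if/fs_old:.3f}*Fs, BW = {bw/fs_old:.5f}*Fs")
# old: IF = 0.029*Fs, BW = 0.00073*Fs
print(f"new: IF = {f_if/fs_new:.3f}*Fs, BW = {bw/fs_new:.5f}*Fs")
# new: IF = 0.012*Fs, BW = 0.00029*Fs
```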
As for the SFDR performance, the HD2, HD3, and IMD2 of the signal chain and ADC should all fall out of band and be digitally filtered out. IMD3 of the ADC (as well as the signal chain) will worsen slightly because you are running at a higher IF frequency (and higher sampling rate) with the same hardware. I would estimate that the IMD3 of the ADC would degrade ~3 dB relative to the carriers at the higher IF.
Whether one does low-side or high-side clocking is insignificant in this case because the IF is so close to the sampling rate. Choosing one or the other may make it more convenient to keep track of phase polarities in the digital domain, because sampling in the 3rd Nyquist zone causes a phase inversion compared to the 2nd zone. There may be some 2nd-order effects related to having a different sampling rate which may be significant, but you would need to try that in your system to really know if one is better than the other.
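To make the Nyquist-zone point concrete: with the numbers from this thread, high-side clocking puts the 55 MHz input in the 2nd Nyquist zone and low-side clocking puts it in the 3rd, which is the zone difference behind the phase-inversion comment. This is a minimal sketch with my own helper name (`nyquist_zone`), using the standard convention that zone n spans ((n-1)*Fs/2, n*Fs/2] and that even-numbered zones spectrally invert:

```python
import math

def nyquist_zone(f_in_hz, fs_hz):
    """Return the Nyquist zone (1-indexed) containing f_in for sample rate fs.
    Even zones produce a spectrally inverted alias; odd zones do not."""
    return math.ceil(f_in_hz / (fs_hz / 2))

# High-side clocking (Fs = 55.65 MHz, above the 55 MHz input): 2nd zone
print(nyquist_zone(55.0e6, 55.65e6))  # 2 (inverted)
# Low-side clocking (Fs = 54.35 MHz, below the input): 3rd zone
print(nyquist_zone(55.0e6, 54.35e6))  # 3 (not inverted)
```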
Regards,
Neeraj Gill