I have an ADC12D1600RF that I'm clocking at 1.28 GHz in DES mode (real sample rate = 2.56 GSPS). The clock is generated by an Agilent signal generator with a high-stability time base, and the 1 GHz input tone is generated by a second Agilent signal generator, also with a high-stability time base; the two generators are phase locked. I measure an effective number of bits (ENOB) of 5.5 for the 1 GHz tone. I was careful to calibrate the ADC prior to collecting data. The power spectrum of the collected data shows all of the spurs identified in SLAA617, the largest of which are the harmonic spurs.
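For concreteness, below is a minimal sketch of the kind of FFT-based SINAD/ENOB calculation I have in mind. It is a simplified stand-in for my actual processing, not the exact chain I use: it assumes coherent sampling, a rectangular window, and the standard ENOB = (SINAD - 1.76 dB)/6.02 dB conversion, and the record length and quantizer in the self-check are arbitrary illustration values.

```python
import numpy as np

def enob_from_record(samples, exclude_dc_bins=1, fund_guard_bins=1):
    """Estimate ENOB of a single-tone capture via FFT-based SINAD."""
    x = np.asarray(samples, dtype=float)
    x = x - x.mean()                            # remove DC offset
    spec = np.abs(np.fft.rfft(x)) ** 2          # one-sided power spectrum
    spec[:exclude_dc_bins + 1] = 0.0            # drop residual DC leakage
    k = int(np.argmax(spec))                    # fundamental bin
    lo, hi = max(k - fund_guard_bins, 0), k + fund_guard_bins + 1
    p_signal = spec[lo:hi].sum()
    p_nad = spec.sum() - p_signal               # noise + distortion + spurs
    sinad_db = 10.0 * np.log10(p_signal / p_nad)
    return (sinad_db - 1.76) / 6.02, sinad_db

# Self-check with synthetic data: ~1 GHz tone, 2.56 GSPS, ideal 12-bit quantizer.
if __name__ == "__main__":
    fs, f0, n = 2.56e9, 1.0e9, 16384
    cycles = round(f0 / fs * n) | 1             # odd cycle count, coprime with n, for coherent sampling
    t = np.arange(n) / fs
    tone = np.sin(2 * np.pi * (cycles * fs / n) * t)
    codes = np.round(tone * 2047) / 2047        # ideal 12-bit quantizer
    enob, sinad = enob_from_record(codes)
    print(f"SINAD = {sinad:.1f} dB, ENOB = {enob:.2f} bits")
```

Note that the (SINAD - 1.76)/6.02 conversion is referenced to a full-scale sine, so an input tone below full scale reads as a lower ENOB unless it is corrected for.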
I'm wondering why the ENOB I'm measuring is much less than the ENOB published in the ADC12D1600RF data sheet.
SLAA617 lists several on-chip tools for reducing interleaving-related spurs, and it suggests external dithering to minimize the harmonic spurs.
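To make sure I understand the mechanism, here is a toy numerical illustration of the dithering idea. This is my own sketch, not taken from SLAA617, and the bit depth, record length, and dither level are arbitrary: quantizing a bare tone concentrates the quantization error in harmonic spurs, while adding roughly 0.5 LSB rms of wideband dither before the quantizer spreads that energy into the noise floor.

```python
import numpy as np

def sfdr_dbc(samples, fund_bin):
    """Worst spur relative to the fundamental (SFDR), in dBc."""
    spec = np.abs(np.fft.rfft(samples)) ** 2
    p_fund = spec[fund_bin]
    spec[0] = 0.0               # ignore DC
    spec[fund_bin] = 0.0        # ignore the fundamental itself
    return 10.0 * np.log10(spec.max() / p_fund)

n, fund_bin, bits = 16384, 401, 8               # coherent record, coarse 8-bit quantizer
lsb = 2.0 / 2 ** bits                           # full scale taken as +/-1
tone = 0.9 * np.sin(2 * np.pi * fund_bin * np.arange(n) / n)

rng = np.random.default_rng(0)
dither = rng.normal(scale=0.5 * lsb, size=n)    # ~0.5 LSB rms wideband dither

q_plain = np.round(tone / lsb) * lsb            # quantize the bare tone
q_dith = np.round((tone + dither) / lsb) * lsb  # quantize tone + dither

# Without dither the worst spur is a harmonic of the tone; with dither that
# energy is spread into the noise floor and the worst spur drops.
print(f"SFDR, no dither:   {sfdr_dbc(q_plain, fund_bin):.1f} dBc")
print(f"SFDR, with dither: {sfdr_dbc(q_dith, fund_bin):.1f} dBc")
```

My understanding is that the external dithering described in SLAA617 is an analog, out-of-band signal summed with the ADC input rather than digital noise as in this toy example, but the effect on the harmonic spurs is the same idea.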
Can Texas Instruments provide more information about how they calculated ENOB in the ADC12D1600RF data sheet? Specifically, was dithering used when measuring ENOB?
I've attached a Word document containing a plot of the power spectrum with each spur and its source identified. Does anyone have suggestions for how I can achieve an ENOB closer to what is published in the ADC12D1600RF data sheet?
Regards,
Scott