Hello all,
I'm trying to understand the calculation of the DDC's overall gain, and I'm having trouble with the formula for CIC_SCALE (bottom of page 18 of the manual, http://www.ti.com/general/docs/lit/getliterature.tsp?genericPartNumber=gc4016&fileType=pdf&track=no):
CIC_SCALE = 2^(SHIFT + SCALE + 6*BIG_SCALE - 62)
According to the block diagram, CIC_SCALE is a block that shifts the input signal just before the CIC filter.
With the largest register values (SHIFT=7, SCALE=5, BIG_SCALE=7):
CIC_SCALE = 2^(7 + 5 + 6*7 - 62) = 2^(-8)
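To double-check my arithmetic, here is a small sketch of the manual's formula (the helper name `cic_scale` is my own, not from the datasheet):

```python
def cic_scale(shift: int, scale: int, big_scale: int) -> float:
    """CIC_SCALE = 2^(SHIFT + SCALE + 6*BIG_SCALE - 62), per the GC4016 manual."""
    exponent = shift + scale + 6 * big_scale - 62
    return 2.0 ** exponent

# Largest register settings from above: exponent = 7 + 5 + 42 - 62 = -8
print(cic_scale(shift=7, scale=5, big_scale=7))  # 2**-8 = 0.00390625
```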
Does this mean that of the 20-bit input signal, only 12 bits are passed to the CIC?
I can't believe we shift the signal down so strongly even in the best case.
Why do we have to compute the NCO and mixer with 20-bit precision if the result is immediately shifted down to 12 bits?
Please help me understand this.