
Scaling factor in FFT

Hello,

I would like to compute the spectrum power from 1024 16-bit IQ samples. In the time domain I get the correct answer, but from the spectrum I don't. I think it is because of a scaling factor, but I can't work out what value to use. Please help me out with this.
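For reference, the time-domain power I compare against is computed with something like the sketch below (the name time_power is only illustrative, and the 16-bit samples are assumed to have already been converted into the float buffer iq_sample[]):

/* Time-domain power: sum of I^2 + Q^2 over the N complex samples,
 * converted to dB. iq_sample[] holds interleaved I/Q as floats. */
float time_power = 0.0f;
int n;
for (n = 0; n < N; n++)
{
    time_power += iq_sample[2*n]*iq_sample[2*n] + iq_sample[2*n+1]*iq_sample[2*n+1];
}
time_power = 10 * log10(time_power);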

Here is how I compute the FFT:

#include <math.h>   /* for log10() */

#define N 1024

#pragma DATA_ALIGN(wfft, 8);
float wfft[N];            /* twiddle factors */

#pragma DATA_ALIGN(iq_sample, 8);
float iq_sample[2*N];     /* interleaved I/Q input; overwritten by the FFT */

short table[N];           /* bit-reversal index table */

float fft_mag[N];         /* |X[k]|^2 per bin */

...

...

...

int i;
float totalpower = 0.0f;

bitrev_index(table, N);                 /* build the bit-reversal index table */
gen_twiddle(wfft, N);                   /* generate twiddle factors */
bit_rev(wfft, N >> 1);                  /* bit-reverse the twiddles for the DIT kernel */

DSPF_sp_cfftr2_dit(iq_sample, wfft, N);             /* radix-2 DIT complex FFT, in place */
DSPF_sp_bitrev_cplx((double *)iq_sample, table, N); /* bit-reverse the FFT output */

for (i = 0; i < N; i++)
{
    fft_mag[i] = iq_sample[2*i]*iq_sample[2*i] + iq_sample[2*i+1]*iq_sample[2*i+1];

    totalpower += fft_mag[i];
}

totalpower = 10 * log10(totalpower);
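For context, the reason I expect the time-domain and spectrum numbers to agree is Parseval's relation. Assuming DSPF_sp_cfftr2_dit applies no scaling of its own, the comparison I have in mind is roughly the sketch below (spec_power is only an illustrative name):

/* Parseval's relation for an unscaled DFT:
 *   sum_n |x[n]|^2  ==  (1/N) * sum_k |X[k]|^2
 * so the raw spectrum sum needs a 1/N factor before it can be compared
 * with the time-domain sum; in dB that is a 10*log10(N) offset,
 * about 30.1 dB for N = 1024. */
float spec_power = 0.0f;
for (i = 0; i < N; i++)
{
    spec_power += fft_mag[i];
}
spec_power = 10 * log10(spec_power / N);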

Note: I found some code somewhere on the web that uses a scaling factor of 0.00065, which gives almost exactly what I'm looking for. But where does this value come from?

Thanks a lot.