This thread has been locked.

If you have a related question, please click the "Ask a related question" button in the top right corner. The newly created question will be automatically linked to this question.

Calculate min input power level of ADS800

Hi everyone,

I would like to know the minimum input power level for the ADS800 analog-to-digital converter at a 5 MHz input signal. Can anyone help me work out how it can be computed?

  • Hi,

    I don't know what you mean by a *minimum* input power level.  There is no minimum input level.  One of the tests we might do in the lab is what we call idle-channel: with no input signal at all, we look at the noise floor of an FFT of a captured buffer of sample data to see what the idle-channel noise is.

    Also, the ADS800 full scale input is defined in terms of volts - not power.  A certain amount of power into a given load leads to a certain voltage swing, so unless we know the impedance of your source into some termination load we cannot convert power into a voltage swing.  For example, a 10dBm signal into a 50 ohm load leads to a voltage swing of 2V peak to peak.  But that same 10dBm input power into some other impedance leads to some other voltage swing.  Knowing your source and termination impedance and the full scale definition of the ADC, it is easy to convert back and forth.

    Regards,

    Richard P.
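    The conversion described above can be sketched in a few lines of Python. This is a minimal illustration assuming a sine-wave signal; the function names are my own, not anything from TI.

```python
import math

def dbm_to_vpp(p_dbm: float, r_ohms: float) -> float:
    """Convert a power level in dBm into the peak-to-peak voltage
    it produces across the given load resistance (sine-wave assumption)."""
    p_watts = 10 ** (p_dbm / 10) / 1000        # dBm -> watts
    v_rms = math.sqrt(p_watts * r_ohms)        # P = Vrms^2 / R
    return v_rms * 2 * math.sqrt(2)            # Vpp = 2*sqrt(2)*Vrms

# The example from the reply: 10 dBm into a 50 ohm load -> ~2 Vpp
print(round(dbm_to_vpp(10, 50), 2))  # -> 2.0
```

    Running the same 10 dBm level through a different load resistance shows why the source and termination impedance must be known before power can be mapped to a voltage swing.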

  • I mean the minimum signal power detectable by the ADC. I will use it in an RF front-end system, so I want to know the minimum power level the ADC can detect.

  • Hi,

    I still don't have a different answer for you than the one I already gave.  There is no such thing as a minimum detectable signal power.  We have a defined full scale range specified in volts.  If the device is configured for its default full scale range of 4V peak to peak differential and the resolution is 12 bits, then each step from one output code to the next would ideally represent a change of 4V / 4096, or 0.98mV.  But since the effective number of bits is less than 12, and there is some idle-channel noise on the LSBs of the output even with no input signal, your ability to reliably detect changes in the input voltage will be effectively coarser than that ~1mV just calculated.  And effective number of bits is related to SNR, which is frequency dependent, so I still could not tell you what minimum change in input voltage could be reliably detected without knowing the frequency or bandwidth of the input signal.

    And that is still in the voltage domain, and converting that to power would require knowledge of the impedance of your input circuit.

    Regards,

    Richard P.
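    The arithmetic in the reply above can be written out as a short sketch. The ideal LSB size follows directly from the full-scale range and the resolution; the "effective" step size uses the standard ENOB relation ENOB = (SNR - 1.76) / 6.02. The SNR value used below is an assumed placeholder for illustration, not an ADS800 datasheet number.

```python
def lsb_size(full_scale_vpp: float, bits: int) -> float:
    """Ideal LSB size in volts for a given full-scale range and resolution."""
    return full_scale_vpp / (2 ** bits)

def effective_step(full_scale_vpp: float, snr_db: float) -> float:
    """Effective voltage step using ENOB = (SNR_dB - 1.76) / 6.02."""
    enob = (snr_db - 1.76) / 6.02
    return full_scale_vpp / (2 ** enob)

ideal = lsb_size(4.0, 12)          # 4V / 4096 -> ~0.98 mV, as in the reply
eff = effective_step(4.0, 65.0)    # 65 dB SNR is an assumed example value
print(f"ideal LSB: {ideal * 1e3:.2f} mV, effective step: {eff * 1e3:.2f} mV")
```

    The effective step comes out coarser than the ideal LSB, which is the point of the reply: noise and ENOB, both frequency dependent, set the practical resolution, not the nominal bit count.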

  • Hi,

    I have understood. The reason I asked this question is that I want to design an RF front-end circuit. The antenna will receive a 145MHz signal with a receive sensitivity of -100dBm. The signal will then pass from the antenna to the ADC. The input signal to the ADC will be at 4MHz, QPSK modulated, with 25kHz bandwidth. To calculate the gain of the RF stage, I asked this question (the ADC's minimum detectable power), so that I can choose an amplifier after this calculation.

    Do you have a test circuit for this situation that would help me design the RF front end, or can you recommend an approach for calculating this?

    Thank You,

    Regards,
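    The gain budget the poster is after can be sketched once the impedance assumptions from the earlier replies are fixed. Everything below is a hypothetical worked example, not a TI recommendation: it assumes a 50 ohm match at the ADC input, the default 4 Vpp full scale, a 6 dB backoff from full scale, and room-temperature thermal noise (-174 dBm/Hz) in the stated 25 kHz bandwidth.

```python
import math

KTB_DBM_PER_HZ = -174.0  # thermal noise density at room temperature

def full_scale_dbm(vpp: float, r_ohms: float) -> float:
    """ADC full scale expressed as equivalent sine-wave power into r_ohms."""
    v_rms = vpp / (2 * math.sqrt(2))
    return 10 * math.log10(v_rms ** 2 / r_ohms / 1e-3)

fs_dbm = full_scale_dbm(4.0, 50.0)           # ~16 dBm for 4 Vpp into 50 ohms
backoff_db = 6.0                             # assumed headroom below full scale
signal_dbm = -100.0                          # antenna sensitivity from the post
gain_db = (fs_dbm - backoff_db) - signal_dbm # required front-end gain

noise_floor_dbm = KTB_DBM_PER_HZ + 10 * math.log10(25e3)  # 25 kHz bandwidth
print(f"gain ~ {gain_db:.0f} dB, thermal floor ~ {noise_floor_dbm:.0f} dBm")
```

    The thermal floor in 25 kHz sits around -130 dBm, so the -100 dBm signal has roughly 30 dB of margin over thermal noise before amplifier noise figure is accounted for; in practice the gain would be split across stages and the ADC's SNR would be folded into the cascade analysis.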