Latency of a frequency detection algorithm

I'd like to take a 1-100 kHz mixed-frequency signal and use this DSP to detect the presence of a specific (user-modifiable) frequency.
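For context, here's a rough sketch of the detection step I have in mind, with placeholder numbers (200 kHz sample rate, 1024-sample window, 40 kHz target), not a final design. Computing a single DFT bin directly, as below, gives the same value as inspecting that one bin of a full FFT:

```c
/* Rough sketch only: the sample rate, window length, and target
 * frequency are placeholders.  A single DFT bin computed directly
 * equals the corresponding bin of a full N-point FFT. */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define FS 200000.0   /* assumed sample rate, > 2 x 100 kHz */
#define N  1024       /* assumed analysis window length */

/* Magnitude of DFT bin k over an n-sample window x[]. */
static double bin_magnitude(const double *x, int n, int k)
{
    double re = 0.0, im = 0.0;
    for (int i = 0; i < n; ++i) {
        double w = 2.0 * M_PI * k * i / n;
        re += x[i] * cos(w);
        im -= x[i] * sin(w);
    }
    return sqrt(re * re + im * im);
}

int main(void)
{
    double target_hz = 40000.0;              /* user-modifiable target */
    int k = (int)(target_hz * N / FS + 0.5); /* nearest FFT bin */

    double x[N];
    for (int i = 0; i < N; ++i)              /* synthetic test tone */
        x[i] = sin(2.0 * M_PI * target_hz * i / FS);

    printf("bin %d magnitude = %f\n", k, bin_magnitude(x, N, k));
    return 0;
}
```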

I'm wondering what kind of latency I should expect from the A-to-D conversion plus this DSP performing an FFT on a signal in that frequency range.
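For my own back-of-envelope estimate, the acquisition window itself (N samples at the sample rate) looks like the dominant term, with the FFT compute time on top. A quick check of that arithmetic, again with an assumed 200 kHz sample rate and an assumed 1024-point window:

```c
/* Back-of-envelope latency terms (placeholder numbers): the total is
 * roughly acquisition window + FFT compute + decision logic, and only
 * the first term is fixed by the math. */
#include <stdio.h>

int main(void)
{
    double fs = 200e3;  /* assumed sample rate, > 2 x 100 kHz (Nyquist) */
    int    n  = 1024;   /* assumed FFT length */

    double window_ms     = 1e3 * n / fs;  /* time to fill the buffer: 5.12 ms */
    double resolution_hz = fs / n;        /* bin spacing that window buys: ~195 Hz */

    printf("acquisition window: %.2f ms\n", window_ms);
    printf("bin resolution:     %.1f Hz\n", resolution_hz);
    return 0;
}
```

If that's right, the window length is the main knob: a shorter window cuts latency but coarsens the bin spacing, which matters since the target frequency is user-modifiable.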