Let's imagine I'm digitizing a 1 MHz signal with the ADS8422 (4 MSPS, 16-bit).
I therefore need an anti-aliasing filter with a gain of 0 dB at 1 MHz and, to cover the ADC's dynamic range, a gain of -97 dB at 2 MHz.
A filter meeting that spec would be something like a 10th-order Chebyshev filter, which would itself introduce a quite significant amount of noise and error.
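To sanity-check that order figure, here is a quick sketch using the standard Chebyshev type I order formula (the 3 dB passband-ripple spec is my assumption, not from the numbers above):

```python
import math

def cheby1_order(gpass_db, gstop_db, fp, fs):
    """Minimum Chebyshev type I order giving at most gpass_db ripple up to fp
    and at least gstop_db attenuation at fs (analog prototype; only the
    ratio fs/fp matters, so plain Hz is fine)."""
    eps_p = 10 ** (gpass_db / 10) - 1   # passband ripple term
    eps_s = 10 ** (gstop_db / 10) - 1   # stopband attenuation term
    return math.ceil(math.acosh(math.sqrt(eps_s / eps_p)) / math.acosh(fs / fp))

# passband edge 1 MHz (assumed 3 dB ripple), -97 dB at 2 MHz:
print(cheby1_order(3, 97, 1e6, 2e6))  # -> 10
```

scipy.signal.cheb1ord gives the same result if you prefer a library call.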
This might be an extreme example, but it illustrates my question quite well:
Where is the point at which an anti-aliasing filter does more harm than good?
Is there a rule of thumb for the maximum usable filter order from a signal point of view?
Or does it really come down to calculating the sweet spot between ADC errors growing and filter errors shrinking as the ADC speed increases?
In addition, does the filter really have to attenuate all the way down to the ADC's full dynamic range? If not, where is the sweet spot for that?
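To make the trade-off concrete: at 4 MSPS, only content above fs - 1 MHz = 3 MHz can fold back into the 0-1 MHz band of interest, so one could argue the -97 dB point only needs to sit at 3 MHz rather than at Nyquist. A rough comparison using the Chebyshev type I order formula (again assuming a 3 dB ripple passband edge at 1 MHz, which is my assumption):

```python
import math

# 97 dB stopband attenuation, 3 dB passband ripple (assumed):
eps = math.sqrt((10 ** 9.7 - 1) / (10 ** 0.3 - 1))

# Stopband edge at Nyquist (2 MHz) vs. at the lowest frequency that can
# actually alias into the 0-1 MHz band (4 MHz - 1 MHz = 3 MHz):
for ws_mhz in (2.0, 3.0):
    n = math.ceil(math.acosh(eps) / math.acosh(ws_mhz / 1.0))
    print(f"stopband edge {ws_mhz} MHz -> Chebyshev order {n}")
# -> order 10 at 2 MHz, order 7 at 3 MHz
```

So relaxing the stopband edge to the first frequency that aliases in-band already saves a few filter orders, which is part of what I'm asking about.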
Thanks in advance.
Best regards