I have an application that requires sampling at a specific multiple of the fundamental signal frequency. I understand that with a master clock input of 2.048 MHz (a period of roughly 488 ns) I will get sample rates of 32000, 16000, 8000, etc., and that I can change the sample rate by changing the master clock frequency.
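As a quick sanity check on those rates, here is a sketch under my assumption that the output data rate is simply the master clock divided by a fixed decimation ratio (64 for the 32k mode, 128 and 256 for the lower modes); the exact ratios are my reading, not quoted from the datasheet:

```python
# Assumed fixed clock-divider ratios per data-rate mode (my assumption).
f_clk = 2.048e6  # master clock in Hz (~488 ns period)

rates = [f_clk / div for div in (64, 128, 256)]
print(rates)  # 32000.0, 16000.0, 8000.0 samples/s
```

This reproduces the 32000/16000/8000 progression I described above.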
Looking at the ADS131 datasheet, it states that the minimum period for the master clock is 444ns and maximum is 588ns. What will happen if I run above the maximum period of 588ns? I need to sample a 50Hz signal at a sampling rate of 25,600 samples per second (50 Hz * 512 samples per cycle), and it appears the only way to do this would be to set the ADC into 32k sample mode, and input a master clock period of 610ns.
Here is how I derived the master clock period for my sampling rate:
tclk = 488 ns / (1 - (ideal sample rate - target sample rate) / ideal sample rate), which simplifies to tclk = 488 ns * (ideal sample rate / target sample rate)
tclk = 488 ns * (32000 / 25600) = 488 / (1 - (32000 - 25600)/32000) = 610 ns
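The derivation above can be sketched in a few lines; this assumes (as I did) that the modulator rate scales directly with fCLK, so the required period is just the nominal period scaled by the ratio of ideal to target sample rate:

```python
# Scale the nominal master clock period by (ideal rate / target rate).
ideal_rate = 32000.0    # samples/s in 32k mode at the nominal 2.048 MHz clock
target_rate = 25600.0   # 50 Hz * 512 samples per cycle
nominal_tclk_ns = 1e9 / 2.048e6  # ~488.28 ns

tclk_ns = nominal_tclk_ns * (ideal_rate / target_rate)
f_clk_hz = 1e9 / tclk_ns
print(round(tclk_ns, 2), f_clk_hz)  # ~610.35 ns -> 1.6384 MHz
```

So the master clock I would need is about 1.6384 MHz, whose 610 ns period is what falls outside the 444-588 ns range in the datasheet.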
I have tried running the eval board at this rate and have not seen any adverse effects. Please advise if there is something I am missing. Thank you.