
How to determine the sampling rate and sampling period

Dear All,

I have to sample voltage and current signals in a practical project.

The fundamental (mains) frequency is 50 Hz and the highest frequency of interest is 500 Hz.

I want to:

- display the voltage and current waveforms on an LCD

- calculate the average power (i.e. the integral of voltage multiplied by current over a period; a rough sketch of this is below the list)

- calculate the RMS of the fundamental and of some harmonics in the signal

I also need to filter the signals.
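
Roughly, once the samples are in buffers, the power and RMS computations I have in mind look something like this C sketch (only an outline; the function names and the assumption that each buffer covers a whole number of 50 Hz periods are mine, nothing is decided yet):

    #include <math.h>

    /* Average (active) power over one buffer: P = (1/N) * sum of v[k]*i[k],
       assuming the buffer spans a whole number of fundamental (50 Hz) periods. */
    float average_power(const float *v, const float *i, int n)
    {
        float acc = 0.0f;
        for (int k = 0; k < n; k++)
            acc += v[k] * i[k];
        return acc / (float)n;
    }

    /* RMS over one buffer: sqrt((1/N) * sum of x[k]^2). */
    float rms(const float *x, int n)
    {
        float acc = 0.0f;
        for (int k = 0; k < n; k++)
            acc += x[k] * x[k];
        return sqrtf(acc / (float)n);
    }

For the harmonic RMS values I would first isolate each harmonic (for example with a DFT bin or a band-pass filter) and then apply the same rms() routine.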

I think I have to sample at 50 kHz per channel (100 times the highest frequency) to display the waveforms properly. Is that correct?

What sampling period is required for calculating the RMS values?

Is there a general principle for making these choices?

With best regards,

Ras

  • Hello Ras!

    For your task, a sampling frequency of 50 kHz (100 samples per period of the highest frequency of interest) is quite sufficient.

    In the general case, it all depends on your particular problem. If your measured signal has a maximum frequency of 500 Hz and you want to get the spectrum of this signal, then for the Fast Fourier Transform it is enough to have two samples per period of the maximum frequency (this corresponds to the Nyquist criterion), i.e. your sampling period should be no more than 1/(2 x 500 Hz) = 1 ms. But if you want to build a digital oscilloscope, this sampling frequency is of course not enough, because the higher the sampling frequency, the more precisely you capture the shape of your waveform. Therefore, when you want to determine the sampling frequency and the duration of the capture, you should look for a compromise between your RAM resources and your needs.
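
    As a rough illustration of that compromise, you can estimate the buffer RAM needed for a given sampling frequency and capture length like this (the ADC word size, channel count and capture length below are only example assumptions, not values from your project):

        #include <stdio.h>
        #include <stdint.h>

        int main(void)
        {
            const uint32_t fs_hz        = 50000u;  /* sampling frequency per channel      */
            const uint32_t channels     = 2u;      /* voltage + current                   */
            const uint32_t bytes_sample = 2u;      /* e.g. a 12-bit ADC stored in 16 bits */
            const float    capture_s    = 0.02f;   /* one 50 Hz period                    */

            uint32_t samples_per_ch = (uint32_t)(fs_hz * capture_s + 0.5f);
            uint32_t ram_bytes      = samples_per_ch * channels * bytes_sample;

            printf("%lu samples per channel, %lu bytes of buffer RAM\n",
                   (unsigned long)samples_per_ch, (unsigned long)ram_bytes);
            return 0;
        }

    With these example numbers each 20 ms capture costs about 4 KB of RAM, so the limiting factor is usually how long a record you want to keep rather than the 50 kHz rate itself.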

    Regards,

    Igor