Dear All,
I need to sample voltage and current signals in a practical project.
The fundamental frequency is 50 Hz and the highest frequency of interest is 500 Hz.
I want to
- show the voltage and current waveforms on an LCD
- calculate the average power, i.e. the time average of voltage multiplied by current (see the sketch after this list)
- calculate the RMS of the fundamental and of some harmonics in the signal
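For the power and total RMS parts, this is roughly what I have in mind (a minimal C sketch; the buffer names v[], i[] and the length n are just placeholders, and the window is assumed to span a whole number of 50 Hz cycles):

    #include <math.h>

    /* v[] and i[] are placeholder names for n simultaneous voltage and
       current samples covering a whole number of 50 Hz cycles. */
    double average_power(const double v[], const double i[], int n)
    {
        double acc = 0.0;
        for (int k = 0; k < n; k++)
            acc += v[k] * i[k];       /* instantaneous power p = v * i */
        return acc / n;               /* mean of p over the window */
    }

    double rms(const double x[], int n)
    {
        double acc = 0.0;
        for (int k = 0; k < n; k++)
            acc += x[k] * x[k];
        return sqrt(acc / n);         /* RMS = square root of the mean square */
    }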
I need to filter the signals.
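If a digital low-pass after the ADC is acceptable (I am not sure yet whether an analog anti-aliasing filter in front of the ADC is enough on its own), I was thinking of something simple like a first-order IIR; the cutoff and sample rate below are only example parameters, not final choices:

    /* Simple first-order IIR low-pass: y += alpha * (x - y),
       called once per sample. */
    typedef struct { double alpha, y; } lp1_t;

    void lp1_init(lp1_t *f, double fc_hz, double fs_hz)
    {
        const double PI = 3.14159265358979323846;
        double rc = 1.0 / (2.0 * PI * fc_hz);   /* equivalent RC constant */
        double dt = 1.0 / fs_hz;                /* sample period */
        f->alpha = dt / (rc + dt);
        f->y = 0.0;
    }

    double lp1_step(lp1_t *f, double x)
    {
        f->y += f->alpha * (x - f->y);
        return f->y;
    }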
I think I have to sample at 50 kHz per channel (100 times the highest frequency of interest) to show the waveforms properly. Is that right?
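My reasoning in numbers (the candidate rates below are just examples I have been considering):

    #include <stdio.h>

    /* Quick check of what different sample rates give for a 50 Hz
       fundamental and a 500 Hz highest component of interest. */
    int main(void)
    {
        const double f0 = 50.0, fmax = 500.0;
        const double fs[] = { 2000.0, 10000.0, 50000.0 };  /* example rates, Hz */

        printf("Nyquist minimum: %.0f Hz\n", 2.0 * fmax);
        for (int k = 0; k < 3; k++)
            printf("fs = %6.0f Hz -> %4.0f pts per 500 Hz cycle, %6.0f pts per 50 Hz cycle\n",
                   fs[k], fs[k] / fmax, fs[k] / f0);
        return 0;
    }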
Over what period do I need to sample to calculate the RMS values?
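For the harmonic RMS values, one approach I have seen is the Goertzel algorithm run over a window that spans a whole number of 50 Hz cycles, so that 50 Hz and its harmonics land exactly on DFT bins. A rough sketch of what I mean (it assumes the target frequency falls exactly on a bin of the window):

    #include <math.h>

    /* RMS of the component at f_target (e.g. 50, 150, 250 Hz) in the
       buffer x[0..n-1], sampled at fs. Assumes n spans a whole number
       of cycles of f_target so that it falls exactly on a DFT bin. */
    double harmonic_rms(const double x[], int n, double fs, double f_target)
    {
        const double PI = 3.14159265358979323846;
        double w = 2.0 * PI * f_target / fs;
        double coeff = 2.0 * cos(w);
        double s1 = 0.0, s2 = 0.0;        /* Goertzel state */

        for (int k = 0; k < n; k++) {
            double s = x[k] + coeff * s1 - s2;
            s2 = s1;
            s1 = s;
        }
        double mag2 = s1 * s1 + s2 * s2 - coeff * s1 * s2;  /* |DFT bin|^2 */
        double peak = 2.0 * sqrt(mag2) / n;   /* peak amplitude of that component */
        return peak / sqrt(2.0);              /* RMS of a sinusoid = peak / sqrt(2) */
    }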
And in general, is there a rule of thumb for choosing these figures?
With best regards,
Ras