Other Parts Discussed in Thread: C2000WARE
Hello!
I am designing a power supply and using the F2837xD for control. I need to achieve a maximum voltage ripple of 100mV for an output voltage of 6kV.
For an ADC to achieve the voltage resolution required, the following holds: ADC resolution = 6 kV / 100 mV = 60,000 levels. As far as I remember, this DSP's ADCs are 12 bits maximum.
An ADC with 12 bits gives 2^12 = 4096 levels, but I would require an ADC with at least 16 bits to get 65,536 levels.
A 12-bit ADC is therefore not a feasible way of measuring the cathode voltage to an accuracy of 100 mV, as it cannot realistically resolve 60,000 levels.
With this series of DSP, can anyone recommend a way to achieve this accuracy? Is there a way to use multiple 12-bit ADCs, for example triggered in sequence? Each ADC has 8 channels, and there are 4 ADCs on the device. There must be some way to get past this limitation of standard feedback control when the DSP has so many channels available for sampling, but I can't work out the best way to trigger them. Can timers be used to stagger the ADCs and oversample to this level of accuracy?
Ideally I would like to push beyond 60,000 levels and 100 mV ripple, to perhaps even 10 mV, which would require 600,000 levels, but I may be being optimistic there?
Any feedback is appreciated!
Regards,
Joel