Hi,
I am using an eZdsp 2812 to control a DC-DC converter and am trying to run it at a 100 kHz switching frequency. I need to sample three analog voltages in the T1 period interrupt and calculate the desired duty cycle from those values. I used source-level single-stepping to measure the time cost of each statement; the results are shown below. The T1 clock runs at 1/4 of the system clock.
IL = (AdcRegs.RESULT0 >> 4) + (AdcRegs.RESULT1 >> 4);  // 17 T1CNT counts, 65 CPU cycles
IL = IL * ADCslope;                                    // 14 T1CNT counts, 55 CPU cycles
vin = AdcRegs.RESULT2 >> 4;                            //  7 T1CNT counts, 23 CPU cycles
vin = vin * ADCinVol;                                  //  9 T1CNT counts, 34 CPU cycles
vout = AdcRegs.RESULT3 >> 4;                           //  4 T1CNT counts, 23 CPU cycles
vout = vout * ADCoutVol;                               //  8 T1CNT counts, 34 CPU cycles
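For context, here is a simplified sketch of how these statements sit inside the T1 period ISR. This is not my exact code: the header name, ISR name, interrupt housekeeping, and the omitted duty-cycle calculation are just representative.

#include "DSP281x_Device.h"              // standard F2812 register definitions (assumed header)

extern float IL, vin, vout;              // all data is declared extern float
extern float ADCslope, ADCinVol, ADCoutVol;  // scaling constants

interrupt void T1_period_isr(void)       // runs on the EV-A Timer 1 period event
{
    IL   = (AdcRegs.RESULT0 >> 4) + (AdcRegs.RESULT1 >> 4);
    IL   = IL * ADCslope;
    vin  = AdcRegs.RESULT2 >> 4;
    vin  = vin * ADCinVol;
    vout = AdcRegs.RESULT3 >> 4;
    vout = vout * ADCoutVol;

    // ... duty-cycle calculation and compare-register update go here ...

    EvaRegs.EVAIFRA.bit.T1PINT = 1;      // clear the T1 period interrupt flag
    PieCtrlRegs.PIEACK.all = 0x0002;     // acknowledge PIE group 2 (INT2)
}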
I noticed that even a statement containing only a single right shift costs about 23 CPU cycles, and the six statements above already add up to roughly 234 CPU cycles, which is a significant fraction of the cycles available in one 100 kHz switching period. What can I do to improve the computation speed?
P.S.: The test was done with the program loaded into internal RAM and the board connected through JTAG. All of the variables are defined as extern float.