Regarding Delay function

Hi,

I require about a 10-second delay in my main function.

Right now I am using these lines at the start of my main function.

/****** lines used in main function **************/

void main(void)
{
    CpuTimer0Regs.TCR.bit.TSS = 0;               // start the timer
    while (CpuTimer0Regs.TCR.bit.TIF == 0);      // wait for 5 sec

    InvInit();

    EALLOW;
    SysCtrlRegs.PCLKCR0.bit.TBCLKSYNC = 1;
    EDIS;

    while (1)
    {
    }
}

/*****************/

But what I am observing is that the delay I have configured does not happen when the controller is powered on.

It happens only after the EPWM signals have already started.

But we require the delay before the EPWM generation itself.

Please advise.

  • Hi Ramakrishna,

    From the code you shared above, it is difficult to identify the exact problem. If you share more of the code related to the timer initialization and usage, it would help me understand the problem better.

    Also, you can use the function DELAY_US(uSec) to generate the desired delay by adding the DeviceX_usDelay.asm file to your project.
    You can refer to the controlSUITE example projects for detailed info about DELAY_US() usage.
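
    For illustration, a power-up delay placed at the very top of main(), before any PWM-related initialization, might look like the sketch below (DELAY_US is the macro mentioned above; InvInit() is taken from your code, and the 10 s value is just an example):

        void main(void)
        {
            // note: the clock/PLL setup must already match CPU_RATE before DELAY_US is used
            DELAY_US(10000000);     // ~10 s delay; the argument is in microseconds

            InvInit();              // EPWM setup/start happens only after the delay

            EALLOW;
            SysCtrlRegs.PCLKCR0.bit.TBCLKSYNC = 1;
            EDIS;

            while (1)
            {
            }
        }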

    -Aditya
  • For your info, I'm sharing the code I have used:

    void InitTimers(void)
    {
        #define DELAY_5_SEC_COUNT 5000000
        #define SYS_CLOCK_FREQ    60

        InitCpuTimers();

        ConfigCpuTimer(&CpuTimer0, SYS_CLOCK_FREQ, DELAY_5_SEC_COUNT);
        CpuTimer0Regs.TCR.all = 0x4001;
    }
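
    For context, the polling loop in main() then pairs with this configuration roughly as sketched below (a sketch only; CpuTimer0 and ConfigCpuTimer() come from the standard TI CPU-timer support files):

        InitTimers();                             // CpuTimer0 configured for a 5 s period

        CpuTimer0Regs.TCR.bit.TSS = 0;            // TSS = 0: start the timer
        while (CpuTimer0Regs.TCR.bit.TIF == 0)    // busy-wait until the period expires
        {
        }
        CpuTimer0Regs.TCR.bit.TIF = 1;            // clear the overflow flag (write 1 to clear)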
  • Hi Ramakrishna,

    Which target device are you using?
    Have you implemented a timer interrupt handler in your project?

    -Aditya
  • No,
    I don't want any timer interrupt.
  • Did you try the DELAY_US(uSec) macro?
    Also, note that you can use ten such calls with a 1 s delay each, as in the sketch below. Let us know whether that works.
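
    A minimal sketch of that (DELAY_US as provided by the device support files):

        Uint16 i;
        for (i = 0; i < 10; i++)
        {
            DELAY_US(1000000);   // ten consecutive 1 s delays, ~10 s in total
        }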
  • Yes, I am using it.
    But my doubt is:

    Suppose I want a 5-second delay.
    What value do I need to pass to the function DELAY_US(uSec)?

    I didn't understand the calculation inside the function.
    Please explain.
  • Hi,

    Suppose I want a 5-second delay.
    What value do I need to pass to the function DELAY_US(uSec)?

    You should pass the value 5000000 for a 5-second delay,

    like DELAY_US(5000000)


    -Aditya

  • Yes, I have passed that, and I have toggled a GPIO as well.

    But the GPIO is not toggling.

    Anyhow, I will try one more time.
    Can you please elaborate on the calculation of how it gives 5 seconds?
    Please.
  • Hi,

    The thread below may help with your query.
    e2e.ti.com/.../82437

    -Aditya
  • Yeah, what I am saying is:
    suppose I have to pass 5000000 to the function,

    then the calculation would be [((5000000*1000*(10^6))/(16.67*(10^6)))-(9*(10^6))]/[5*(10^6)].

    How will this value be equal to 5 seconds?

    I mean, how do I need to convert it after calculating the value?

    This is what I didn't understand.
  • Hi,

    then the calculation would be [((5000000*1000*(10^6))/(16.67*(10^6)))-(9*(10^6))]/[5*(10^6)].

    How will this value be equal to 5 seconds?

    The above calculation provides the LoopCount for the delay function in the ...usDelay.asm file.

    When you pass the value 5000000 to the DELAY_US macro, it is converted to the LoopCount value for the delay function in the ...usDelay.asm file.

    For the LoopCount calculation,

    along with 5000000, the CPU_RATE is taken into account. CPU_RATE depends on your target device.

    For one of the targets, the ...usDelay.asm file looks like this:

           .def _DSP28x_usDelay
           .sect "ramfuncs"
    
            .global  __DSP28x_usDelay
    _DSP28x_usDelay:
            SUB    ACC,#1
            BF     _DSP28x_usDelay,GEQ    ;; Loop if ACC >= 0
            LRETR 
    
    ;There is a 9/10 cycle overhead and each loop
    ;takes five cycles. The LoopCount is given by
    ;the following formula:
    ;  DELAY_CPU_CYCLES = 9 + 5*LoopCount
    ; LoopCount = (DELAY_CPU_CYCLES - 9) / 5
    ; The macro DELAY_US(A) performs this calculation for you
    ;

    For more clarification, please look at the ...usDelay.asm file for your target in your project.
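
    For reference, the conversion itself is done by the DELAY_US macro, which typically sits in the device's ..._Examples.h header and looks roughly like this (a sketch; check the header in your own project for the exact definition):

        #define CPU_RATE     16.667L    // ns per SYSCLKOUT cycle for a 60 MHz device
        #define DELAY_US(A)  DSP28x_usDelay(((((long double)(A) * 1000.0L) / (long double)CPU_RATE) - 9.0L) / 5.0L)

    So the argument in microseconds is first turned into CPU cycles (A * 1000 ns divided by CPU_RATE ns per cycle), then the 9-cycle overhead is subtracted, and the result is divided by 5 cycles per loop to get the LoopCount passed into DSP28x_usDelay.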

    -Aditya

  • Yes sir,
    for my target device the CPU_RATE is 16.67L.

    So, how is the number 5000000 related to 5 seconds?

    For example, the calculated value from the calculation

    [((5000000*1000*(10^6))/(16.67*(10^6)))-(9*(10^6))]/[5*(10^6)] = y,

    how are y, LoopCount, DELAY_CPU_CYCLES, and 5 seconds related?

    I didn't understand this part.
  • [((5000000*1000)/(16.67)) - (9)] / [5] = y = 59988000

    The LoopCount of 59988000 is what the delay function uses to produce the 5-second delay.

    That means, for your target,
    a LoopCount of 1 generates a delay of (5/59988000) seconds; i.e., your target CPU takes (5/59988000) seconds to execute one loop of the delay function.

    The number of CPU cycles the delay function takes per loop is known (constant); only the LoopCount differs for different targets and for different delay times requested by the user.

    I hope you understand the calculation.

    -Aditya

  • If you are still not clear on the above calculation,
    please calculate the number of CPU machine cycles needed to execute one loop of the delay function (DSP28x_usDelay). Then convert those CPU machine cycles to time at your CPU frequency. The result should match (5/59988000) seconds per loop once the 9/10-cycle overhead is taken into account; see the check below.
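
    A quick check of that arithmetic for the 5 s example (using 16.67 ns per cycle, i.e. roughly a 60 MHz SYSCLKOUT):

        DELAY_CPU_CYCLES = 9 + 5 * 59988000 = 299940009 cycles
        299940009 cycles * 16.67 ns/cycle ≈ 5.0 seconds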
    -Aditya
  • Also note that

    Use one of the values provided, or define your own. The trailing L is required; it tells the compiler to treat the number as a 64-bit (long double) value.

    Example

    #define CPU_RATE   16.667L   // for a 60MHz CPU clock speed (SYSCLKOUT)
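
    Other values commonly listed (commented out) in the TI example headers, shown here for illustration only; confirm against the header for your own device:

    //#define CPU_RATE    6.667L   // for a 150MHz CPU clock speed (SYSCLKOUT)
    //#define CPU_RATE   10.000L   // for a 100MHz CPU clock speed (SYSCLKOUT)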

    -Aditya

  • Ok,

    still not understood, sir.

    Suppose I want a delay of 1 second; how do we calculate the value "uSec" to pass into the function DELAY_US(uSec)?

    Probably this way I will understand quickly.

  • And let me say that you are explaining nicely.
    Please do not get frustrated with my doubts.
    As I am new to this, I was unable to understand.
  • Hi,

    DELAY_US(uSec)
    Here the argument uSec is in microseconds.

    For example, if you want a 1 second delay, then you need to pass the value 1000000 microseconds, i.e., DELAY_US(1000000)
    i.e., 1 second = 1000000 microseconds

    If you want a 10 millisecond delay, then you should pass the value 10000,
    i.e., 10 milliseconds = 10000 microseconds

    If you want a 100 microsecond (uSec) delay, then you should pass the value 100,
    i.e., 100 uSec = 100 microseconds

    So the argument value must be in microseconds to get the desired delay.
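
    If it helps, you can also wrap the unit conversion in your own helper macros (hypothetical names, built on top of DELAY_US):

        #define DELAY_MS(ms)    DELAY_US((ms) * 1000L)      // milliseconds
        #define DELAY_SEC(s)    DELAY_US((s) * 1000000L)    // seconds

        DELAY_SEC(5);   // identical to DELAY_US(5000000), i.e. a ~5 s delay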

    -Aditya

    No problem. We are here to learn and to share our knowledge. :)
    -Aditya
  • Ok, then why is the GPIO not toggling in my code?


    I have a code snippet like this:

    DELAY_US(5000000);
    GpioDataRegs.GPATOGGLE.bit.GPIO17 = 1;

    I should see a square wave toggling every 5 seconds on the scope, right?
    It is not happening.
  • Hi,

    DELAY_US(5000000);
    GpioDataRegs.GPATOGGLE.bit.GPIO17 =1;

    Have you called the above code inside a forever loop?

    Like below?

    for(;;)
    {
    DELAY_US(5000000);
    GpioDataRegs.GPATOGGLE.bit.GPIO17 =1;
    }

    If not, please do so and check the GPIO status.

    -Aditya

  • Yes, I have used it that way.

    Still, the function is not working at all.
    I am sharing the code snippet I am using.

    void Init(void)
    {
        InitSysCtrl();

        InitEPwm1Gpio();
        InitEPwm2Gpio();

        DELAY_US(5000000);
        GpioDataRegs.GPATOGGLE.bit.GPIO17 = 1;

        DisableDCDCPWM2();
        EnableDCDCPWM1();

        // DisableDCDCPWM1();
        // DisableDCDCPWM2();
        DINT;

        InitPieCtrl();

        IER = 0x0000;
        IFR = 0x0000;

        InitPieVectTable();

        EALLOW;
        PieVectTable.ADCINT1 = &adc1_Isr;
        // PieVectTable.ADCINT2 = &adc2_Isr;
        // PieVectTable.EPWM1_INT = &epwm1_isr;
        // PieVectTable.EPWM2_INT = &epwm2_isr;
        // PieVectTable.TINT0 = &cpu_timer0_isr;
        // PieVectTable.TINT0 = &FiveSecTimerISR;
        EDIS;

        EALLOW;
        SysCtrlRegs.PCLKCR0.bit.TBCLKSYNC = 0;
        EDIS;

        InitEPwm1();
        InitEPwm2();
        InitAdc1();
        InitTimers();
        InitAdc2();

        EALLOW;
        GpioCtrlRegs.GPAMUX2.bit.GPIO17 = 0;   // 0=GPIO, 1=TZ3, 2=LINTX-A, 3=SPICLK-B
        GpioCtrlRegs.GPADIR.bit.GPIO17 = 1;    // 1=output, 0=input
        GpioDataRegs.GPACLEAR.bit.GPIO17 = 1;

        GpioCtrlRegs.GPAMUX1.bit.GPIO14 = 0;   // 0=GPIO, 1=TZ3, 2=LINTX-A, 3=SPICLK-B
        GpioCtrlRegs.GPADIR.bit.GPIO14 = 1;    // 1=output, 0=input
        GpioDataRegs.GPACLEAR.bit.GPIO14 = 1;
        EDIS;

        // Enable CPU INT1, which is connected to ADCINT1 (PIE group 1):
        IER |= M_INT1;

        EnablePWMInterrupts();

        EALLOW;
        PieCtrlRegs.PIECTRL.bit.ENPIE = 1;     // Enable the PIE block
        PieCtrlRegs.PIEIER1.bit.INTx1 = 1;     // Enable INT 1.1 in the PIE for ADC
        EDIS;

        // Enable global interrupts and higher priority real-time debug events:
        EINT;   // Enable global interrupt INTM
        ERTM;   // Enable global real-time interrupt DBGM
    }

  • Hi,

    I am not able to find a repeated call of the code below anywhere in your shared code, so your GPIO17 is not toggling.

    DELAY_US(5000000);
    GpioDataRegs.GPATOGGLE.bit.GPIO17 =1;

    Also note that the statement GpioDataRegs.GPATOGGLE.bit.GPIO17 = 1 only changes the state of GPIO17 once.

    That is, if GPIO17 is HIGH, then after executing GpioDataRegs.GPATOGGLE.bit.GPIO17 = 1 it changes from HIGH to LOW. If you execute that statement again, GPIO17 changes from LOW to HIGH.

    GPIO17 will not toggle continuously if you execute the GpioDataRegs.GPATOGGLE.bit.GPIO17 = 1 statement only once; it has to be called repeatedly, as sketched below.
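
    For example, a minimal continuously toggling pattern would look like the sketch below (assuming GPIO17 is already configured as a GPIO output, as it is in your Init() function):

        Init();     // your existing initialization (GPIO17 mux = GPIO, dir = output)

        for (;;)
        {
            DELAY_US(5000000);                        // wait ~5 s
            GpioDataRegs.GPATOGGLE.bit.GPIO17 = 1;    // invert GPIO17 -> 5 s high, 5 s low
        }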

    -Aditya

  • Ok. I will call it as above.