TMS320F280039C: Passing a variable to DEVICE_DELAY_US() takes much longer than expected

Part Number: TMS320F280039C

Dear champs,

I am asking this for our customer.

The user has the same problem as the one described in

https://e2e.ti.com/support/microcontrollers/c2000-microcontrollers-group/c2000/f/c2000-microcontrollers-forum/1114528/tms320f280025c-passing-a-variable-to-device_delay_us

"

When I write DEVICE_DELAY_US(100), the timing is correct.

When I define a variable of any type (uint16_t, float, long double) and call DEVICE_DELAY_US(variable), the timing is wrong.

Looking at the macro definition:

#define DEVICE_DELAY_US(x) SysCtl_delay(((((long double)(x)) / (1000000.0L /  \
                              (long double)DEVICE_SYSCLK_FREQ)) - 9.0L) / 5.0L)

So, shouldn't whatever data type I input be converted to long double?

The timing always comes out to roughly 2.4 × 100 µs when I use a variable of any data type as the input.

"

But it's not clear what the solution is in that post.

Do you have any suggestions if the user wants to use a variable as the delay at run time?

That is, the user just wants to delay for 2.5 µs, 4 µs, 6 µs, and so on at run time.

But these delay times are variables.

Can they use DEVICE_DELAY_US(var)?

Any suggestions here?

  • Hi Wayne,

We can use DEVICE_DELAY_US(var) to provide the delay as a variable.

Please go through the i2c_ex4_eeprom_polling example in the c2000ware-release\driverlib\f28003x\examples\i2c folder for reference.

In that example, we pass the delay as a variable:

    uint16_t WriteCycleTime_in_us;      // target write cycle time. Depends on target.
    WriteCycleTime_in_us = 6000;
    DEVICE_DELAY_US(WriteCycleTime_in_us);

    Please make sure the value passed as the argument is the number of microseconds to delay.
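
    One note on the original observation: with a literal such as DEVICE_DELAY_US(100), the compiler can typically fold the long double arithmetic in the macro at compile time, whereas with a variable argument that 64-bit floating-point math runs in software at run time, and for short delays those extra cycles can dominate (which would explain the ~2.4x result, while being negligible for a 6000 µs delay). If that overhead matters, one option is to precompute the SysCtl_delay() loop count with integer arithmetic, using the same formula that the macro expands to. The snippet below is only an illustrative sketch, not taken from the referenced example; delayMicroseconds() is a hypothetical helper name, and it assumes DEVICE_SYSCLK_FREQ is a whole number of MHz (120 MHz on the F280039C) and us >= 1.

    #include "device.h"   // DEVICE_SYSCLK_FREQ, SysCtl_delay()

    // Sketch only: mirrors the DEVICE_DELAY_US() formula, loops = (cycles - 9) / 5,
    // where cycles = us * (DEVICE_SYSCLK_FREQ / 1 MHz), but in integer arithmetic
    // so no long double math runs at run time.
    static inline void delayMicroseconds(uint32_t us)
    {
        uint32_t cycles = us * (DEVICE_SYSCLK_FREQ / 1000000U);  // SYSCLK cycles to wait
        SysCtl_delay((cycles - 9U) / 5U);   // each SysCtl_delay() loop is ~5 cycles
    }

    Usage would then be, for example, delayMicroseconds(WriteCycleTime_in_us);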
    Thanks 
    Aswin