Other Parts Discussed in Thread: C2000WARE
When I write DEVICE_DELAY_US(100), the timing is correct.
When I declare a variable of any type (uint16_t, float, long double) and call DEVICE_DELAY_US(variable), the timing is wrong.
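For example, here is a minimal sketch of what I am doing (the function and variable names are just illustrative, and device.h is the driverlib example header that provides DEVICE_DELAY_US):

#include "device.h"                 // provides DEVICE_DELAY_US and DEVICE_SYSCLK_FREQ

void delay_test(void)               // illustrative test function
{
    // Correct: literal argument, measures ~100 us on the scope
    DEVICE_DELAY_US(100);

    // Wrong: same value passed through a variable, measures noticeably longer
    uint16_t delay_us = 100;        // same result with float or long double
    DEVICE_DELAY_US(delay_us);
}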
Looking at the macro definition:
#define DEVICE_DELAY_US(x) SysCtl_delay(((((long double)(x)) / (1000000.0L / \
(long double)DEVICE_SYSCLK_FREQ)) - 9.0L) / 5.0L)
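For reference, if I plug in a literal 100 and assume a 100 MHz DEVICE_SYSCLK_FREQ just for illustration (my actual clock may differ), the macro should reduce to:

SysCtl_delay(((100.0L / (1000000.0L / 100000000.0L)) - 9.0L) / 5.0L)
    = SysCtl_delay((10000.0L - 9.0L) / 5.0L)
    = SysCtl_delay(1998.2L)         // roughly 1998 iterations of the delay loop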
So, shouldn't whatever data type I input be converted to long double?
The measured delay always comes out to roughly 2.4x the requested 100 µs whenever I pass the value through a variable of any type instead of a literal.