Hello guys. My previous post was about generating delays in microseconds, but this is a more fundamental question, since I am a complete novice in this regard. For example, if I call SysCtlDelay(2000); in my program, does it produce a 2000 millisecond delay, or does it simply iterate for 3*2000 = 6000 cycles, as mentioned in the data sheet?
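From what I have read, SysCtlDelay seems to be nothing more than a tight countdown loop that burns roughly three CPU cycles per iteration. This is only my own C sketch of what I think it is equivalent to, not the actual library source (which I believe is written in assembly):

    #include <stdint.h>

    /* My understanding of what SysCtlDelay(count) boils down to:
     * a busy-wait loop costing about 3 system-clock cycles per pass. */
    void MyDelaySketch(uint32_t ui32Count)
    {
        while (ui32Count--)
        {
            /* empty body; only the decrement and branch consume time */
        }
    }

Is that understanding correct?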
Let's say I am setting the system clock with SysCtlClockSet(SYSCTL_SYSDIV_1 | SYSCTL_USE_PLL | SYSCTL_OSC_MAIN | SYSCTL_XTAL_16MHZ);
which gives me a value of 12.5 MHz if I call SysCtlClockGet().
So how many seconds or milliseconds will the delay last?
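If my reading of the data sheet is right, then SysCtlDelay(2000) at 12.5 MHz would last only 3 * 2000 / 12,500,000 s = 480 microseconds, which is nowhere near 2000 milliseconds. To get a delay expressed in milliseconds, I am guessing I would need a small wrapper like the one below (delay_ms is just my own name for it; please correct me if the idea is wrong):

    #include <stdint.h>
    #include <stdbool.h>
    #include "driverlib/sysctl.h"

    /* My guess at a millisecond delay helper: SysCtlClockGet() returns the
     * system clock in Hz, and SysCtlDelay() takes about 3 cycles per loop
     * count, so SysCtlClockGet()/3000 counts should be roughly 1 ms. */
    static void delay_ms(uint32_t ui32Ms)
    {
        SysCtlDelay((SysCtlClockGet() / 3000) * ui32Ms);
    }

Is this the right way to think about it?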
Regards,
Keyshav