
Understanding delays in TM4C123G TIVA 2.1

Hello guys. My previous post was about delays measured in microseconds, but this is a more fundamental question; I am a complete novice in this regard. For example, if I call SysCtlDelay(2000); in my program, does it produce a 2000 millisecond delay, or does it iterate for 3*2000 = 6000 cycles as mentioned in the data sheet?

Let's say I am setting the system clock as SysCtlClockSet(SYSCTL_SYSDIV_1 | SYSCTL_USE_PLL | SYSCTL_OSC_MAIN | SYSCTL_XTAL_16MHZ);

which gives me a value of 12.5 MHz if I call SysCtlClockGet().

So how many seconds or milliseconds will the delay last?

Regards,

Keyshav

  • Keyshav,

    The answer to your question can be found in SW-TM4C-DRL-UG, the User Guide for the TivaWare driver library.

    If you look at the function description for SysCtlDelay, you will see that each delay loop iteration is 3 instruction cycles long. So a SysCtlDelay(100000) at a 120MHz clock will take 2.5ms to return.

    SysCtlDelay is not especially precise; read the suggestions about using Timers in that same document.
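    The arithmetic above can be sketched on a host machine. This is a minimal illustration (the helper name `sysctl_delay_us` is my own, not a driverlib function): the duration of SysCtlDelay(count) is 3 * count cycles divided by the system clock.

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* Duration in microseconds of SysCtlDelay(count): each loop
       iteration takes 3 CPU cycles per the driverlib User Guide.
       64-bit intermediate avoids overflow for large counts. */
    static uint32_t sysctl_delay_us(uint32_t count, uint32_t clock_hz)
    {
        return (uint32_t)(((uint64_t)count * 3u * 1000000u) / clock_hz);
    }

    int main(void)
    {
        /* Bruno's example: SysCtlDelay(100000) at 120 MHz -> 2500 us (2.5 ms) */
        printf("%lu us\n", (unsigned long)sysctl_delay_us(100000u, 120000000u));
        /* Keyshav's case: SysCtlDelay(2000) at 12.5 MHz -> 480 us */
        printf("%lu us\n", (unsigned long)sysctl_delay_us(2000u, 12500000u));
        return 0;
    }
    ```

    So for the original question: SysCtlDelay(2000) at a 12.5 MHz system clock returns after roughly 480 microseconds, not 2000 milliseconds.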
  • Hello Bruno,

    When you mention 120MHz, I believe you are referring to the TM4C129x devices. On the TM4C129x devices there is a catch: due to the prefetch buffer structure, if SysCtlDelay and the calling function are in separate sections, then the call itself will incur a delay due to the wait states during the fetch of the code.
  • Amit,
    I just used a clock figure as an example, and yes, the 120MHz example applies only to TM4C129x devices.
    One of the TivaWare manual suggestions is in fact to use the ROM_ version, so that less uncertainty is introduced by the calling stage.
    Still, if one is depending on SysCtlDelay for serious timing purposes, I'm pretty sure the International Olympic Committee will not accept the outcome as an official World Record!
    Cheers
  • Hello Bruno,

    Yes, using the ROM_ version would give a predictable delay. But for a timing-sensitive protocol, a hardware timer would be a better choice. However, the code needs to be constructed such that the timer is managed correctly.
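    For the timer route, the key arithmetic is the load value: a down-counting GPTM clocked at the system clock reaches zero after (load + 1) ticks. Below is a host-side sketch of that computation only (the helper name `timer_load_for_us` is my own); on the target, the result would be passed to driverlib's TimerLoadSet() after configuring a one-shot timer with TimerConfigure().

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* Timer load value for a desired delay: load = delay_us * ticks_per_us - 1,
       since a down-counter loaded with N fires after N + 1 ticks.
       Host-side arithmetic only; no hardware access here. */
    static uint32_t timer_load_for_us(uint32_t delay_us, uint32_t clock_hz)
    {
        return (uint32_t)(((uint64_t)delay_us * clock_hz) / 1000000u) - 1u;
    }

    int main(void)
    {
        /* 1 ms at a 12.5 MHz system clock -> 12500 ticks -> load = 12499 */
        printf("%lu\n", (unsigned long)timer_load_for_us(1000u, 12500000u));
        return 0;
    }
    ```

    Unlike SysCtlDelay, a timer programmed this way keeps counting regardless of interrupts or compiler optimization of the delay loop.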
  • Keyshav Mor said:

    SysCtlClockSet(SYSCTL_SYSDIV_1 | SYSCTL_USE_PLL | SYSCTL_OSC_MAIN |SYSCTL_XTAL_16MHZ);

    which gives me a value of 12.5 MHz if I call SysCtlClockGet();

    Will it (really) give such a returned value?

    • Are your parameter values, "SYSCTL_SYSDIV_1 | SYSCTL_USE_PLL" proper?    Does not use of the PLL force a higher value upon SYSDIV?
    • And - there's inattention to detail here - what (real) benefit accrues from deviating from the 16MHz (alleged) board xtal?   (why seek 12.5MHz?)

    A scope and a GPIO toggled (ideally by an MCU Timer set to one-shot mode) provide a spectacular ability to monitor/verify the wide range of very usable delays - which persist under almost all MCU load conditions...

    Avoiding the (feared) complexity of the (simple) Timer set-up will not serve you well - long term.   The discipline which it enforces will likely speed, ease & enhance your further MCU learning experience...

  • Agreed on that cb1, especially given the plethora of timers on these devices.

    Robert
  • Friend Robert - might there be (almost) as many Timers as, "Mistaken understanding of deadband?"    (my head hurts...only a video can soothe)