Launchpad LM4F120 timing changing unexpectedly

I am working on a small project using  this Launchpad as a programmable timer.

I've found several times that making minor source-code changes would unexpectedly cause SysCtlDelay()-based delays to roughly double.

For one example, this snippet of code is in a function that clocks data out to a serial 3-digit 7-segment LED display. I copied the upper section and made the bit assignments relative to defined symbols (DDat, DClk) rather than absolute positions (<<3). If I #define SHIFT (enabling the old code), all times are what I expected. If I #undef SHIFT or comment out the #define line (enabling the new code), all delays (including those in other functions) are twice what they should be.

What could be causing this kind of effect?

    for (j=0; j<6; j++)
    {
#define SHIFT
#ifdef SHIFT
        GPIOPinWrite(GPIO_PORTA_BASE, DDat|DClk, ((i&1)<<3));
        SysCtlDelay(DLEDBitClk);
        GPIOPinWrite(GPIO_PORTA_BASE, DDat|DClk, ((i&1)<<3 | DClk));
        SysCtlDelay(DLEDBitClk);
        GPIOPinWrite(GPIO_PORTA_BASE, DDat|DClk, ((i&1)<<3));
        SysCtlDelay(DLEDBitClk);
#else
        GPIOPinWrite(GPIO_PORTA_BASE, DDat|DClk, ((i&1)?DDat:0));
        SysCtlDelay(DLEDBitClk);
        GPIOPinWrite(GPIO_PORTA_BASE, DDat|DClk, ((i&1)?DDat|DClk:DClk));
        SysCtlDelay(DLEDBitClk);
        GPIOPinWrite(GPIO_PORTA_BASE, DDat|DClk, ((i&1)?DDat:0));
        SysCtlDelay(DLEDBitClk);
#endif
        i>>=1;
    }
    SysCtlDelay(DLEDDigit);

  • Dave,

    What speed are you running the processor at? If you run it at 40 MHz, do you still see the different behavior? If so, it may be related to running the clock faster than the 40 MHz flash speed. I'll look into it and see if I can give you a more detailed answer tomorrow.

    If you need consistent timing for your application, a timer interrupt would be the best way to go.
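
    For reference, here is a minimal sketch of a 1-second periodic timer interrupt using the StellarisWare/TivaWare driverlib calls. The choice of Timer0A, the 1-second reload value, and the names Timer0Init/Timer0AIntHandler/g_ulTicks are just illustrative assumptions, and exact macro names can differ slightly between driverlib releases:

        #include "inc/hw_ints.h"
        #include "inc/hw_memmap.h"
        #include "driverlib/interrupt.h"
        #include "driverlib/sysctl.h"
        #include "driverlib/timer.h"

        volatile unsigned long g_ulTicks = 0;

        //
        // Timer0A ISR - add it to the vector table in the startup file,
        // or register it at run time with TimerIntRegister().
        //
        void Timer0AIntHandler(void)
        {
            TimerIntClear(TIMER0_BASE, TIMER_TIMA_TIMEOUT);  // acknowledge the timeout
            g_ulTicks++;                                     // one tick per period
        }

        void Timer0Init(void)
        {
            SysCtlPeripheralEnable(SYSCTL_PERIPH_TIMER0);
            TimerConfigure(TIMER0_BASE, TIMER_CFG_PERIODIC);           // full-width periodic mode
            TimerLoadSet(TIMER0_BASE, TIMER_A, SysCtlClockGet() - 1);  // 1-second period
            TimerIntEnable(TIMER0_BASE, TIMER_TIMA_TIMEOUT);
            IntEnable(INT_TIMER0A);
            IntMasterEnable();
            TimerEnable(TIMER0_BASE, TIMER_A);
        }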

  • Ah! Possibly the case. I believe (copied from an example project) I'm running at 50 MHz.

    These particular timings are not critical, just relative delays for bit timing. The timer countdown (1-second intervals, possibly 100 ms) will use a timer interrupt.

    This is the init code I copied:

        //
        // Set up the system clock to run at 50 MHz from PLL with crystal reference
        //
        SysCtlClockSet(SYSCTL_SYSDIV_4|SYSCTL_USE_PLL|SYSCTL_XTAL_16MHZ|SYSCTL_OSC_MAIN);

  • Dave,

    That code would get you running at 50 MHz. Change the SYSDIV divider to 5 to set the clock to 40 MHz and see if you get the same behavior.
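
    For example (assuming the same 16 MHz crystal and PLL source as above), the PLL output is effectively 200 MHz before the divider, so the system clock is 200 MHz / SYSDIV:

        //
        // Set up the system clock to run at 40 MHz (200 MHz / 5) from PLL with crystal reference
        //
        SysCtlClockSet(SYSCTL_SYSDIV_5|SYSCTL_USE_PLL|SYSCTL_XTAL_16MHZ|SYSCTL_OSC_MAIN);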

  • I did, and the times now run at something like 125% of what they did when it was running "right", so I think that was it.

    The weird part was the unexpected changes from tiny code mods.

    The new version probably (I haven't scoped out the asm yet) compiles to a conditional and precompiled constants. The old code included a shift and a logical operation. Not much, but apparently enough!
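
    As an aside, since SysCtlDelay() is nominally a 3-cycle loop, one way to keep delays roughly proportional to real time across clock changes is to derive the counts from SysCtlClockGet() instead of fixed constants. A sketch (the DelayMicroseconds name and the integer-division rounding are just illustrative, and the flash effect discussed above can still stretch the loop at 50 MHz):

        #include "driverlib/sysctl.h"

        //
        // SysCtlDelay() burns roughly 3 CPU cycles per loop iteration, so
        // loops = (cycles per second / 3) * seconds.
        //
        static void DelayMicroseconds(unsigned long ulUs)
        {
            SysCtlDelay((SysCtlClockGet() / 3000000) * ulUs);
        }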

    Thanks for the help!