I am working on a small project using this Launchpad as a programmable timer.
Several times now I've found that a minor source code change unexpectedly causes SysCtlDelay()-based delays to roughly double.
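(For reference, SysCtlDelay() is just driverlib's software busy-wait, documented at 3 CPU cycles per count, so the delay constants are derived along the lines of the sketch below. The helper name and the overflow note are illustrative, not from my actual code:)

    #include <stdint.h>
    #include "driverlib/sysctl.h"

    /* Illustrative only: convert microseconds to a SysCtlDelay() count,
       assuming the documented 3-cycles-per-count loop. Fine for small
       values of us; the intermediate product overflows for large ones. */
    static uint32_t usToDelayCount(uint32_t us)
    {
        return SysCtlClockGet() / 3u * us / 1000000u;
    }

so a nominal 10 us edge would be SysCtlDelay(usToDelayCount(10)).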
For one example, the snippet below is from a function that clocks data out to a serial 3-digit 7-segment LED display. I copied the upper section and rewrote the bit assignments in terms of defined symbols (DDat, DClk) rather than absolute bit positions (<<3). With SHIFT #defined (enabling the old code), all the delays are what I expect. With SHIFT #undefined, or the #define line commented out (enabling the new code), all the delays, including those in other functions, are twice what they should be.
What could be causing this kind of effect?
for (j = 0; j < 6; j++)
{
#define SHIFT                /* comment this out (or #undef) to enable the new code */
#ifdef SHIFT
    /* Old code: data bit placed with an absolute shift into bit 3. */
    GPIOPinWrite(GPIO_PORTA_BASE, DDat|DClk, ((i&1)<<3));          /* data out, clock low */
    SysCtlDelay(DLEDBitClk);
    GPIOPinWrite(GPIO_PORTA_BASE, DDat|DClk, ((i&1)<<3 | DClk));   /* clock high */
    SysCtlDelay(DLEDBitClk);
    GPIOPinWrite(GPIO_PORTA_BASE, DDat|DClk, ((i&1)<<3));          /* clock low again */
    SysCtlDelay(DLEDBitClk);
#else
    /* New code: the same writes expressed through the DDat symbol. */
    GPIOPinWrite(GPIO_PORTA_BASE, DDat|DClk, ((i&1) ? DDat : 0));
    SysCtlDelay(DLEDBitClk);
    GPIOPinWrite(GPIO_PORTA_BASE, DDat|DClk, ((i&1) ? DDat|DClk : DClk));
    SysCtlDelay(DLEDBitClk);
    GPIOPinWrite(GPIO_PORTA_BASE, DDat|DClk, ((i&1) ? DDat : 0));
    SysCtlDelay(DLEDBitClk);
#endif
    i >>= 1;
}
SysCtlDelay(DLEDDigit);
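(For completeness, the symbols the snippet assumes look like this. DDat has to be GPIO_PIN_3, i.e. 1<<3, for the two versions to drive the same pin; the DClk pin and the delay counts shown here are placeholders rather than my exact values:)

    #define DDat GPIO_PIN_3    /* serial data on PA3 -- matches the old (i&1)<<3 */
    #define DClk GPIO_PIN_2    /* serial clock -- actual pin assignment differs */
    #define DLEDBitClk 300     /* per-edge delay count -- placeholder value */
    #define DLEDDigit  3000    /* inter-digit delay count -- placeholder value */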