I have a DSP-server codec running along with two background tasks. Eventually I want a GPIO-driven interrupt to post the semaphore discussed below, but I have not written that code yet. Since nothing currently posts the semaphore, every pend should time out, so I'd like to take advantage of the situation to test the timeout path.
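For context, the eventual poster would look roughly like this (just a sketch, since none of it exists yet; gpioMsiIsr is a hypothetical name for a GPIO ISR routed through the HWI dispatcher so that SEM_post() is safe from interrupt context):

#include <std.h>
#include <sem.h>

/* Created statically in the .tcf, so referenced as an extern SEM_Obj */
extern SEM_Obj SEM_MSIready;

/* Hypothetical dispatcher-handled GPIO interrupt routine */
Void gpioMsiIsr(void)
{
    SEM_post(&SEM_MSIready);    /* release WaitForMSIReady() */
}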
I have a define:
#define MSI_Timeout 2    /* SEM_pend timeout, in system clock (CLK) ticks */
The semaphore SEM_MSIready is created by my server.tcf file:
bios.SEM.create("SEM_MSIready");
bios.SEM.instance("SEM_MSIready").comment = "MSI notifies collect of control";
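On the C side, I reference the statically created semaphore by declaring the object extern and taking its address as the handle (my understanding of how statically configured SEM objects are normally accessed; the declaration below is written out by hand):

#include <std.h>
#include <sem.h>

extern SEM_Obj SEM_MSIready;    /* created by the .tcf configuration above */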
WaitForMSIReady does the SEM_pend:
DSP_Status WaitForMSIReady(void)
{
    GT_0trace(tskMask, GT_1CLASS, "WaitForMSIReady\n");

    if (SEM_pend(&SEM_MSIready, MSI_Timeout) == 0)
    {
        // Timeout occurred
        return DSP_MSITimeout;
    }

    return DSP_Okay;
}
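For reference, DSP_Status is an ordinary typedef'd enum; only the two values used above matter here (the rest of my status codes are omitted, and the actual values/ordering may differ):

typedef enum {
    DSP_Okay,           /* pend succeeded, MSI is ready */
    DSP_MSITimeout      /* SEM_pend timed out */
    /* ...other status codes omitted... */
} DSP_Status;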
My code calls WaitForMSIReady:
if (WaitForMSIReady() == DSP_MSITimeout)
{
    GT_0trace(tskMask, GT_6CLASS, ">>>>Timed out waiting for MSI to reach control<<<<\n");
    return;
}
GT_0trace(tskMask, GT_1CLASS, "MSI successfully reached control\n");
I have placed a 16-second usleep() call in my app-side test code; the idea was that the delay would give me plenty of time to catch the timeout before I delete the codec and put the DSP back in reset. I never see either of the GT_0trace statements above, although a separate test trace statement works fine.
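The app-side test sequence is essentially this (a simplified sketch with the actual Codec Engine create/delete calls elided as comments; note that POSIX only guarantees usleep() for arguments under 1,000,000 us, although glibc accepts larger values):

#include <unistd.h>

int main(void)
{
    /* ...engine open / codec create calls go here... */

    /* Hold off tearing the codec down so the DSP-side WaitForMSIReady()
     * has ample time to hit its timeout and emit the trace. */
    usleep(16 * 1000 * 1000);   /* 16 seconds */

    /* ...codec delete / engine close; DSP goes back into reset... */

    return 0;
}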
DSP_Status is a typedef'd enum, and I'm able to return it from a test function and GT_1trace it, so the tracing itself appears to be working. I figure I'm missing some very basic initialization for the semaphore. Any hints are appreciated.