Microsecond Delay TI RTOS

Hi,

I am trying to create a 30 usec delay using the following routine, which I found as a reference in one of the archived threads.

#include <xdc/std.h>
#include <xdc/runtime/Timestamp.h>
#include <xdc/runtime/Types.h>

void microsecond_delay(void)
{
    Types_FreqHz freq;
    UInt32 ticksPerUsec, startTicks, ticks;

    /* freq is timestamp counts per second */
    Timestamp_getFreq(&freq);
    ticksPerUsec = freq.lo / 1000000;
    if (ticksPerUsec == 0)
    {
        /* Wait at least one tick */
        ticksPerUsec = 1;
    }

    /* Spin until one microsecond's worth of ticks has elapsed;
       unsigned subtraction keeps the comparison valid if the
       32-bit counter wraps around */
    startTicks = ticks = Timestamp_get32();
    do {
        ticks = Timestamp_get32();
    } while ((ticks - startTicks) < ticksPerUsec);
}

I am not sure whether this will work, and I am not clear about the steps the function follows. Also, is there a way to pass a user-defined usec delay value to this function?
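For illustration, here is a rough sketch of what I have in mind, assuming the same xdc.runtime Timestamp API as above; delay_us is just a name I made up for the parameterized version:

#include <xdc/std.h>
#include <xdc/runtime/Timestamp.h>
#include <xdc/runtime/Types.h>

/* Hypothetical parameterized variant: busy-wait for "usec" microseconds */
void delay_us(UInt32 usec)
{
    Types_FreqHz freq;
    UInt32 ticksPerUsec, delayTicks, startTicks;

    /* freq is timestamp counts per second */
    Timestamp_getFreq(&freq);
    ticksPerUsec = freq.lo / 1000000;
    if (ticksPerUsec == 0)
    {
        /* Wait at least one tick per microsecond */
        ticksPerUsec = 1;
    }
    delayTicks = usec * ticksPerUsec;

    /* Spin until the requested number of ticks has elapsed;
       unsigned subtraction stays valid across counter wraparound */
    startTicks = Timestamp_get32();
    while ((Timestamp_get32() - startTicks) < delayTicks)
    {
        /* busy-wait */
    }
}

With that, delay_us(30) would hopefully give me the 30 usec delay I need. Is this the right approach?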

Looking forward to your response.

Thank you,

Arshiya Tabassum