Other Parts Discussed in Thread: AWR1642, SYSBIOS
Tool/software: TI-RTOS
I am trying to use the 200 MHz clock of the master subsystem's microcontroller to perform a timing function. I have flashed the debug image onto my AWR1642 device, compiled and built the binaries in CCS, loaded the binaries in CCS, and run the application in CCS to visualize the output. I read through the TI SYSBIOS documentation and found the following function definitions, which I thought would let me achieve 5 ns timing resolution (since the period of a 200 MHz clock is 5 ns, I should be able to count time in 5 ns intervals):
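(The snippet below is a minimal sketch of what I am doing, not my exact code: the placeholder functionUnderTest() stands in for the CLI_write call I mention next, and the variable names are my own. Clock_getTicks() and Clock_tickPeriod come from the SYSBIOS Clock module in ti/sysbios/knl/Clock.h.)

#include <xdc/std.h>
#include <xdc/runtime/System.h>
#include <ti/sysbios/knl/Clock.h>

/* Placeholder for the code being timed; in my application this is
 * where CLI_write is called. */
static void functionUnderTest(void)
{
    /* ... work to be timed ... */
}

void measureElapsed(void)
{
    UInt32 delayAmount;

    delayAmount = Clock_getTicks();   /* Clock ticks since startup */
    functionUnderTest();

    /* First print: ticks elapsed since the Clock_getTicks() call above */
    System_printf("elapsed ticks: %u\n",
                  (unsigned)(Clock_getTicks() - delayAmount));

    /* Second print: the configured tick period */
    System_printf("tick period: %u\n", (unsigned)Clock_tickPeriod);
}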
Basically, I want to substitute my own function for "CLI_write" and time how long it takes (in nanoseconds), but I was confused by the output of this code. The first print statement returned "2", which led me to believe that the time elapsed between "delayAmount = Clock_getTicks();" and the print statement was 2 * 5 ns = 10 ns, since each tick of the 200 MHz clock should correspond to 5 ns. However, when I printed the tick period, I received the value "100000", which (given that Clock_tickPeriod is expressed in microseconds) implies a tick period of 0.1 seconds. How can I make sure the tick period is the lowest possible value (5 ns)?
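For reference, here is the arithmetic behind my confusion, written out under the assumption that the tick period is in microseconds (ticksToNs is just a hypothetical helper for illustration):

#include <stdint.h>

/* Convert a Clock tick count to nanoseconds, assuming the tick period
 * is given in microseconds (the SYSBIOS convention). */
static uint64_t ticksToNs(uint32_t ticks, uint32_t tickPeriodUs)
{
    /* Widen before multiplying so the intermediate result cannot
     * overflow 32 bits. */
    return (uint64_t)ticks * tickPeriodUs * 1000u;
}

/* With my observed values: ticksToNs(2, 100000) = 200,000,000 ns (0.2 s),
 * nowhere near the 2 * 5 ns = 10 ns I expected from counting CPU cycles. */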