CC2652RB: Difference between debug mode and normal operation when trying to achieve precise 50 µs timing

Part Number: CC2652RB
Other Parts Discussed in Thread: SYSCONFIG

Hi Team,

SDK: simplelink_cc13xx_cc26xx_sdk_7_10_00_98

Hardware: CC2652RB1F with an external 32.768 kHz crystal oscillator. The board has already been manufactured, and it has no footprint for an external 48 MHz crystal (the corresponding pins are marked X in the reference design).

Software: Based on \ti\simplelink_cc13xx_cc26xx_sdk_7_10_00_98\examples\rtos\LP_CC2652RB\ble5stack\simple_peripheral_oad_onchip

IDE: CCS 12.3.0

Last time, you suggested a solution based on the Sensor Controller. Because that would require changes to the overall software design, the modifications are large and would extend the development cycle, so it is being pursued as a second version of the design.

In the meantime I have made some software optimizations, but the results are not ideal.

I found that when the board is connected to the debugger, the precisely timed waveforms are nearly perfect, with very little error.

However, when the debugger is disconnected and the board runs on its own, the timing error is very large and erratic: the nominal 50 µs interval fluctuates between roughly 10 µs and 120 µs.
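For context, below is a simplified sketch of the kind of timing I mean: a periodic 50 µs interrupt toggling a GPIO, here written with the SDK's GPTimer driver. The identifiers CONFIG_GPTIMER_0 and CONFIG_GPIO_PULSE are placeholder SysConfig names for illustration, not my actual code.

#include <ti/drivers/GPIO.h>
#include <ti/drivers/timer/GPTimerCC26XX.h>
#include "ti_drivers_config.h"   /* SysConfig-generated; names below are placeholders */

/* Timer callback: one GPIO edge every 50 us */
static void onTimeout(GPTimerCC26XX_Handle handle,
                      GPTimerCC26XX_IntMask interruptMask)
{
    GPIO_toggle(CONFIG_GPIO_PULSE);
}

void startPulse50us(void)
{
    GPTimerCC26XX_Params params;
    GPTimerCC26XX_Params_init(&params);
    params.width = GPT_CONFIG_16BIT;   /* 2400 ticks fits in 16 bits */
    params.mode  = GPT_MODE_PERIODIC;

    GPTimerCC26XX_Handle hTimer = GPTimerCC26XX_open(CONFIG_GPTIMER_0, &params);

    /* 48 MHz timer clock: 50 us = 2400 ticks. CONFIG_GPIO_PULSE is
       assumed to be configured as an output in SysConfig. */
    GPTimerCC26XX_setLoadValue(hTimer, 2400 - 1);
    GPTimerCC26XX_registerInterrupt(hTimer, onTimeout, GPT_INT_TIMEOUT);
    GPTimerCC26XX_start(hTimer);
}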

What is the difference between debug mode and normal (standalone) operation?

What configuration should I use to solve this problem?
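I suspect the difference may be that the device cannot enter standby while the debugger is attached, whereas in standalone operation it drops into standby between events and the wake-up latency disturbs the timing. If that is the case, would a power constraint like the minimal sketch below (assuming the standard Power driver API from ti/drivers/Power.h applies to my case) be the right direction, or is there a SysConfig Power-module setting that achieves the same thing?

#include <ti/drivers/Power.h>
#include <ti/drivers/power/PowerCC26XX.h>

void enterTimingCriticalWindow(void)
{
    /* Keep the device out of standby while the precise 50 us timing runs */
    Power_setConstraint(PowerCC26XX_DISALLOW_STANDBY);
}

void leaveTimingCriticalWindow(void)
{
    /* Allow standby again afterwards to save power */
    Power_releaseConstraint(PowerCC26XX_DISALLOW_STANDBY);
}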

Best Regards,

Galaxy