CCS/TMDSEMU200-U: Length of trace buffer

Part Number: TMDSEMU200-U
Other Parts Discussed in Thread: TM4C1290NCPDT

Tool/software: Code Composer Studio

Hi, 

I'm using the XDS200 debugger with the TM4C1290NCPDT. I have started statistical profiling with a sampling interval of 1024 cycles and a 120 MHz clock. I'm not sure what the best sampling interval to use is.

I am debugging an external reset issue (which triggers after 1.6 seconds of inactivity in the watchdog toggle pulse). What is the length of this SWO trace buffer? Will it save instructions for the past 3 seconds or so?

If this buffer is much shorter, I have external hardware to trigger an ISR as well.
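
For context, the ISR I have in mind is roughly the sketch below (TivaWare, with a made-up pin for the external trigger); the idea is to drop a marker byte into the SWO stream and then halt so the most recent trace window is preserved:

    #include <stdint.h>
    #include <stdbool.h>
    #include "inc/hw_memmap.h"
    #include "driverlib/gpio.h"

    /* Hypothetical wiring: the external watchdog-monitor output on PB2. */
    #define TRIG_PORT   GPIO_PORTB_BASE
    #define TRIG_PIN    GPIO_PIN_2

    /* ITM stimulus port 0 (standard Cortex-M4 address). Reading bit 0 says
       whether the FIFO can take another byte. Assumes the debugger has
       already enabled ITM/SWO for the trace session. */
    #define ITM_STIM0   (*(volatile uint32_t *)0xE0000000u)

    void WatchdogFaultISR(void)                /* hooked to the trigger pin's edge interrupt */
    {
        GPIOIntClear(TRIG_PORT, TRIG_PIN);     /* acknowledge the GPIO interrupt */

        /* Drop a marker into the SWO stream so the external event can be
           lined up against the profiling samples; bounded wait in case
           the ITM is not enabled. */
        for (uint32_t i = 0u; (ITM_STIM0 & 1u) == 0u && i < 10000u; i++) { }
        ITM_STIM0 = (uint32_t)'!';

        while (1) { }                          /* freeze here so the last trace window is kept */
    }

(The handler would be registered for the trigger pin's edge interrupt in the usual TivaWare way; the pin and the handler name are only placeholders.)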

Thanks,

Priya

  • Priya,

    The SWO Trace buffer is very small, and the sampling interval of 1024 is, in fact, the number of cycles between two data captures. Everything in between is "blind time". If the sampling interval is set too large, many events will be missed, while setting it too low causes buffer overflows or data communication errors (the rough arithmetic at the end of this post illustrates why).

    Normally the tool already calculates the rates based on the clock frequency of the device, but by clicking on "Advanced Settings" you can override these settings.

    Section 6 of the SWO Trace page below has additional details about the calculation and potential pitfalls when setting the interval rates, prescaler, etc.

    http://processors.wiki.ti.com/index.php/SWO_Trace 

    Hope this helps,

    Rafael
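
    The rough arithmetic I mentioned, for your 120 MHz / 1024-cycle case (illustrative numbers only; the actual packet sizes and SWO baud rate depend on the timestamp settings and on what the Advanced Settings report for your setup):

        #include <stdio.h>

        int main(void)
        {
            /* Back-of-the-envelope only -- adjust the assumptions to match
               the values shown in your Advanced Settings. */
            const double cpu_hz          = 120e6;  /* TM4C1290 system clock                 */
            const double interval_cycles = 1024;   /* cycles between two PC samples         */
            const double bytes_per_smp   = 6;      /* assumed ~5-byte PC packet + timestamp */
            const double swo_baud        = 6e6;    /* assumed SWO bit rate (UART mode)      */

            double samples_per_s = cpu_hz / interval_cycles;       /* ~117 k samples/s */
            double trace_bytes   = samples_per_s * bytes_per_smp;  /* ~700 kB/s        */
            double swo_bytes     = swo_baud / 10.0;                /* ~10 bits per byte in UART mode */

            printf("PC samples per second : %.0f\n", samples_per_s);
            printf("Trace data generated  : %.0f bytes/s\n", trace_bytes);
            printf("SWO pin capacity      : %.0f bytes/s\n", swo_bytes);
            return 0;
        }

    With numbers in that range the generated data is already close to what the pin can carry, which is why a smaller sampling interval tends to make overflows more likely, not less.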

  • From the SWO Trace page linked above: "1. I am seeing overflow packets in the trace data stream. How do I avoid them? The first knob to exercise is the timestamp resolution. The timestamp resolution can be set at 4 different levels: divide by 1, divide by 4, divide by 16, and divide by 64. Divide by 1 gives the finest granularity and divide by 64 the coarsest, but also fewer trace packets. To change the timestamp granularity, select the “Advanced Settings” button on the UI and select the Receiver in the settings window. Drill down to the Timestamping option and choose a higher resolution value."

    OVERFLOW! Please reduce samples generation rate or increase output clock rate.

    I am currently doing statistical profiling.

    Does this mean divide by 1 is the best timestamp setting to use? The default is 4. The error still doesn't go away when it is set to 1. I also tried changing the sampling interval from 1024 cycles down to 256, but the error continues to happen. When I generate an interrupt to look at the trace buffer, initially only a few records show up and then more get added slowly. Why is this? Which hardware analyzer tool do I use to get a timestamp of when things are happening? I have sketched at the end of this post where I think these two settings live at the register level, in case I am misunderstanding them.

    I need a little more than a link to the documentation.

    Thank you,

    Priya
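
    Here is the sketch I mentioned. It is only my reading of the Cortex-M4 DWT/ITM descriptions, and CCS programs these registers itself through the Advanced Settings, so the function and values are placeholders rather than something I actually run:

        #include <stdint.h>
        #include "core_cm4.h"  /* CMSIS (normally pulled in via the device header) */

        /* Shows which Cortex-M4 registers the two CCS knobs appear to map to:
           the PC sampling interval (DWT) and the timestamp prescaler (ITM). */
        static void swo_profile_sketch(uint32_t postpreset, uint32_t cyctap,
                                       uint32_t ts_prescale /* 0=/1, 1=/4, 2=/16, 3=/64 */)
        {
            CoreDebug->DEMCR |= CoreDebug_DEMCR_TRCENA_Msk;   /* enable DWT/ITM */

            /* PC sampling interval = (postpreset + 1) * (cyctap ? 1024 : 64) cycles */
            DWT->CTRL = (DWT->CTRL & ~(DWT_CTRL_POSTPRESET_Msk | DWT_CTRL_CYCTAP_Msk))
                      | (postpreset << DWT_CTRL_POSTPRESET_Pos)
                      | (cyctap     << DWT_CTRL_CYCTAP_Pos)
                      | DWT_CTRL_PCSAMPLENA_Msk               /* emit periodic PC samples */
                      | DWT_CTRL_CYCCNTENA_Msk;               /* keep CYCCNT running      */

            /* Local timestamp prescaler: /1 is the finest resolution, /64 the
               coarsest but generates the fewest timestamp packets on the pin. */
            ITM->TCR = (ITM->TCR & ~ITM_TCR_TSPrescale_Msk)
                     | (ts_prescale << ITM_TCR_TSPrescale_Pos);
        }

        /* e.g. swo_profile_sketch(0u, 1u, 1u);  -> 1024-cycle sampling, /4 timestamps */

    If that reading is right, 1024 cycles corresponds to the coarser tap with no postscaling, and 256 to the finer tap with a postscaler of 3.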

  • After much searching I found this:

    I am hoping this will answer some of my questions.