
PRD period problem

I have created a PRD_Obj that I want to run every 10 ms, so I set the period to 10.  But I only get called every ~69 ms.  So I set the period to 1: I still get called every ~69 ms.  Then I set the period to 100... and I get called every ~69 ms.  There seems to be a pattern here.

I checked the Timer1 registers - they appear to be programmed correctly for a 1 ms interrupt.

I checked the PRD_Obj - the period is set to 10, 1, and 100 respectively.

There is literally no other code running at this time; I set various breakpoints to verify that.

Any ideas about what may be configured wrong?

  • Hi Kurt,

    What do you have configured for the DSP Speed In MHz (CLKOUT) in the BIOS config? Here is the help on this parameter:

    DSP Speed In MHz (CLKOUT). This number, times 1000000, is the number of instructions the processor can execute in 1 second. You should set this property to match the actual rate. This property does not change the rate of the board. This value is used by the CLK manager to calculate register settings for the on-device timers.

    Tconf Name: CLKOUT Type: Int16

    Example: bios.GBL.CLKOUT = 133.0000;
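
    As a rough illustration (this is not the actual CLK manager code, and any device-specific prescaler is ignored), the timer period register ends up holding roughly CLKOUT * microseconds/int counts per interrupt, which is why a wrong CLKOUT directly scales the real tick length:

    ```c
    #include <assert.h>
    #include <stdio.h>

    /* Hypothetical sketch of the CLK manager's arithmetic: with GBL.CLKOUT
     * in MHz and the CLK "microseconds/int" setting, the timer period
     * register gets about CLKOUT * usec counts per interrupt. */
    static unsigned long timer_counts(double clkout_mhz, double usec_per_int)
    {
        return (unsigned long)(clkout_mhz * usec_per_int);
    }

    int main(void)
    {
        /* The help example above: 133 MHz CPU, 1 ms (1000 us) per interrupt. */
        unsigned long counts = timer_counts(133.0, 1000.0);
        printf("timer period register ~= %lu counts\n", counts);
        assert(counts == 133000UL);
        return 0;
    }
    ```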

    Which device are you using?

    Todd

    Yes, that was all set automatically.  My value is 300.0000, with 1000.0000 microseconds/int.  The setup is good; the problem is in execution.

    C6748, CCS 4.2.3, DSP/BIOS 5.41.3.17

    And apparently the hardware guy has changed the FPGA.  I was not looking at the correct GPIO output (the one we use to signal the interrupt) on a 'scope.  When I look at the right signal, I am getting interrupts every 250 ms regardless of the period.

    More info: I set the period to 20.  Initially I get a burst of interrupts every ~12.5 ms; then it changes to a steady 250 ms per interrupt.  It sounds like my code is changing something, but the initial 12.5 ms burst is confusing.

    In addition, CLK_getltime() insists that only 20 ticks have occurred between interrupts.  I rebuilt in Release mode just in case, but got the same result.

  • Kurt,

    Can you confirm you are running the DSP at 300 MHz?  The easiest way to do this is to load any program and run it.  Halt it and look at the TSCH and TSCL registers in the Register View.  TSCH is the top 32 bits and TSCL is the bottom 32 bits of a 64-bit timestamp.  For your device, this timestamp has a 1-to-1 correlation with the CPU speed.

    Run the target again for 10 seconds.  Halt the target and look at TSCH & TSCL again.  The difference between the second TSCH/L value and the first should be ~300,000,000 * 10, i.e. about 3,000,000,000.
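
    A small host-runnable sketch of that arithmetic (the register values here are made up for illustration): glue TSCH:TSCL into one 64-bit count and divide the delta by the elapsed wall time to estimate the CPU clock:

    ```c
    #include <assert.h>
    #include <stdint.h>

    /* TSCH holds the top 32 bits and TSCL the bottom 32 bits of the
     * 64-bit timestamp, so combine them with a shift and an OR. */
    static uint64_t tsc64(uint32_t tsch, uint32_t tscl)
    {
        return ((uint64_t)tsch << 32) | tscl;
    }

    int main(void)
    {
        uint64_t first  = tsc64(0x00000000u, 0x10000000u);
        uint64_t second = first + 3000000000ULL;     /* ~10 s at 300 MHz */
        double hz = (double)(second - first) / 10.0; /* halted ~10 s apart */
        assert(hz == 300000000.0);
        return 0;
    }
    ```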

    Todd

    -Now- the hardware guy tells me that the clock input is really 24 MHz.  I had ignored that part, thinking the default value of 300 was correct.  It is not...

    When I enter 24.000, everything happens at the right time.  Oops.
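
    For the record, a quick sketch of why the mismatch produced exactly these symptoms (assuming the timer is programmed purely from CLKOUT): with CLKOUT = 300 but a real 24 MHz clock, every tick is 300/24 = 12.5x too long, so a nominal 1 ms tick becomes 12.5 ms and a period-20 PRD fires every 250 ms.  It also explains why CLK_getltime() still reported exactly 20 ticks between interrupts: the ticks themselves were stretched.

    ```c
    #include <assert.h>
    #include <stdio.h>

    /* The timer is loaded with CLKOUT counts per microsecond, so if the
     * real clock is slower by a factor k, every CLK tick (and therefore
     * every PRD period) is k times longer than configured. */
    int main(void)
    {
        double configured_mhz = 300.0;
        double actual_mhz = 24.0;
        double k = configured_mhz / actual_mhz;  /* 12.5x too slow */
        double tick_ms = 1.0 * k;                /* 1 ms tick -> 12.5 ms */
        double prd20_ms = 20.0 * tick_ms;        /* period 20 -> 250 ms */
        assert(k == 12.5);
        assert(prd20_ms == 250.0);
        printf("tick = %.1f ms, PRD(20) fires every %.0f ms\n",
               tick_ms, prd20_ms);
        return 0;
    }
    ```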