
"CPU-Frequency change"

Hi,

I'm using OMAP137 with a Linux kernel in the GPP and DSP/BIOS in the C6747. (CCS 3.3.81.11, BIOS 5.33.05, CGT 6.1.10)

I have tried to use the following code (below) in the C6747 to "change" the CPU frequency in my system, but it only changes the timer-interrupt period once, even though I call this function every second. I would like to change it more than once because my system synchronizes on the received radio messages.

The function Change_DSP_Freq is called from a function that runs after a PRD interrupt. The "change_Ok" variable indicates that everything is OK on every iteration, but nothing happens.

What am I missing? Any suggestions?

#include <std.h>
#include <c64.h>
#include <hwi.h>
#include <gbl.h>
#include <log.h>
#include <clk.h>
#include <tsk.h>
#include <prd.h>

void Change_DSP_Freq(Uint32 cpuFreqInKhz)
{
    Uint8  change_Ok;
    Uint32 oldmask;

    oldmask = HWI_disable();

    GBL_setFrequency(cpuFreqInKhz * 0.9);  // Decrease the frequency by 10%...

    CLK_stop();
    change_Ok = CLK_reconfig();            // Recompute timer period/prescalar
    CLK_start();

    HWI_restore(oldmask);

    LOG_printf(&Trace, "status: %d", change_Ok);
}

  • Hi Marcus,

    Please see page 174 of the document: TMS320C6000 DSP/BIOS 5.32 Application Programming Interface (API) Reference Guide

    "This function sets the value of the CPU frequency known to DSP/BIOS."

    So this function just passes to DSP/BIOS the information about the frequency that was set at the PLL. It does not actually change it.

    For the PLL settings, please see section 6.6 of the OMAP-L137 datasheet.

    The clock that goes to the DSP is SYSCLK1, but some other clocks that go to peripherals are also derived from SYSCLK1; see page 67 of the datasheet. So basically, you would need to change the PLL registers, and that would alter the speed of many peripherals, not only the DSP CPU. Also, the ARM CPU speed should be the same as the DSP CPU frequency: if you alter one, you would need to alter the other. Even if it is possible to do that, I would not recommend doing it at run-time.

    Please also look at chapter 7 of the document OMAP-L137 Applications Processor System Reference Guide. Section 7.2.2 has the steps for changing the PLL dividers.
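    The divider arithmetic behind those PLL steps can be sketched with plain host-side C. This is only an illustration: the x25 multiplier and /2 divider are assumed example values that happen to produce 300 MHz from the EVM's 24 MHz CLKIN; the actual register fields (PLLM, POSTDIV, PLLDIVn) and the required write sequence must be taken from section 7.2.2 of the reference guide.

    ```c
    /* Sketch only (host-runnable, no hardware access): the divider
     * arithmetic behind the PLL programming steps. The multiplier and
     * divider below are illustrative assumptions, not register reads. */
    #include <assert.h>
    #include <stdio.h>

    static unsigned long pll_out_hz(unsigned long clkin_hz,
                                    unsigned mult, unsigned div)
    {
        return clkin_hz * mult / div;
    }

    int main(void)
    {
        /* 24 MHz CLKIN * 25 / 2 = 300 MHz */
        printf("SYSCLK1 = %lu Hz\n", pll_out_hz(24000000UL, 25, 2));
        return 0;
    }
    ```

    Note that every peripheral clock derived from SYSCLK1 scales with the same factor, which is why changing the dividers at run-time affects far more than the DSP core.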

     

     

  • Hi,

    I know that the function doesn't change the actual CPU frequency, but it seems that the value is used to recompute the timer interrupts configured in DSP/BIOS. In my simple example I have a timer interrupt of 1 ms, and if I change the frequency according to the code above, the timer interrupt becomes 0.9 ms after the function call. This only works once; otherwise it would have changed to 0.81 ms in the next iteration.

    On page 2-67 (CLK_reconfig) in TMS320C6000 DSP/BIOS 5.32 Application Programming Interface (API) Reference Guide the following text is stated:

    ....It computes values for the timer period and the prescalar registers using the new CPU frequency. The new values for the period and prescalar registers ensure that the CLK interrupt runs at the statically configured interval in microseconds.

    My goal with this function call was to adjust all my timer interrupts to synchronize with the received messages, i.e. I will "fool" the system into thinking that 1.00001 ms is actually 1.00000 ms.

    What is the purpose of this API function otherwise?

  • This API function is there to tell DSP/BIOS the CPU frequency in case you change it using the PLL modifications I mentioned, so that it can adjust the timer configuration to match the new frequency. If the value you pass does not match the actual CPU frequency, you are going to get inconsistent results.

    Marcus said:
    My goal with this function call was to change all my timer interrupts to synchronize with the received messages, i.e I will "fool" the system to think that 1.00001ms is actually 1.00000ms.

    Not sure if I understand what you mean. What is the relationship between the messages you are receiving and the timer interrupts? Why do you need them to be synchronized?

  • My system time is based on the time between the received messages, and they vary slightly (typically ~50 us per second), but still enough to lose synchronization after a while. Lost synchronization equals lost connection to the radio system.

    My timer interrupts are defined in DSP/BIOS and are fixed. I need them to be "flexible" to keep track of the time difference of the received messages. If I knew the time difference in advance I could have changed the DSP/BIOS timer configuration, but I don't, so that's why I need them to be flexible. The frequency change will be approximately up to 0.01%.
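    For reference, the size of this correction can be sanity-checked with plain host-side arithmetic. This is only a sketch: adjusted_freq_khz is an illustrative helper (not a BIOS API), and it assumes the behaviour reported earlier in the thread, where the tick period scales with the frequency reported to GBL_setFrequency (reporting 90% of the frequency gave a 0.9 ms tick).

    ```c
    /* Sketch: the scale-factor arithmetic behind "fooling" DSP/BIOS via
     * GBL_setFrequency(). Pure host math, no BIOS calls. */
    #include <assert.h>
    #include <stdio.h>

    /* Frequency (kHz) to report so the tick period scales by the measured
     * drift ratio. ~50 us/s drift means messages arrive every ~1000.05 us
     * instead of every 1000.00 us. Assumes tick period scales with the
     * reported frequency, as observed in this thread. */
    static unsigned long adjusted_freq_khz(unsigned long nominal_khz,
                                           double measured_period_us,
                                           double nominal_period_us)
    {
        double scale = measured_period_us / nominal_period_us;
        return (unsigned long)(nominal_khz * scale + 0.5);  /* round */
    }

    int main(void)
    {
        /* 300000 kHz nominal, messages 50 us/s slow -> report 300015 kHz */
        printf("%lu kHz\n", adjusted_freq_khz(300000UL, 1000.05, 1000.0));
        return 0;
    }
    ```

    The correction is only 15 kHz out of 300 MHz, i.e. 0.005%, well inside the ~0.01% range Marcus mentions.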

  • I see that you are passing the argument cpuFreqInKhz from the calling function and then multiplying it by .9, but you are not returning the updated value. How, then, does the calling function know that the new BIOS frequency is 90% of the original frequency?

    As Mariana mentioned, this is not exactly a 'normal' procedure, but it should still be doable. Because the DSP's frequency will remain static, you are only tweaking how often the system 'tick' occurs (by default it is set to 1 ms, so you would be changing this to something like 0.99 ms per 'tick').

  • Sorry! I was not clear enough in my earlier post, but I do cpuFreqInKhz = GBL_getFrequency() before I call Change_DSP_Freq in the next iteration, so it should be correct.

    I guess it is not a standard procedure, but it simplifies my system synchronization a lot. I thought that perhaps something in my "Clock Manager Properties" in BIOS is blocking repeated frequency changes?

    I use following settings:

    Object Memory: IRAM

    Timer Selection: Timer 1

    Use High Resolution Timer for Internal timings: Yes

    Specify Input Clock Rate: Yes: 300MHz

    Reset Timer and TIMEMODE: Yes (32 bit unchained)

    Directly configure on chip timer register: Yes PRD-Register=1000

    It's a simple test to perform if you want to try it and see if it works for you [:D]

  • I went ahead and tried this out using the code you posted. I read the current frequency value in a task and then called into your function. With this method I am able to see the value of cpuFreqInKhz change by 10% on every pass. Below is the task I used to call into your function:

    void taskZero()
    {
        Uint32 cpuFreqInKhz;   // GBL_getFrequency() returns a Uint32

        while (1) {
            cpuFreqInKhz = GBL_getFrequency();
            Change_DSP_Freq(cpuFreqInKhz);

            TSK_sleep(5000);
        }
    }

    The first time through the while loop cpuFreqInKhz = 300000. The second pass it equals 270000, then 243000, then 218700, etc.

  • Yes, that's exactly how my example works too. But if I measure my timer interrupt with an oscilloscope (flashing LEDs) I can see that it has only changed once, even though cpuFreqInKhz changes on each iteration. That's the strange issue....

  • Marcus,

    I am not sure that the GBL_setFrequency() function fully supports the C6747. What I mean by this is that on other processors the timer module was driven by a derivative of the main system clock (SYSCLK1); however, the C6747's timer is clocked via AUXCLK, which is essentially a bypass of the PLL. This means that when the CPU operates at 300MHz the timer is still operating at CLKIN (on the EVM this is 24MHz).

    On something like the C6713 the SYSCLK1 value has a direct impact on the timer input frequency, so when this value changed in BIOS the clock value would need to change as well. Because the C6747's timer clock is not affected by changes to the PLL, changing what BIOS thinks the CPU frequency is via GBL_setFrequency does not actually seem to have any impact.

    I think the only way to go about changing this would be to modify the Timer1 PRD12 register directly, via something like this:

    CLK_stop();
    *(volatile Uint32 *)0x01C21018 *= 0.99;   // Timer1 PRD12
    CLK_start();

    *edit* This code will not work because CLK_start() reconfigures the PRD register. Let me research a bit more...

  • Marcus,

    On the C6747 EVM the timer frequency is assumed to be fixed at 24MHz, and the timer register value is computed based on this frequency. For a 1ms tick the PRD value is 24000. The tick period (CLK.MICROSECONDS) or the input frequency to the timer (CLK.INPUTCLK) can be changed in the tcf file to change the system tick value, but there is no API to change the timer input frequency or the tick period at run-time.

    Sorry to be the bearer of bad news here.
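    That PRD computation is easy to check with host-side arithmetic. A sketch, assuming (as above) a timer input that divides evenly by 1 MHz; prd_for_tick is an illustrative helper, not a BIOS API.

    ```c
    /* Sketch: how the timer PRD value follows from the fixed 24 MHz
     * AUXCLK input and the configured tick period. Host math only. */
    #include <assert.h>
    #include <stdio.h>

    /* PRD counts per tick = timer input (MHz) * tick period (us). */
    static unsigned long prd_for_tick(unsigned long timer_input_hz,
                                      unsigned long tick_us)
    {
        return (timer_input_hz / 1000000UL) * tick_us;
    }

    int main(void)
    {
        /* 24 MHz input, 1 ms tick -> PRD = 24000, as stated above. */
        printf("PRD = %lu\n", prd_for_tick(24000000UL, 1000UL));
        return 0;
    }
    ```

    This also shows why a PRD-Register value of 1000 with a 24 MHz input gives a much shorter tick than 1 ms: the PRD is counted in timer-input cycles, not CPU cycles.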

  • Hi Tim,

    Thanks a lot for the help so far[:D]

    ...fixed at 24MHz... OK! That's probably the explanation for another of my problems. My timer interrupt was not affected whether I set the Specify Input Clock Rate to 300MHz or to 24MHz; the only thing that mattered was the PRD register value (Directly configure on chip timer register: Yes, PRD-Register=1000). So when I did a simple calculation of the CPU frequency actually used for my timer interrupts, it came out as 24MHz and I didn't know why. Thanks for clarifying that issue.

    Regarding "This means that when the CPU operates at 300MHz the Timer is still operating at CLKIN (on the EVM this is 24MHz)": if it is fixed to 24MHz on the EVM board, what will it be on the real CPU that we will use in our hardware (24MHz or 300MHz)? Do I need to change my PRD settings?

    Just for clarification: the API GBL_setFrequency cannot be used with the C6747, correct?

  • Marcus said:
    My timer interrupt was not affected whether I set the Specify Input Clock Rate to 300MHz or to 24MHz; the only thing that mattered was the PRD register value (Directly configure on chip timer register: Yes, PRD-Register=1000).

    I was under the impression that you should be able to modify the 'Specify Input Clock Rate' to change the PRD configuration. Judging by the contents of the configuration file if I manually specify 25MHz in this field then the PRD value changes to 25000. Are you seeing different behavior?

    Marcus said:
    So if it is fixed to 24MHz on the EVM-board, what will it be on the real CPU that we will use in our hardware (24MHz or 300MHz)? Do I need to change my PRD-settings?
    It's fixed to 24MHz in the EVM seed file because the EVM uses a 24MHz input clock. If your custom board uses a different clock - say 30MHz - then the Input Clock Rate should be changed to 30.000.

    Marcus said:
    Just for the clarification: The API: GBL_setFrequency can not be used together with the DSP-6747, correct?
    Correct. While the function will still execute and modify what BIOS knows as the DSP clock, this does not actually affect the timer clock speed.

  • Sorry! My mistake; I do see that behaviour.

    Thanks again for your superb help.