Hi,
My DSP application needs to process an interrupt on a GPIO pin at a 10 MHz rate.
I am using Code Composer Studio for the project, and with the help of the debugger I am downloading the bare-metal DSP application onto the C66x. The AM5728 EVM is halted in u-boot. The GEL scripts configure the DSP at 600 MHz. I am stuck at a point where I cannot process interrupts at more than 10 kHz when using the debugger.
The setup details are:
With the help of an external waveform generator I am feeding a square wave (50% duty cycle) to GPIO port 3, pin 11. This pin is configured as an input with an interrupt on the rising edge. The ISR contains only four calls:
GPIO_disableInt(interrupt_pin);  /* mask further edges while servicing */
GPIO_clearInt(interrupt_pin);    /* acknowledge the pending interrupt */
GPIO_toggle(debug_pin);          /* This is used for debug purposes */
GPIO_enableInt(interrupt_pin);   /* re-arm the interrupt */
If I give an external square wave at 10 kHz, the program runs fine: it executes the ISR and returns to the main function for further processing. If I increase the square wave frequency beyond 20 kHz, the program stays in the ISR forever. Before it finishes servicing one request it receives another rising edge, so as soon as it re-enables the interrupt it re-enters the ISR, and execution never gets back to main for processing.
I do not understand why executing the ISR takes so long when the DSP is running at 600 MHz. How can I confirm the DSP operating frequency? Is there a way to check how many CPU cycles each instruction takes?
Thank you.
Regards
Indra
The DSP contains the TSCL and TSCH time-stamp counter registers, which you can read to measure the time taken by each instruction. Since this is your own bare-metal code, there are several issues that could be causing this behavior.
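For example, here is a minimal measurement sketch (this assumes the TI compiler's c6x.h header, which exposes TSCL/TSCH as control registers on the C66x):

#include <c6x.h>      /* declares the TSCL/TSCH control registers */
#include <stdint.h>
#include <stdio.h>

/* Read the free-running 64-bit time-stamp counter.
 * Reading TSCL first latches a coherent snapshot of TSCH. */
static inline uint64_t read_tsc(void)
{
    uint32_t lo = TSCL;
    uint32_t hi = TSCH;
    return ((uint64_t)hi << 32) | lo;
}

int main(void)
{
    TSCL = 0;   /* any write starts the counter (it powers up halted) */

    uint64_t t0 = read_tsc();
    /* ... code under test, e.g. the body of your ISR ... */
    uint64_t t1 = read_tsc();

    printf("elapsed cycles: %llu\n", (unsigned long long)(t1 - t0));
    return 0;
}

Because TSCL/TSCH count CPU cycles, you can also confirm the operating frequency by comparing the counter delta against a known external time interval, for example one period of your waveform generator.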
Regards,
Rahul
The problem is not completely resolved at this point.
The customer confirmed the DSP is operating at 600 MHz.
The main objective is to process two 12-bit signals from an ADC sampling at 10 MSPS. Can the DSP do this?
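For a rough sense of the cycle budget, here is a back-of-envelope sketch using only the numbers quoted in this thread; nothing in it is specific to the customer's code:

#include <stdio.h>

int main(void)
{
    const double cpu_hz    = 600e6;  /* DSP clock, confirmed above */
    const double sample_hz = 10e6;   /* ADC rate: 10 MSPS          */
    printf("CPU cycles per sample: %.0f\n", cpu_hz / sample_hz);  /* prints 60 */
    return 0;
}

Sixty cycles per sample is on the order of the C66x interrupt entry/exit overhead alone, so taking one interrupt per sample cannot keep up; data at this rate is normally moved in blocks (for example with EDMA) rather than sample by sample.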
In response to the original questions (directly from the customer):
1. You should try this experiment with standalone DSP code, without u-boot or Linux, and then move to the multi-core/multi-OS environment.
Currently I am flashing my DSP core 1 using the XDS200 USB emulator. The process is as follows:
=> Create a target configuration in Code Composer Studio and launch it to make a connection.
=> Connect the A15 core 0 first and enable the clock for DSP1, then disconnect the A15 core, connect the DSP1 core, and load my application.out file.
=> Then run my application and check the response on a scope.
Note: To execute the above steps I need to stop the target device in u-boot before Linux starts on the EVM. In Linux mode I am using remoteproc for this activity.
Is there another approach to run standalone DSP code without u-boot/Linux? If so, please let me know.
2. Is your code executing from DSP L2 memory, OCMC, or DDR? Is the DSP L1/L2 cache enabled?
While connecting the cores, GEL scripts execute in the background and print logs on the Code Composer Studio console. The logs show the DSP core is using L2 memory.
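To answer the cache part of this question from the DSP side, the sketch below reads the C66x CorePac cache configuration registers. The addresses are the generic C66x CGEM values (L2CFG, L1PCFG, L1DCFG), so please verify them against the AM572x TRM before relying on them:

#include <stdint.h>
#include <stdio.h>

/* C66x CorePac cache configuration registers (verify against the AM572x TRM) */
#define L2CFG   (*(volatile uint32_t *)0x01840000u)
#define L1PCFG  (*(volatile uint32_t *)0x01840020u)
#define L1DCFG  (*(volatile uint32_t *)0x01840040u)

int main(void)
{
    /* In each register, bits 2:0 select the cache mode:
     * 0 = all addressable SRAM (cache disabled),
     * non-zero = cache of increasing size. */
    printf("L2 mode:  %u\n", (unsigned)(L2CFG  & 0x7u));
    printf("L1P mode: %u\n", (unsigned)(L1PCFG & 0x7u));
    printf("L1D mode: %u\n", (unsigned)(L1DCFG & 0x7u));
    return 0;
}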
3. Is there any power management driver enabled in u-boot/Linux that may be putting the DSP into a lower power state?
I am not sure about this particular point, but I cross-checked that under u-boot it is in the normal power state. In Linux, can you tell me where the power configuration is located inside the dts file?
Thank you!