EVMOMAP-L137 flash boot problem

Other Parts Discussed in Thread: OMAP-L137, CCSTUDIO

Dear all,

I'm trying to boot the EVMOMAP-L137 from the SPI flash memory.

I have followed the steps here: processors.wiki.ti.com/.../Boot_Images_for_OMAP-L137

There is a sample code that implements a blinker: the DSP enables the ARM, then the ARM turns some LEDs on the board on and off.

This sample code works fine. I am able to download it to the board (I use the HexAIS tool to generate the .bin file, then sfh_OMAP-L137 to download the flash image). After changing the SW2 position and powering on the board, the OMAP-L137 boots correctly and I see the LEDs blinking.

Now my problem: I added a few lines to the DSP code in order to control the UART. The code is as follows:

#include "device.h"
#include "evmomapl137_uart.h"

#define KICK0Ra             *(unsigned int*)(SYS_BASE + 0x038)
#define KICK1Ra             *(unsigned int*)(SYS_BASE + 0x03c)

void main(void)
{
    UART_Handle uart0;
    // Open Permissions to SYSCFG Registers (Not required for PG2.0 silicon and above)
    KICK0Ra = 0x83e70b13;
    KICK1Ra = 0x95A4F1E0;

    DEVICE_enable_ARM();

    uart0 = EVMOMAPL137_UART_open( 2, 1248 ); // This works for baud=9600: baud*24000000*13/(16*150000000)

    while(1){
        while( EVMOMAPL137_UART_xmtReady( uart0 ) );  // Wait for uart_tx ready
        EVMOMAPL137_UART_putChar( uart0, 'A' );    // Write 1 byte
    }
}

When calling EVMOMAPL137_UART_open I pass 1248 as the parameter instead of 9600, following the discussions here:
             e2e.ti.com/.../964737
             e2e.ti.com/.../103027
             e2e.ti.com/.../412104
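
For reference, the 1248 follows directly from the scaling in the code comment above: 9600 × 24,000,000 × 13 / (16 × 150,000,000) = 1248. As I understand those threads, out of reset the UART is not clocked at the 150 MHz the BSL assumes, so the value passed as "baud" has to be pre-scaled like this to actually get 9600 baud on the wire.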

So, I just enable the ARM, then I continuously send 'A' through the serial port. When I launch the project from CCS 3.3 everything works fine: I see the LEDs blinking and I continuously receive 'A' on my PC.

However, when the code is downloaded to the SPI flash and the OMAP-L137 boots from it, I can see the LEDs blinking, but I don't receive anything on my PC (nothing comes out of the UART; I checked with an oscilloscope).

This is the DSP linker.cmd file (I have just added the path to evmomapl137bsl.lib with respect to the original file):

-lrts64plus.lib
-l C:\CCStudio_v3.3\boards\evmomapl137_v1\dsp\lib\evmomapl137bsl.lib

-stack          0x00001000 /* Stack Size */ 

MEMORY
{
    L2RAM        org=0x80010000 len=0x00010000 /* L2 RAM/Cache */
}

SECTIONS
{
    .text       > L2RAM
    .const      > L2RAM
    .bss        > L2RAM
    .far        > L2RAM
    .switch        > L2RAM
    .stack      > L2RAM
    .data       > L2RAM
    .cinit        > L2RAM
    .sysmem        > L2RAM
    .cio        > L2RAM
}

This is the ARM linker.cmd file:

/*
 *  Linker command file
 *
 */
-stack           0x00000800      /* Stack Size */
-heap            0x00000800      /* Heap Size */


MEMORY
{
    ARMRAM:      o = 0xFFFF0000  l = 0x00002000
    DSPRAM:      o = 0x11800000  l = 0x00040000
    SHAREDRAM:   o = 0x80000000  l = 0x00020000
    SDRAM:       o = 0xC0000000  l = 0x20000000
}

SECTIONS
{
    .bss        >   SHAREDRAM
    .cinit      >   SHAREDRAM
    .cio        >   SHAREDRAM
    .const      >   SHAREDRAM
    .stack      >   SHAREDRAM
    .sysmem     >   SHAREDRAM
    .text       >   SHAREDRAM
    .switch     >   SHAREDRAM
    .far        >   SHAREDRAM
    .test_buf    >    SDRAM
}

Does anybody have any idea why the code doesn't work when booted from the flash?

  • Hi,

    So, I just enable the ARM, then I continuously send 'A' through the serial port. When I launch the project from CCS 3.3 everything works fine: I see the LEDs blinking and I continuously receive 'A' on my PC.

    However, when the code is downloaded to the SPI flash and the OMAP-L137 boots from it, I can see the LEDs blinking, but I don't receive anything on my PC (nothing comes out of the UART; I checked with an oscilloscope).

    If the LEDs blink after flashing, and both the UART and the LEDs work in "debug" mode (CCS), then it seems the UART module is left in the "disabled" state, so you have to enable the UART module (PSC1 module 13) either in code or during the HEX conversion.
    BTW, there are no issues with your linker command files.

    You have to enable PSC1 module '13' if you are using UART2.
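
    In case it helps, here is a minimal sketch of that enable sequence done with direct register writes (PSC1 on the OMAP-L137 is at 0x01E27000, so MDSTAT13 is at 0x01E27834 and MDCTL13 at 0x01E27A34; the function name is only for illustration):

    #define PSC1_PTCMD     (*(volatile unsigned int *)0x01E27120)   /* transition command   */
    #define PSC1_PTSTAT    (*(volatile unsigned int *)0x01E27128)   /* transition status    */
    #define PSC1_MDSTAT13  (*(volatile unsigned int *)0x01E27834)   /* UART2 module status  */
    #define PSC1_MDCTL13   (*(volatile unsigned int *)0x01E27A34)   /* UART2 module control */

    static void enable_uart2_psc(void)
    {
        while (PSC1_PTSTAT & 0x1);                      /* wait for previous transitions */
        PSC1_MDCTL13 = (PSC1_MDCTL13 & ~0x1Fu) | 0x3u;  /* NEXT state = ENABLE */
        PSC1_PTCMD = 0x1;                               /* GO0: start the transition */
        while (PSC1_PTSTAT & 0x1);                      /* wait for it to complete */
        while ((PSC1_MDSTAT13 & 0x3F) != 0x3);          /* MDSTAT13.STATE = ENABLE */
    }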
  • Hi Titus,

    If I understood correctly, I have to enable a domain in Power and Sleep Controller 1? By PSC1 '13', do you mean register PDCFG1?

    In the GEL file I found the function Setup_Psc_All_On( ), which sets up all the power domains. I have adapted it to run from the DSP main(), but no success so far with the UART.

    Thanks,

    Oscar
  • Hi Oscar,
    Yes, your understanding is correct.
    1) Try to debug the code: disable the "Setup_Psc_All_On( )" call in the GEL file, call "Setup_Psc_All_On( )" from your project code instead, and check through the CCS register or memory window whether all the PSC modules get enabled.

    Call "Setup_Psc_All_On( )" as the first line of "main()".

    2) Check the MDSTAT13.STATE bits in code to see whether the module really gets enabled, and blink an LED if PSC1 module 13 is in the enabled state.

    3) You can also use the following code:

    static void enable_module_clocks (void)
    {
        modulesEnabled = FALSE;

        // Ensure previous initiated transitions have finished
        if (check_psc_transition(CSL_PSC_1) == pscTimeout)
            return;

        // Enable peripherals; Initiate transition
        CSL_FINST(psc1Regs->MDCTL[CSL_PSC_UART2], PSC_MDCTL_NEXT, ENABLE);
        CSL_FINST(psc1Regs->PTCMD, PSC_PTCMD_GO0, SET);

        // Ensure previous initiated transitions have finished
        if (check_psc_transition(CSL_PSC_1) == pscTimeout)
            return;

        // Ensure modules enabled
        if (check_psc_MDSTAT(CSL_PSC_1, CSL_PSC_UART2, CSL_PSC_MDSTAT_STATE_ENABLE)
                == pscTimeout)
            return;

        modulesEnabled = TRUE;
    } /* enable_module_clocks */
  • Hi Titus,

    Thanks for your help. I was able to solve the problem. It was coming from PSC1 module 13 and from a PINMUX register.

    This is the code:

    #define MDSTAT13 *(unsigned int*)(0x01E27834)

    void main(void)
    {
        UART_Handle uart0;

        // Open permissions to SYSCFG registers (not required for PG2.0 silicon and above)
        KICK0Ra = 0x83e70b13; // Kick0 register + data (unlock)
        KICK1Ra = 0x95a4f1e0; // Kick1 register + data (unlock)

        init_DSP();
        OnTargetConnect( );

        if ((MDSTAT13 & 0x3F) == 3)   // proceed only once PSC1 module 13 reports the ENABLE state
            DEVICE_enable_ARM();

        uart0 = EVMOMAPL137_UART_open( 2, 1536 ); // This works for baud=9600: baud*24000000/(150000000)

        while (1) {
            while ( EVMOMAPL137_UART_xmtReady( uart0 ) );   // Wait for uart_tx ready
            EVMOMAPL137_UART_putChar( uart0, 'A' );         // Write 1 byte
        }
    }
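
    For reference, the 1536 passed to EVMOMAPL137_UART_open follows the scaling in the comment above: 9600 × 24,000,000 / 150,000,000 = 1536.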

    In init_DSP() I initialize PINMUX.

    I took the function OnTargetConnect( ) from the GEL file. I commented out Setup_System_Config( ) and Enable_ARM() as follows:

    void OnTargetConnect( )
    {
        //GEL_TextOut( "\nomap-l137 DSP Startup Sequence\n\n" );

        // Setup_System_Config( );   // Setup Pin Mux and other system module registers
        Setup_PLL();                 // Setup PLL0 (300 MHz ARM, 300 MHz DSP, 133 MHz EMIFs)

        Setup_Psc_All_On( );         // Setup All Power Domains

        Setup_EMIFA();               // Async EMIF
        Setup_EMIFB();               // Setup SDRAM

        // Enable_ARM();

        //GEL_TextOut( "\nStartup Complete.\n\n" );
    }

    Thanks,

    Oscar
  • Hi Oscar,
    Sounds good.
    Thanks for sharing your code; it could help other community members as well.
    Once again, thanks for the update.