
Boot Basics on OMAP-L138, Part 2


Other Parts Discussed in Thread: OMAP-L138

OK, I'm making progress towards my goal of actually booting my OMAP-L138 on my custom board that includes the Logic SOM1. With help from others on figuring out how to get my code burned into the SPI flash on the SOM, I believe I have that part under control.

Now I am trying to tackle actually programming the ARM to allow the DSP to boot and run its algorithm, then put the ARM to sleep. With guidance from Christina, and the detailed steps in the OMAP-L138 System Reference Guide (SPRUGM7D–April 2010), I am generating two projects, one for the ARM and one for the DSP. I am heavily utilizing the Logic BSL code for my initialization routines, which seems to be working.

However, I am now stuck trying to implement the following step #3 outlined on page 269 of the guide:

3. Write the truncated DSP boot address vector to the DSP_ISTP_RST_VAL field in the host 1 configuration register (HOST1CFG) of the SYSCFG module. The least-significant bits of the boot address are fixed at 0.

 

How do I determine what the "truncated DSP boot address vector" is? Is it the base address of my compiled DSP code? I am compiling the DSP code to external DDR memory at 0xC0000000, so is the truncated value that address minus the lower 2 bytes (i.e., 0xC000)?

Any further guidance would be greatly appreciated.

Thx,

MikeH

 

  • Yes, this is where the DSP will start executing code when it comes out of reset. You should set this value to the address of the entry point of your program.

    The field is 22 bits wide, so if your entry point is at 0xC0000000, you would write (0xC0000000 & 0xFFFFFC00) to that register.

    Jeff

  • Jeff,

    Thanks for the guidance.

    Moving on to the next feat, AIS booting, I find the following statement in section 9 of the Using the OMAP-L1x8 Bootloader (SPRAB41C–September 2010) document:

    9 Boot Requirements, Constraints and Default Settings

    • Memory Usage: The bootloader uses 16 KB of Shared RAM starting from 0x80000000 for multiple purposes. This memory should not be used by any initialized section of the user application.

    Does this mean that none of my programs can use shared memory from 0x80000000 through 0x80004000?

    Thx,

    MikeH

     

  • The key phrase is "initialized section." Once the bootloading finishes, that memory space can be used however you want. However, if during section loading you overwrite memory the bootloader itself is using (0x80000000 - 0x80004000), it will not work.
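    One way to honor that constraint is to reserve the region in the linker command file so no initialized section can land there. A sketch, assuming a standard TI linker command file and the full 128 KB of Shared RAM (region names here are illustrative, not from the BSL):

```
MEMORY
{
    /* First 16 KB of Shared RAM: bootloader workspace.
       Do not place initialized sections here. */
    AIS_SCRATCH : org = 0x80000000, len = 0x00004000
    /* Remaining Shared RAM, free for application sections. */
    SHRAM       : org = 0x80004000, len = 0x0001C000
}
```

    After boot completes, the application may still use 0x80000000-0x80004000 for uninitialized/scratch data.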

    Jeff

  • Progress!!! Well, partial progress... I am able to boot the ARM from my SPI flash device!

    Using the Logic BSL, I have written a small test initialization program that runs on the ARM and flashes one of my LEDs.


    #include "stdio.h"
    #include "types.h"
    #include "evmomapl138.h"
    #include "evmomapl138_i2c.h"
    #include "evmomapl138_timer.h"
    #include "evmomapl138_gpio.h"

    //-----------------------------------------------------------------------------
    // Private Defines and Macros
    //-----------------------------------------------------------------------------
    #define SYS_BASE           0x01C14000 
    #define HOST1CFG           (*(volatile unsigned int*)(SYS_BASE + 0x044))  //DSP HOST1CFG
    #define PINMUX_GPIO_USER_LED_REG   (10)
    #define PINMUX_GPIO_USER_LED_MASK  (0x00FFFF00)
    #define PINMUX_GPIO_USER_LED_VAL   (0x00888800)

    //-----------------------------------------------------------------------------
    // Private Function Prototypes
    //-----------------------------------------------------------------------------
    void Flash_leds(void);
    void Init_leds(void);


    // ====================================    MAIN    ===================================

    void main()
    {
        EVMOMAPL138_init();
        EVMOMAPL138_initRAM();
        EVMOMAPL138_enableDsp();
        HOST1CFG = (0xC0000000 & 0xFFFFFC00);
        USTIMER_init();
        I2C_init(I2C1, I2C_CLK_400K);
        Init_leds();
        while (1) Flash_leds();
    }

    void Flash_leds(void)
    {
        GPIO_setOutput(4, 5, 0);  // LED #1 on
        USTIMER_delay(500000);    // wait 0.5 sec
        GPIO_setOutput(4, 5, 1);  // LED #1 off
        USTIMER_delay(500000);    // wait 0.5 sec
    }

    void Init_leds(void)
    {
        EVMOMAPL138_pinmuxConfig(PINMUX_GPIO_USER_LED_REG, PINMUX_GPIO_USER_LED_MASK, PINMUX_GPIO_USER_LED_VAL);
        GPIO_setDir(4, 5, 0);  // User LED 1 = output
        GPIO_setDir(4, 4, 0);  // User LED 2 = output
        GPIO_setDir(4, 3, 0);  // User LED 3 = output
        GPIO_setDir(4, 2, 0);  // User LED 4 = output

        GPIO_setOutput(4, 5, 1);  // all LEDs off
        GPIO_setOutput(4, 4, 1);
        GPIO_setOutput(4, 3, 1);
        GPIO_setOutput(4, 2, 1);
    }

    I can AISGen the above into a boot image, load it with some SPI flash utilities, then boot to SPI flash and the program runs and flashes the LED.

    HOWEVER, when I try to AISGen the above with the following DSP LED flash program, neither program *appears* to be running (no LEDs flashing).

    // Flash LEDs on DSP after ARM initializes peripherals

    #include "stdio.h"
    #include "types.h"
    #include "evmomapl138.h"
    #include "evmomapl138_timer.h"
    #include "evmomapl138_gpio.h"

    void main()  // flash an LED from the DSP
    {
        while (1)
        {
            GPIO_setOutput(4, 2, 0);  // LED on
            USTIMER_delay(500000);    // wait 0.5 sec
            GPIO_setOutput(4, 2, 1);  // LED off
            USTIMER_delay(500000);    // wait 0.5 sec
        }
    }

    I have done most of the peripheral initialization in the ARM program so that (hopefully) the DSP program just needs to write to the pre-initialized LEDs. However, this is not happening.

    1. How do I debug this?

    2. Have I made the correct assumptions about the GPIO initialization by the ARM?

    Any guidance would be greatly appreciated. I'm almost there.

    Thx,

    MikeH

     

  • The DSP is still asleep after the boot loader finishes loading the program from the SPI flash. You will have to create an ARM program which wakes up the DSP and sets the proper DSP reset vector.

    If you have CCS, try connecting to the DSP and you should see that it is in reset mode.

    Jeff

  • Jeff,

    Sorry, I'm a bit confused (still). As I understand it, the DSP wakes, then enables the ARM, which in turn disables (sleeps) the DSP. The ARM then begins to boot load from SPI flash (in this example). I have loaded two programs in SPI flash. The first program is an ARM program (shown above) which:

    1. Initializes the PSC and timers (PLLs).

    2. Initializes external DDR so that I can load the DSP code there.

    3. Initializes the entry address for the DSP

       HOST1CFG = (0xC0000000 & 0xFFFFFC00);

    4. ENABLES the DSP (see below)


    //-----------------------------------------------------------------------------
    // \brief   releases the dsp core from reset.
    //
    // \param   none.
    //
    // \return  none.
    //-----------------------------------------------------------------------------
    void EVMOMAPL138_enableDsp(void)
    {
       // power dsp core.
       EVMOMAPL138_lpscTransition(PSC0, DOMAIN0, LPSC_DSP, PSC_ENABLE);

       // wake up dsp core and release from reset.
       SETBIT(PSC0->MDCTL[LPSC_DSP], LRST);
    }

    5. Initializes the timer module

    6. Initializes the GPIO for my LEDs.

    7. Then falls into the LED flashing loop.

    I would assume once the above is done the DSP is alive and well and running. But it doesn't appear to be.

    jc said:

    The DSP is still asleep after the boot loader finishes loading the program from the SPI flash. You will have to create an ARM program which wakes up the DSP and sets the proper DSP reset vector.

     

    Unless I'm misunderstanding your response....that is exactly what I have done.

    jc said:
    If you have CCS, try connecting to the DSP and you should see that it is in reset mode.

    Can I do this if the OMAP has booted from SPI flash?

    thx

    MikeH

  • Yes. If you are not able to connect to the DSP at this point, it means that it did not wake up. Are you able to connect to either the ARM or the DSP via emulation?
    Jeff

  • jc said:
    Are you able to connect to either the ARM or the DSP via emulation?

    Just to make sure we are on the same page, when you say "via emulation" do you mean when I boot to emulator mode? If so, yes, I can connect to either the ARM or the DSP when I boot to emulation modes (0001 1110 Emulation Debug). However, after I boot from SPI flash and attempt to connect, the debugger says "The target is being held in reset. The applied reset must be released before progressing." This happens when I attempt to connect to either the ARM or the DSP after booting from SPI flash.

  • No, I meant connecting with an emulator after booting in SPI flash mode.

    So you are seeing the LED controlled by the ARM blinking, yet you are unable to connect to the ARM at this time?

    Jeff

  • By the way, in your code you showed:

        EVMOMAPL138_init();
        EVMOMAPL138_initRAM();
        EVMOMAPL138_enableDsp();
        HOST1CFG = (0xC0000000 & 0xFFFFFC00);

    Are you enabling the DSP before you set up HOST1CFG or have you fixed that?

    Jeff

  • jc said:

    Are you enabling the DSP before you set up HOST1CFG or have you fixed that?

    I fixed that (reversed the two statements).

    Turns out I *can* connect to the ARM after a power-on reset and booting from SPI flash. I didn't realize this was possible.

    It appears, for some reason, that the DSP is not getting released from reset:

    MDCTL15 = 0x00000003
     FORCE = DISABLE (0x0)
     _RESV_2 = ******************
     _RESV_3 = **
     EMUIHBIE = DISABLE (0x0)
     EMURSTIE = DISABLE (0x0)
     LRST = ASSERT (0x0)
     _RESV_7 = ***
     NEXT = ENABLE (0x3)

    Now that I know how to connect to the ARM & DSP after SPI booting, let me work through the code a bit to see if I can spot the problem. I'll be back in touch as soon as I run into another brick wall.

    Thx

    MikeH

     

  • Jeff,

    I can't seem to release the DSP from reset with my ARM code. I've stepped through the code and all appears to be correct, but when looking at the register values in CCS, I can't seem to de-assert the LRST, as shown below.

    MikeH said:
    MDCTL15 = 0x00000003
     FORCE = DISABLE (0x0)
     _RESV_2 = ******************
     _RESV_3 = **
     EMUIHBIE = DISABLE (0x0)
     EMURSTIE = DISABLE (0x0)
     LRST = ASSERT (0x0)
     _RESV_7 = ***
     NEXT = ENABLE (0x3)

    Shouldn't the ARM code be able to deassert this bit?

    MikeH

  •  The GEL file has a function called Wake_DSP. If you use that, are you able to connect to the DSP?

    Jeff

  • Jeff,

    jc said:
    If you use that, are you able to connect to the DSP?

    Just to clarify, I *am* able to connect to the DSP. I am also able to load an .out file into the DSP memory and run the code that turns on the LED. But what (currently) has me confused are the register values for:

    MDCTL15 = 0x00000003
     FORCE = DISABLE (0x0)
     _RESV_2 = ******************
     _RESV_3 = **
     EMUIHBIE = DISABLE (0x0)
     EMURSTIE = DISABLE (0x0)
     LRST = ASSERT (0x0)
     _RESV_7 = ***
     NEXT = ENABLE (0x3)

    After running the "dsp enable" functions, I would have expected to see LRST=0x1 (de-asserted) instead of 0x0 (asserted). This is what is causing me to believe the DSP is in "reset". But the fact that I can load code and turn on an LED suggests otherwise. Does the fact that I am using the emulator connected through an XDS100v2 USB emulator cause the DSP to "appear" to be in reset when it actually is not?

    And, yes, I have just tried a modified version of the Wake_DSP function in the BSL file... to no avail.

    Thx,

    MikeH

     

  • Is your SETBIT function correct?

    Here is another post that clearly shows the expected values of MDCTL15:

    http://e2e.ti.com/support/dsp/omap_applications_processors/f/42/p/83672/300579.aspx

  • Mukul,

    Thanks for chiming in.

    Mukul Bhatnagar said:

    Is your SETBIT function correct?

    Well, I copied it directly from the Logic BSL. Here it is:


    // LPSC Enable Function for ARM or DSP

    void PSC0_LPSC_enableCore(unsigned int PD, unsigned int LPSC_num)
    {
        unsigned int j;
       
        if( (*(unsigned int*)(PSC0_MDSTAT+4 * LPSC_num) & 0x11F) != 0x103 ) {
          *(unsigned int*) (PSC0_MDCTL+4*LPSC_num) = (*(unsigned int*) (PSC0_MDCTL+4*LPSC_num) & 0xFFFFFEE0) | 0x0103;
          PSC0_PTCMD = 0x1<<PD;

          j = 0;
          /*Wait for power state transition to finish*/
          while( (PSC0_PTSTAT & (0x1<<PD) ) !=0) {
            if( j++ > PSC_TIMEOUT ) {
    //          GEL_TextOut("\tPSC0 Enable Core Transition Timeout on Domain %d, LPSC %d\n","Output",1,1,1,PD,LPSC_num);
              break;
            }
          }
         
          j = 0;
          while( (*(unsigned int*)(PSC0_MDSTAT+4 * LPSC_num) & 0x11F) !=0x103) {
            if( j++ > PSC_TIMEOUT ) {
    //          GEL_TextOut("\tPSC0 Enable Core Verify Timeout on Domain %d, LPSC %d\n","Output",1,1,1,PD,LPSC_num);
              break;
            }
          }
        }
    }

    I get no PSC_TIMEOUT errors, so I assume it is functioning correctly. However, after executing this code and then looking at the PSC10_DSP_MDCTL15 registers (in CCS), I get the following:

    MDCTL15 = 0x00000003
     FORCE = DISABLE (0x0)
     _RESV_2 = ******************
     _RESV_3 = **
     EMUIHBIE = DISABLE (0x0)
     EMURSTIE = DISABLE (0x0)
     LRST = ASSERT (0x0)
     _RESV_7 = ***
     NEXT = ENABLE (0x3)

    ...which (I think) tells me that the LRST bit is still = 0 (reset).

  • Gents,

    If you look closely at my register values you'll see that I was looking at PSC10 instead of PSC00. When I actually watch the *correct* register values, I get the following:

    MDCTL15 = 0x00000103
     FORCE = DISABLE (0x0)
     _RESV_2 = ******************
     _RESV_3 = **
     EMUIHBIE = DISABLE (0x0)
     EMURSTIE = DISABLE (0x0)
     LRST = DEASSERT (0x1)
     _RESV_7 = ***
     NEXT = ENABLE (0x3)

    So, now I can:

    1. Boot from SPI flash

    2. Attach to both the ARM and DSP.

    3. See that the ARM code was properly loaded by AIS into shared RAM at 0x80004000, and that the PC is pointing to instructions there (actually at 0x80004354), although it appears to be hung waiting for an LPSC transition.

    I'll debug this "wait" issue and report back.

    Thanks again for the help!

    MikeH

     

  • MikeH,

    I'm glad you were able to figure out the state of the DSP.  I would recommend looking at MDSTAT15, in addition to MDCTL15. 

    MDCTL15 is the register you program to tell the PSC what the next state the peripheral should transition to.
    MDSTAT15 will tell you the current state of the peripheral.

    --Christina

  • Christina,

    Thanks for the tip. But I can't seem to find these registers in my "view registers" window. Can you point out where to look?

    PSC00DSP
     REV = 0x44825A00
     INTEVAL = 0x00000000
     MERRPR0 = 0x00000000
     MERRCR0 = 0x00000000
     PERRPR = 0x00000000
     PERRCR = 0x00000000
     PTCMD = 0x00000000
     PTSTAT = 0x00000000
     PDSTAT0 = 0x00000301
      _RESV_1 = ********************
      EMUIHB = INHIBIT_OFF (0x0)
      _RESV_3 = ******
      STATE = ON (0x1)
     PDSTAT1 = 0x00000301
      _RESV_1 = ********************
      EMUIHB = INHIBIT_OFF (0x0)
      _RESV_3 = ******
      STATE = ON (0x1)
     PDCTL0 = 0x001FF101
     PDCTL1 = 0x001FF101
     PDCFG0 = 0x0000000D
     PDCFG1 = 0x00000006
     _EDMA3_0_CC = 0x00001E03
     _EDMA3_0_TC0 = 0x00001E03
     _EDMA3_0_TC1 = 0x00001E03
     _EMIFA = 0x00001E03
     _SPI0 = 0x00001E03
     _MMCSD0 = 0x00001E03
     _ARM_INTC = 0x00001E03
     _ARM_RAMROM = 0x00001E03
     _RSVD0 = 0x00001E03
     _UART0 = 0x00001E03
     _SCR0 = 0x00001E03
     _SCR1 = 0x00001E03
     _SCR2 = 0x00001E03
     _RSVD1 = 0x00000A00
     _ARM = 0x00001F03
     _DSP = 0x00011D03
      _RESV_1 = **************
      EMUIHB = DISABLE (0x0)
      EMURST = ENABLE (0x1)
      _RESV_4 = ***
      MCKOUT = ON (0x1)
      MRSTDONE = INCOMPLETE (0x1)
      MRST = DEASSERT (0x1)
      LRSTDONE = NOTDONE (0x0)
      LRST = DEASSERT (0x1)
      _RESV_10 = **
      STATE = ENABLE (0x3)
     MDCTL0 = 0x00000003
     MDCTL1 = 0x00000003
     MDCTL2 = 0x00000003
     MDCTL3 = 0x00000003
     MDCTL4 = 0x00000003
     MDCTL5 = 0x00000003
     MDCTL6 = 0x00000003
     MDCTL7 = 0x00000003
     MDCTL8 = 0x00000003
     MDCTL9 = 0x00000003
     MDCTL10 = 0x00000003
     MDCTL11 = 0x00000003
     MDCTL12 = 0x00000003
     MDCTL13 = 0x00000000
     MDCTL14 = 0x00000103
     MDCTL15 = 0x00000103
      FORCE = DISABLE (0x0)
      _RESV_2 = ******************
      _RESV_3 = **
      EMUIHBIE = DISABLE (0x0)
      EMURSTIE = DISABLE (0x0)
      LRST = DEASSERT (0x1)
      _RESV_7 = ***
      NEXT = ENABLE (0x3)

  • These are MDSTAT0-15

    _EDMA3_0_CC = 0x00001E03
     _EDMA3_0_TC0 = 0x00001E03
     _EDMA3_0_TC1 = 0x00001E03
     _EMIFA = 0x00001E03
     _SPI0 = 0x00001E03
     _MMCSD0 = 0x00001E03
     _ARM_INTC = 0x00001E03
     _ARM_RAMROM = 0x00001E03
     _RSVD0 = 0x00001E03
     _UART0 = 0x00001E03
     _SCR0 = 0x00001E03
     _SCR1 = 0x00001E03
     _SCR2 = 0x00001E03
     _RSVD1 = 0x00000A00
     _ARM = 0x00001F03
     _DSP = 0x00011D03

      _RESV_1 = **************
      EMUIHB = DISABLE (0x0)
      EMURST = ENABLE (0x1)
      _RESV_4 = ***
      MCKOUT = ON (0x1)
      MRSTDONE = INCOMPLETE (0x1)
      MRST = DEASSERT (0x1)
      LRSTDONE = NOTDONE (0x0)
      LRST = DEASSERT (0x1)
      _RESV_10 = **
      STATE = ENABLE (0x3)

    Ideally, the same should have been done for MDCTL0-15, so that the user would not have to go to the user guide to see that MDCTL15 corresponds to the DSP, MDCTL14 to the ARM, and so on.

    Regards

    Mukul

  • Mukul,

    Oh my. How confusing. Thanks.

    MikeH

     

    FYI... I have continued this discussion HERE in a new thread.