
MSP432P401R: BSL does not work (anymore) if memory not blank

Part Number: MSP432P401R

Hello all, I made a firmware capable of reprogramming itself on request, using the 'software invocation' BSL feature (as described in SLAU622J, par. 3.3.1). From the PC side, I run BSL Scripter v3.4.0.1. For a while, everything worked as expected.

Lately, I've found that this no longer works (the BSL does not seem to respond at all) unless the MSP432 flash has previously been erased by some other means (e.g. UniFlash). That said:

1) I use UART (eUSCI-A0 connected to the PC via an FT232 USB device), and tried to call BSL_ENTRY_FUNCTION with both the auto (0xFC48 FFFF) and the manual (0xFC48 DFFF) configuration parameters (a sketch of the call is shown after this list);

2) I have tried reloading the original firmware with which the BSL first worked;

3) I then tried running a minimal firmware whose only purpose is to invoke the BSL (to rule out that some peripheral, IRQ, etc. used in the 'real' firmware could somehow interfere);

4) I also tried reprogramming a factory BSL (BSL432_MSP432P401) freshly downloaded from the TI site, in case the native one had become corrupted (but how could it be, if I neither change the linker sections nor write anything to flash in the application?);

5) I tested with three different MCUs (two of them mounted on custom hardware, the third being the MSP-EXP432P401 LaunchPad);

6) I even tested on two different PCs (one running Win10, the other Win8.1);

7) in all cases, I let no more than 3 seconds pass between the call to BSL_ENTRY_FUNCTION and the launch of BSL Scripter (I'm aware of the BSL timeout).
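
For reference, this is essentially how I invoke the BSL (a minimal sketch following SLAU622J, par. 3.3.1; the function name invoke_bsl and the two parameter macro names are mine, the values are the ones listed in point 1):

#include <stdint.h>
#include "driverlib.h" // MSP432P4xx driverlib

#define BSL_PARAM_AUTO     0xFC48FFFF  // 'auto' configuration parameter
#define BSL_PARAM_MANUAL   0xFC48DFFF  // 'manual' configuration parameter I also tried
#define BSL_API_TABLE_ADDR 0x00202000  // BSL API table in ROM (SLAU622J)
#define BSL_ENTRY_FUNCTION (*((uint32_t *)BSL_API_TABLE_ADDR))

static void invoke_bsl(uint32_t bslParam)
{
    // Global interrupt disable before jumping into the ROM BSL
    MAP_Interrupt_disableMaster();

    // Jump to the BSL entry point read from the ROM API table
    ((void (*)(uint32_t))BSL_ENTRY_FUNCTION)(bslParam);
}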

... do you have any clue?

Thanks in advance

Osvaldo

  • Hi 

    A few comments from my side:

    1. Check whether the NVIC IABR (interrupt active) register is cleared before invoking the BSL, and clear the NVIC enable/pending bits like below:

    MAP_Interrupt_disableMaster();
    // Set all interrupt priorities to 0x00 before entering the bootloader
    for (int i = 0; i < 240; i++)
        NVIC->IP[i] = 0;
    // Disable and clear all pending interrupts in the NVIC
    // (full 32-bit masks so that higher-numbered IRQs are covered too)
    NVIC->ICER[0] = 0xFFFFFFFF;
    NVIC->ICPR[0] = 0xFFFFFFFF;
    NVIC->ICER[1] = 0xFFFFFFFF;
    NVIC->ICPR[1] = 0xFFFFFFFF;
    // Call the BSL with the given BSL parameters
    ((void (*)())BSL_ENTRY_FUNCTION)((uint32_t)BSL_PARAM);
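
    The idea is that no application interrupt is left enabled or pending (and no non-zero priority is left configured) when the ROM BSL takes over.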

    2. Download a standard UART demo code from TI to check whether the UART interface works. You can find the demo code here (you may need to change the pins to the ones the BSL uses); a rough sketch is also shown after point 3.

    3. Try the hardware invocation to check whether there is an error in the software invocation.
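
    For point 2, something along these lines should be enough for a quick echo check (a rough sketch based on the standard driverlib UART example: 9600 baud from a 3 MHz SMCLK on eUSCI_A0, pins P1.2/P1.3; adjust the pins, the baud-rate values and the config struct to your driverlib version and to the interface your BSL uses):

    #include "driverlib.h"

    // 9600 baud from a 3 MHz SMCLK (values from the standard driverlib UART example)
    const eUSCI_UART_Config uartConfig =
    {
        EUSCI_A_UART_CLOCKSOURCE_SMCLK,                // clock source: SMCLK
        19,                                            // BRDIV
        8,                                             // UCxBRF
        85,                                            // UCxBRS
        EUSCI_A_UART_NO_PARITY,                        // no parity
        EUSCI_A_UART_LSB_FIRST,                        // LSB first
        EUSCI_A_UART_ONE_STOP_BIT,                     // one stop bit
        EUSCI_A_UART_MODE,                             // UART mode
        EUSCI_A_UART_OVERSAMPLING_BAUDRATE_GENERATION  // oversampling baud generation
    };

    int main(void)
    {
        MAP_WDT_A_holdTimer();

        // P1.2/P1.3 as UCA0RXD/UCA0TXD
        MAP_GPIO_setAsPeripheralModuleFunctionInputPin(GPIO_PORT_P1,
                GPIO_PIN2 | GPIO_PIN3, GPIO_PRIMARY_MODULE_FUNCTION);

        // 3 MHz DCO feeding SMCLK (default clock routing)
        MAP_CS_setDCOCenteredFrequency(CS_DCO_FREQUENCY_3);

        MAP_UART_initModule(EUSCI_A0_BASE, &uartConfig);
        MAP_UART_enableModule(EUSCI_A0_BASE);

        // Polled echo: send back whatever is received
        while (1)
        {
            while (!(MAP_UART_getInterruptStatus(EUSCI_A0_BASE,
                    EUSCI_A_UART_RECEIVE_INTERRUPT_FLAG)))
                ;
            MAP_UART_transmitData(EUSCI_A0_BASE,
                                  MAP_UART_receiveData(EUSCI_A0_BASE));
        }
    }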

  • Thanks Gary for your comments.

    However, I have eventually (maybe) solved the problem by disabling the RTC interrupt just before disabling global interrupts:

    Interrupt_disableInterrupt(INT_RTC_C); // this does the trick!
    Interrupt_disableMaster();

    // ... followed by the code as per SLAU622J, par. 3.3.1

    The fact is that my application uses the RTC module as well, programmed to trigger an interrupt every second; what I cannot explain, however, is why it is necessary to disable it "manually", otherwise (as it seems) the BSL behavior is corrupted. In other words, why does Interrupt_disableMaster() (which is supposed to prevent the CPU from servicing IRQs) seem not to work properly here?

    Osvaldo