This thread has been locked.

MSP430FR2155: Setting up BSL in I2C mode from Application code

Part Number: MSP430FR2155
Other Parts Discussed in Thread: MSP-FET


Hello,

I want to test whether my device enters BSL mode correctly and whether the sequence I use is correct in general.
I have added this piece of code in my application:

                sc_stop();              // Turn off the ADC
                __disable_interrupt(); // disable interrupts
                ((void (*)())0x1000)(); // jump to BSL
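
(For context, the pre-jump housekeeping that the rest of this thread converges on can be collected into one helper. The sketch below uses direct register writes from the FR21xx device header rather than the application's own functions; jump_to_bsl() is a made-up name, and 0x1000 is the BSL entry address already used above.)

    #include <msp430.h>

    static void jump_to_bsl(void)
    {
        WDTCTL = WDTPW | WDTHOLD;           // hold the watchdog so it cannot fire during BSL entry
        __disable_interrupt();              // no application ISRs may run after this point

        SFRIE1 &= ~OFIE;                    // mask the oscillator fault NMI and clear any pending
        do                                  // fault flags, otherwise the jump lands in the NMI handler
        {
            CSCTL7 &= ~(XT1OFFG | DCOFFG);
            SFRIFG1 &= ~OFIFG;
        } while (SFRIFG1 & OFIFG);

        ((void (*)(void))0x1000)();         // BSL entry point used in the snippet above
    }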


Then I want to use the MSP-FET in I2C mode and try to access the BSL with the following script:
LOG
MODE FRxx I2C 100000 COM10
DELAY 2000
//give a wrong password to trigger a mass erase
RX_PASSWORD pass32_wrong.txt
//
//add a delay after giving the wrong password
//because the device does not give
//any response after a wrong password is applied
//
DELAY 2000
RX_PASSWORD pass32_default.txt


I connected the MSP-FET's Backend UART pins to my device's I2C lines, and also connected ground and target VCC.
I'm not getting an ACK response on the I2C line.

Is my sequence correct or am I doing something wrong?

  • Hi Serg,
    You need to use the I2C lines from the MSP-FET rather than the UART lines.

    Best Regards,

    Diego Abad

  • Hey, yes, I think I'm using the correct ones. I mistakenly referred to them as UART lines because the I2C signals come out on the same pins.
    I use pins 10 and 12 of the MSP-FET's JTAG target connector.

  • Hi Serg,
    You are correct, those are the right pins. Can you share a waveform of your I2C communication? I'd like to take a look and see if something is going wrong. Also, can you share the part of your code that enables the I2C communication in your application?

    Best Regards,

    Diego Abad

  • Hey, so yesterday I did a couple more tests. Basically, I found that the I2C BSL works when I enter it using hardware invocation. I removed the I2C init code to test it. Then I enabled the jump to BSL from the application (I2C still not initialized), removed the TST and RST connections between the MSP-FET and the device, and tried again. This time it doesn't ACK the start sequence 0x48. I think that indicates that I2C works in general, but something is wrong with activating the BSL from the application.

  • I figured out one issue: the device was jumping to NMI_ISR() from the bootloader.
    I use an external oscillator in my setup, so I disabled the oscillator fault interrupt and cleared the flag before jumping to BSL.
    It seems that it now stays in the BSL, and I do get some responses when trying to communicate, but then it gets stuck.
    I have now disabled all code that initializes peripherals, and the issue is still the same.
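    (For reference, the oscillator fault is serviced by a user-NMI handler like the sketch below; this is not the thread's actual handler. The UNMI_VECTOR name and the driverlib call are assumptions carried over from other FR2xx examples, and NMI_ISR() is simply the handler name mentioned above, assuming the same driverlib includes as the application code below.)

    #pragma vector = UNMI_VECTOR
    __interrupt void NMI_ISR(void)
    {
        uint16_t status;
        do
        {
            // Keep clearing the XT1/DCO fault flags until no fault remains
            status = MAP_CS_clearAllOscFlagsWithTimeout(1000);
        } while (status != 0);
    }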
    Here are code samples:

        // Stop the Watchdog Timer. TODO: implement WDT handling
        MAP_WDT_A_hold(WDT_A_BASE);
    
        MAP_GPIO_setAsInputPinWithPullUpResistor(GPIO_PORT_P1, (GPIO_PIN_ALL8 & ~(GPIO_PIN0 + GPIO_PIN1 + GPIO_PIN2 + GPIO_PIN3)));
        MAP_GPIO_setAsInputPinWithPullUpResistor(GPIO_PORT_P2, GPIO_PIN_ALL8);
        MAP_GPIO_setAsInputPinWithPullUpResistor(GPIO_PORT_P3, GPIO_PIN_ALL8);
        MAP_GPIO_setAsInputPinWithPullUpResistor(GPIO_PORT_P4, GPIO_PIN_ALL8);
        MAP_GPIO_setAsOutputPin(GPIO_PORT_PJ, GPIO_PIN_ALL8);
        MAP_GPIO_setOutputLowOnPin(GPIO_PORT_PJ, GPIO_PIN_ALL8);
    
        // UART RX and TX lines (P4.2 & P4.3)
        // Disable UART module before setting these pins as outputs if UART is not used
        MAP_GPIO_setAsOutputPin(GPIO_PORT_P4, GPIO_PIN2 + GPIO_PIN3);
        MAP_GPIO_setOutputLowOnPin(GPIO_PORT_P4, GPIO_PIN2 + GPIO_PIN3);
    
        clock_init();
    
        LOG_DEBUG_INIT();
    
        /*
         * Disable the GPIO power-on default high-impedance mode to activate
         * previously configured port settings
         */
        MAP_PMM_unlockLPM5();
    
        __enable_interrupt();
    
        while (1)
        {
            next_state = ctl_process(); // Doesn't go further in this example
        }


    Code for clock_init:
    /**
     * Initializes clock sources for the device.
     * Uses an external clock source for ACLK and DCO for SMCLK and MCLK.
     */
    static void clock_init(void)
    {
        // Configure Pins for XIN and XOUT
        //Set P2.6 and P2.7 as Module Function Input.
        MAP_GPIO_setAsPeripheralModuleFunctionInputPin(
            GPIO_PORT_P2,
            GPIO_PIN6 + GPIO_PIN7,
            GPIO_SECONDARY_MODULE_FUNCTION
        );
    
        //Initializes the XT1 crystal oscillator with no timeout
        //In case of failure, code hangs here.
        //For time-out instead of code hang use CS_turnOnXT1LFWithTimeout()
        MAP_CS_turnOnXT1LF(CS_XT1_DRIVE_0);
        // Initialize the external clock source
        MAP_CS_setExternalClockSource(XT1CLK_FREQUENCY);
    
        // Set the ACLK to use the external clock source (XT1CLK)
        MAP_CS_initClockSignal(CS_ACLK, CS_XT1CLK_SELECT, CS_CLOCK_DIVIDER_1);
    
        // Below settings for different clock speed
        // //Set DCO FLL reference = REFO
        // CS_initClockSignal(
        //     CS_FLLREF,
        //     CS_XT1CLK_SELECT,
        //     CS_CLOCK_DIVIDER_1
        //     );
    
        // CS_initFLL(CS_MCLK_DESIRED_FREQUENCY_IN_KHZ, CS_MCLK_FLLREF_RATIO);
    
        // Clear all oscillator fault flags
        MAP_CS_clearAllOscFlagsWithTimeout(1000);

        // Enable the oscillator fault interrupt
        // (this is the same interrupt that must be disabled again before jumping to BSL)
        MAP_SFR_enableInterrupt(SFR_OSCILLATOR_FAULT_INTERRUPT);
    
        // Set SMCLK = DCO with frequency divider of 1
        MAP_CS_initClockSignal(CS_SMCLK, CS_DCOCLKDIV_SELECT, CS_CLOCK_DIVIDER_1);
        // Set MCLK = DCO with frequency divider of 1
        MAP_CS_initClockSignal(CS_MCLK, CS_DCOCLKDIV_SELECT, CS_CLOCK_DIVIDER_1);
    }



    Code for ctl_process():
    ctl_process_result_t ctl_process(void) {
        ctl_process_result_t process_result = CTL_PROCESS_NEXT;
        switch (current_state) {
            case INIT:
                // sc_start();
                current_state++;
                break;
            case OPERATING:
                LOG_INFO_MSG("OPERATING");
                // process_operating();
                if (current_state == OPERATING)
                {
                // sc_stop();        // Turn off the ADC
                // dpt_ctl_deinit(); // Deinitialize the dpt control
                __disable_interrupt(); // Disable interrupts

                // Mask the oscillator fault NMI and clear any pending fault flags
                // so the jump below does not end up in NMI_ISR()
                MAP_SFR_disableInterrupt(SFR_OSCILLATOR_FAULT_INTERRUPT);
                MAP_CS_clearAllOscFlagsWithTimeout(1000);
                    DELAY_MS(5);
                    ((void (*)())0x1000)(); // jump to BSL
                    
            }
                break;
        }
        return process_result;
    }


    I am invoking the BSL with the following command:
    C:\ti\BSL-Scripter\BSL-Scripter.exe -d -i [COM10,I2C,100000] -n FRxx -b .\ScriptExampleWindows\FRxx_i2c\pass32_default.txt
    Verbose is turned on!
    Device : FRxx
    Init communication parameters : [COM10,I2C,100000]
    RX_PASSWORD .\ScriptExampleWindows\FRxx_i2c\pass32_default.txt
            Read Txt File  : C:\ti\BSL-Scripter\ScriptExampleWindows/FRxx_i2c/pass32_default.txt
            [80] [21] [00] [11] [ff] [ff] [ff] [ff] [ff] [ff] [ff] [ff] [ff] [ff] [ff] [ff]
            [ff] [ff] [ff] [ff] [ff] [ff] [ff] [ff] [ff] [ff] [ff] [ff] [ff] [ff] [ff] [ff]
            [ff] [ff] [ff] [ff] [9e] [e6]
    
    PS C:\ti\BSL-Scripter>

    It should respond with an "ERROR: Wrong password" message.
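
    (For reference, the frame printed above decodes as follows. The annotation is inferred from the logged bytes and the general BSL packet layout, not from the scripter source:)

    /*
     * [80]         peripheral-interface header byte
     * [21] [00]    length of the BSL core payload, low byte first (0x0021 = 33 bytes)
     * [11]         BSL core command: RX Password
     * [ff] x 32    the password itself; 32 bytes of 0xFF is the default password of a
     *              blank / mass-erased device (the contents of pass32_default.txt)
     * [9e] [e6]    16-bit CRC, low byte first
     */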

    I2C behaviour captured on the logic analyzer (waveform screenshot):

    NACK behaviour zoomed in (waveform screenshot):

  • Hi Serg,
    I would recommend trying the MSP430 SDK I2C BSL example and seeing if it makes any difference. Also, are you able to use external pull-ups?

    Best Regards,

    Diego Abad

  • Can you refer me to this example? I couldn't find any examples for MSP430 devices in the SDK. I use MSP430Ware v3.80.14.01 and was checking Resource Explorer in CCS.

  • Hi Serg,
    It should be in the boot_loader folder inside the MSP430 SDK. I also recommend taking a look at the MSP430 FRAM Devices Bootloader (BSL) user's guide.

    Best Regards,

    Diego Abad

  • Hey, I figured it out. The MSP-FET was not put into BSL mode. Opening the Backend UART port at baud 9620 and then using BSL-Scripter worked. What confuses me the most is how it worked without this BSL activation step when using the hardware invocation method.
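    (For anyone hitting the same issue: the missing step here was opening the MSP-FET's back-channel UART COM port at the non-standard rate of 9620 baud before running BSL-Scripter. A minimal Win32 sketch of that step is below; "COM11" is a placeholder for the back-channel port, and the thread does not say whether the port has to stay open afterwards.)

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        // Open the MSP-FET back-channel UART port (placeholder name, adjust to your system)
        HANDLE h = CreateFileA("\\\\.\\COM11", GENERIC_READ | GENERIC_WRITE,
                               0, NULL, OPEN_EXISTING, 0, NULL);
        if (h == INVALID_HANDLE_VALUE)
        {
            printf("Could not open the back-channel COM port\n");
            return 1;
        }

        DCB dcb = { 0 };
        dcb.DCBlength = sizeof(dcb);
        GetCommState(h, &dcb);
        dcb.BaudRate = 9620;        // the non-standard rate that switches the MSP-FET into BSL mode
        dcb.ByteSize = 8;
        dcb.Parity   = NOPARITY;
        dcb.StopBits = ONESTOPBIT;
        SetCommState(h, &dcb);

        Sleep(100);                 // give the MSP-FET a moment to latch the mode
        CloseHandle(h);             // the thread does not say whether the port must stay open
        return 0;
    }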
