This thread has been locked.


MSP430F5435 BSL Programming

Other Parts Discussed in Thread: MSP430F5435, MSP430F2131

I am starting with the MSP430F5435, and I have a problem with BSL programming: the device does not respond correctly.

I am trying to program the MSP430F5435 from another MCU. My algorithm is as follows:

1) Connect GND and VDD (3.3 V).

2) Send the BSL entry sequence as in the picture below.

3) Send 0x80 for synchronization.

4) Read the acknowledge from the MSP430F5435. The ACK is wrong: the baud rate is not 9600 and DATA_ACK is not 0x90!

Am I missing something?

Thanks in advance for your reply :-)

 

L.

 

  • First, could you please cut down your tag list? Tagging everything that comes to mind makes the tagging pretty much useless. Most of the tags have nothing to do with your question. [edit] Thanks for reducing the tags. Still, I think the 'FLASH programming' tag is a bit misleading, as it rather implies a thread focused on programming the flash controller for runtime flash reprogramming. Just "BSL", "BSL access" or "54xx BSL" would be more telling, but well, it isn't that important. It's just that having the tag list filled with all thinkable combinations of words isn't very desirable and counteracts the purpose of the tags.

    The BSL entry sequence seems to be okay, as the device is responding (even if not the expected response).

    I wonder why TX (P1.1) is low at first. It should be high during reset (high-impedance input), but this might be a misinterpretation of the high impedance by the scope.

    If the rising edge of TX indicates the initialization of the port hardware by the BSL, maybe you're sending the 0x80 too fast to be interpreted properly. It may be interpreted as 0xC0, 0xE0, 0x0F or whatever (the first 0 bits are not detected). Or, if the BSL deliberately tries to interpret it as 0x80, a wrong baud rate is detected.

    The device response could also be interpreted as 0x06 0xE6 at 9600 Bd, or as 0x51 at 4800 Bd.

  • Dear Mr. Jens-Michael Gross, thanks for your help!

    This sequence works for older devices (e.g. the MSP430F2131) but does not work for the MSP430F5435.

    In my schematic, TX is connected to an input of the other MCU with a 4.7 kΩ pull-up, and TX is high during reset.

    In the picture, TX is floating (disconnected); only the logic analyzer is connected to this pin. But it changes nothing.

    I also tried changing the length of the low pulse (0x80), and the result is still not good. If the low pulse is shorter than 1 ms, the device does not respond. If it is longer, the response is the same (0x51).

    According to the specification (SLAU319A), the baud rate is fixed at 9600 baud in half-duplex mode after the BSL entry sequence.



  • Andrii Vykliuk said:
    In the picture, TX is floating (disconnected); only the logic analyzer is connected to this pin. But it changes nothing.

    That's what it looked like :)

    Anyway, it shows the moment the TX line is 'activated' by the BSL, which also seems to indicate the moment the BSL starts to initialize its hardware. That may still take some microseconds from this point.

    Andrii Vykliuk said:
    I also tried changing the length of the low pulse (0x80), and the result is still not good. If the low pulse is shorter than 1 ms, the device does not respond. If it is longer, the response is the same (0x51).

    It's not about enlarging the pulse; it's about delaying the pulse (and therefore the falling edge) relative to the moment when the BSL initializes its hardware. Just add another delay before you start sending the sync.

    Andrii Vykliuk said:
    According to the specification (SLAU319A), the baud rate is fixed at 9600 baud in half-duplex mode after the BSL entry sequence.

    Yes. But what is 9600 baud? It is whatever the MSP thinks is 9600 baud. To do anything timing-related, the code needs a timebase, and it is possible that the timebase is faulty. The DCO (which I think is the default timebase) has a large variation. I don't know how the BSL works and maybe I'm completely off, but the timer-based BSL presumably uses the DCO and/or the reference clock for its timing. It takes up to 32 ms before the DCO has settled to its final frequency based on the FLL and the 32 kHz reference clock (and even then there is still an error of some percent, but that is usually good enough even for 115200 Bd). If you access the BSL before the timebase has settled, its timebase may be completely off, resulting in anything being detected or sent.

    Other MSPs don't use an FLL but factory-programmed calibration settings for the DCO, so copying these values to the DCO registers almost instantly gives the desired DCO frequency. Less stable and accurate, but faster.

    To be sure, add a ~32 ms delay before starting the conversation.