MSPM0G3507: UART Break Signal Generation

Part Number: MSPM0G3507

Hello TI Experts,

I am working on the G3507 board and need to generate a UART BREAK signal.
As I understand it, the UART TX line normally idles high at 3.3 V, so to generate a break I need to drive the TX line low for a certain duration (around 50 ms).

My intended approach, sketched in code after the list below, is:

  1. Disable the UART peripheral.

  2. Reconfigure the UART TX pin as a GPIO output and drive it low.

  3. Maintain the low state for ~50 ms.

  4. Reconfigure the pin back to its default UART functionality and re-enable UART.
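
In rough Driverlib-style pseudocode (the instance and pin names here are placeholders from my SysConfig setup, not tested code):

    DL_UART_Main_disable(UART_0_INST);                            // 1. disable UART
    DL_GPIO_clearPins(GPIO_UART_0_TX_PORT, GPIO_UART_0_TX_PIN);   //    output latch low
    DL_GPIO_enableOutput(GPIO_UART_0_TX_PORT, GPIO_UART_0_TX_PIN);
    DL_GPIO_initDigitalOutput(GPIO_UART_0_IOMUX_TX);              // 2. pin -> GPIO, driven low
    delay_cycles(CPUCLK_FREQ / 20);                               // 3. hold low ~50 ms
    DL_GPIO_initPeripheralOutputFunction(GPIO_UART_0_IOMUX_TX,
                                         GPIO_UART_0_IOMUX_TX_FUNC);  // 4. pin -> UART
    DL_UART_Main_enable(UART_0_INST);                             //    re-enable UART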

Could you please share example code or the recommended method to achieve this?

I have attached the details of where I need this below; please refer to them.

Thanks & Regards,
Amara Rakesh

  • I think you can ask the UART to generate a break by setting/clearing LCRH:BRK [Ref TRM (SLAU846B) Table 18-42]. You (still) have to do the timing yourself. If you're using Driverlib, that would be DL_UART_enableLINSendBreak() and DL_UART_disableLINSendBreak().

    It's not quite clear to me from the datasheet [Ref DS (SLASEX6B) Sec 8.23] whether this feature is limited to UART0 ("Extend") only.
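
    To make the timing part concrete, a minimal sketch of that approach (assuming a SysConfig project where the instance is named UART_0_INST and CPUCLK_FREQ is defined; the 50 ms hold is just an example):

    DL_UART_enableLINSendBreak(UART_0_INST);   // sets LCRH.BRK, TX driven low
    delay_cycles(CPUCLK_FREQ / 20);            // you do the timing: ~50 ms here
    DL_UART_disableLINSendBreak(UART_0_INST);  // clears LCRH.BRK, TX returns high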

  • Yes, I am currently using Driverlib and have tried both DL_UART_enableLINSendBreak() and DL_UART_disableLINSendBreak(), but the results did not align with my expectations. I would therefore appreciate your guidance on an alternative approach.

  • I haven't seen a TI example that does switching between Pin Functions (PFs) on the fly, but people have done it. In some cases you can get glitching at the transitions, but for what you're doing that may not matter.

    Using Driverlib, it might resemble:

    DL_GPIO_clearPins(GPIO_UART_0_TX_PORT, GPIO_UART_0_TX_PIN); // Low
    DL_GPIO_enableOutput(GPIO_UART_0_TX_PORT, GPIO_UART_0_TX_PIN); // Output
    DL_GPIO_initDigitalOutput(GPIO_UART_0_IOMUX_TX); // PF=1
    delay_cycles(CPUCLK_FREQ / 20); // Spin for 50ms
    DL_GPIO_initPeripheralOutputFunction(GPIO_UART_0_IOMUX_TX, GPIO_UART_0_IOMUX_TX_FUNC); // PF=UART

    Since the GPIO registers "remember" the settings, you could probably do the clearPins and enableOutput calls once, at the beginning of your program, as sketched below.
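
    For example, a sketch of that split (same assumptions and SysConfig-generated names as above):

    // Once, at startup: preset the output latch low and enable the output driver.
    DL_GPIO_clearPins(GPIO_UART_0_TX_PORT, GPIO_UART_0_TX_PIN);
    DL_GPIO_enableOutput(GPIO_UART_0_TX_PORT, GPIO_UART_0_TX_PIN);

    // Each break is then just the PF switch around a delay:
    DL_GPIO_initDigitalOutput(GPIO_UART_0_IOMUX_TX);              // PF = GPIO, TX low
    delay_cycles(CPUCLK_FREQ / 20);                               // ~50 ms
    DL_GPIO_initPeripheralOutputFunction(GPIO_UART_0_IOMUX_TX,
                                         GPIO_UART_0_IOMUX_TX_FUNC);  // PF = UART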

  • I tried both methods, the LIN break and the PF-switching code you shared. Using a multimeter, I observed that the UART_TX pin goes low and then returns high, which is fine. However, it still sends 0x00 on the TX line, which I don’t want. How can I avoid this? I have attached the code and output for your reference.

    void uart_break_signal()
    {
        /* Method 1: LIN break via LCRH.BRK */
    #if 0
        DL_UART_enableLINSendBreak(UART_1_INST);   // TX driven low
        delay_cycles(8000000);                     // hold the break
        DL_UART_disableLINSendBreak(UART_1_INST);  // TX returns high
    #endif

        /* Method 2: switch the TX pin function to GPIO */
    #if 1
        DL_GPIO_clearPins(GPIO_UART_1_TX_PORT, GPIO_UART_1_TX_PIN);
        DL_GPIO_enableOutput(GPIO_UART_1_TX_PORT, GPIO_UART_1_TX_PIN);

        DL_GPIO_initDigitalOutput(GPIO_UART_1_IOMUX_TX);  // PF = GPIO, TX driven low

        delay_cycles(CPUCLK_FREQ / 2);                    // hold the break

        DL_GPIO_initPeripheralOutputFunction(GPIO_UART_1_IOMUX_TX,
                                             GPIO_UART_1_IOMUX_TX_FUNC);  // PF = UART
    #endif

        uint8_t packet_list[] = {
            0xaa, 0x00, 0x0f, 0x2a, 0x28, 0x00, 0x00, 0x00,
            0x02, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
            0x00
        };

        Printf_iwrl1432(packet_list, 17);
    }

    Regards,
    Amara

  • What is at the other end of the wire? (Presumably the agent that's producing the display.) I've seen UARTs that don't recognize Break but report it (along with a junk byte) as a Framing Error.

    I tried inserting your Method 1 (LIN) into a copy of "uart_echo_interrupts_standby", between the Receive and Transmit calls. I shortened the break time to 2ms (2x byte-times) and inserted a 10usec delay after the Break; these were to make the scope trace easier to read. [I also removed the SleepOnExit call and changed the LPM policy to SLEEP0.]

    I see the break and the echoed byte; I don't see a stray (properly framed) 0x00 byte.

    Neither PuTTY nor Tera Term sees the break per se; I suppose that either they ignore it or it gets lost in the USB somewhere.

    The actual sequence:

            case DL_UART_MAIN_IIDX_RX:
                DL_GPIO_togglePins(GPIO_LEDS_PORT,
                    GPIO_LEDS_USER_LED_1_PIN | GPIO_LEDS_USER_TEST_PIN);
                gEchoData = DL_UART_Main_receiveData(UART_0_INST);
    #define SEND_BREAK 1
    #if SEND_BREAK
                DL_UART_enableLINSendBreak(UART_0_INST);
                delay_cycles(2*CPUCLK_FREQ/1000); // 2x character times at 9600bps
                DL_UART_disableLINSendBreak(UART_0_INST);
                delay_cycles(CPUCLK_FREQ/100000UL); // 10usec for the scope
    #endif // SEND_BREAK
                DL_UART_Main_transmitData(UART_0_INST, gEchoData);
                break;

  • Hi Amara,

    Just want to mention that you must strictly follow the PF switching steps in the TRM. Disable the current IO function before changing the PF; otherwise, abnormal behavior will occur.

    For example, if you want to switch the UART pin to GPIO, follow the two steps below to completely turn off the UART, and then switch the PF field to the GPIO function.
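
    In Driverlib terms, those two turn-off steps might look like this sketch (using the UART_1_INST naming from your code):

    DL_UART_Main_disable(UART_1_INST);       // Step 1: disable the UART
    DL_UART_Main_disablePower(UART_1_INST);  // Step 2: remove peripheral power
    // ...now it is safe to change the PF field to the GPIO function...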

    Best Regards,
    Peter

  • Yes, Peter. I faced unexpected behavior where the UART was not receiving data and the data index was shifting. After adding the disable code based on your reference, the code now works as expected. My requirement was UART → GPIO → UART, and with your code reference I achieved it:

    DL_UART_Main_disable(UART_1_INST);       // disable the UART first
    DL_UART_Main_disablePower(UART_1_INST);  // then remove peripheral power

    DL_GPIO_clearPins(GPIO_UART_1_TX_PORT, GPIO_UART_1_TX_PIN);
    DL_GPIO_enableOutput(GPIO_UART_1_TX_PORT, GPIO_UART_1_TX_PIN);

    DL_GPIO_initDigitalOutput(GPIO_UART_1_IOMUX_TX);  // PF = GPIO, TX driven low

    delay_cycles(CPUCLK_FREQ / 2);                    // hold the break

    DL_GPIO_initPeripheralOutputFunction(GPIO_UART_1_IOMUX_TX,
                                         GPIO_UART_1_IOMUX_TX_FUNC);  // PF = UART

    DL_UART_Main_enablePower(UART_1_INST);   // restore peripheral power
    DL_UART_Main_enable(UART_1_INST);        // re-enable the UART
    SYSCFG_DL_UART_1_init();                 // re-apply the SysConfig UART setup

    Thanks,

    Amara

  • Glad to hear the issue has been solved. Since that's the case, I will close the thread; feel free to submit a new post if you have any new questions. Thanks!

    Best Regards,
    Peter