
MSP432E411Y-BGAEVM: My UART6 RX EOT interrupt is too slow

Part Number: MSP432E411Y-BGAEVM

Hey guys,
I'm working on a project where I need to communicate with a sensor over RS-485 using Modbus RTU, with the transceiver connected to a UART. I have implemented the part where I send data to the sensor, but I have problems switching the DE/nRE line, which is connected to GPIO pin PH4. I'm trying to use the UART_TXINT_MODE_EOT interrupt, which according to the documentation should be raised immediately after the last bit of the last byte is sent, to change the direction of communication once the data is out. The interrupt does get raised, but too late, so I miss the sensor's response. Some rough measurements with timers suggest it fires around 30 ms late.

Here's the part of the code where it's supposed to happen. Can anyone help me figure out why it isn't working as I thought it would?

https://pastebin.com/gXpHHrZf

Thanks

  • I ran this code on an MSP432E401Y LaunchPad and my scope shows a high pulse on the enable pin of about 9.2 ms. This is about what I would expect -- with even parity each byte is 11 bits on the wire, so 8 bytes at 9600 baud take about 9.17 ms, plus some slop.

    How did you measure?

    Porting notes:

    1) moved the enable pin to PH0 (no PH4)

    2) enabled PORTP (SysCtlPeripheralEnable)

    3) removed the explicit include of msp432e411y.h

    [Edit: Fixed typo.]

  • Hi, thanks for looking into this, and sorry for the late reply. I measured it using a timer and the TimerValueGet() function; the process looked roughly like this:

    1. start the timer
    2. save timer value to timer_before
    3. send the data
    4. save timer value to timer_after

    with a line of code in the sense of "save timer value to timer_int" as the first line of the interrupt handler.

    Anyway, what that got me was:
    timer_before = 808
    timer_after = 1238
    timer_int = 6,628,182

    which led me to believe that it took over 6 million cycles for the interrupt to occur. I am still learning though, so there's a good chance my methodology isn't the best.

    However, I also checked with an oscilloscope connected to the TX, RX, and DE/nRE wires. It showed the data going from the board to the sensor, then a short pause, then the sensor's response, but the enable pin only goes low after the response has already started, so I miss the incoming data. I don't have the exact numbers available, though.

  • I did what I believe is the same thing, and I get timer_before = 119999972, timer_after = 118899120, for a difference of 1100852. I left the timer prescaler=1, so it was counting at 120MHz.  My calculator says 1100852/120MHz = 0.009174 seconds i.e. about 9.2ms.

    [Edit: On re-reading I realized that what I called timer_after corresponds with timer_int in your case.]

    My scope indicates a delay of about 100us between the final Tx rising edge (stop bits) and the falling edge of the enable. (I don't know whether EOT is defined to trigger after the final data bit or after the stop bit(s).)

    Just for clarity, here's what I was working with. Maybe you can see what's different.

    #define BMC     1
    #define TIMERS  1
    #include <stdio.h>
    #include <stdlib.h>
    #include "ti/devices/msp432e4/driverlib/driverlib.h"
    #include <ti/devices/msp432e4/driverlib/interrupt.h>
    #include <ti/devices/msp432e4/driverlib/uart.h>
    #include <ti/drivers/uart/UARTMSP432E4.h>
    #include <ti/devices/msp432e4/driverlib/gpio.h>
    #include <ti/devices/msp432e4/driverlib/sysctl.h>
    #include <ti/devices/msp432e4/driverlib/pin_map.h>
    #if !BMC
    #include <ti/devices/msp432e4/inc/msp432e411y.h>
    #endif
    
    volatile unsigned char buffer[256] = { 0 };
    volatile unsigned int buffer_position = 0;
    #if BMC // BMC
    #define ENBIT GPIO_PIN_0    // No PH4 on E401Y
    #else
    #define ENBIT GPIO_PIN_4
    #endif
    #if TIMERS
    volatile uint32_t time_before, time_after;
    #endif
    /* Incoming byte at UART6 interrupt handler. */
    void UART6_IRQHandler(void) {
        uint32_t ui32Status;
    #if TIMERS
        time_after = TimerValueGet(TIMER0_BASE, TIMER_A);
    #endif // TIMERS
        // Get the interrupt status.
        ui32Status = UARTIntStatus(UART6_BASE, true);
    
        // Clear the asserted interrupts.
        UARTIntClear(UART6_BASE, ui32Status);
    
        if (ui32Status & UART_INT_TX) { // end-of-transmission interrupt
            GPIOPinWrite(GPIO_PORTH_BASE, ENBIT, 0x00);
        }
        while (UARTCharsAvail(UART6_BASE)) {
            buffer[buffer_position++] = UARTCharGet(UART6_BASE);
        }
    }
    
    int main(void) {
        uint32_t ui32SysClock;
        volatile uint32_t ui32Loop;
    
        // Run from the PLL at 120 MHz.
        ui32SysClock = SysCtlClockFreqSet((SYSCTL_XTAL_25MHZ | SYSCTL_OSC_MAIN | SYSCTL_USE_PLL | SYSCTL_CFG_VCO_480),
                                          120000000);
    
    #if BMC // BMC
        SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOP);
        while (!SysCtlPeripheralReady(SYSCTL_PERIPH_GPIOP)) {
        }
    #endif // BMC
        // Initialise UART 6.
        SysCtlPeripheralEnable(SYSCTL_PERIPH_UART6);
        IntMasterEnable();
    
        GPIOPinConfigure(GPIO_PP0_U6RX);
        GPIOPinConfigure(GPIO_PP1_U6TX);
        GPIOPinTypeUART(GPIO_PORTP_BASE, GPIO_PIN_0 | GPIO_PIN_1);
        UARTConfigSetExpClk(UART6_BASE, ui32SysClock, 9600,
                            (UART_CONFIG_WLEN_8 | UART_CONFIG_STOP_ONE | UART_CONFIG_PAR_EVEN));
        IntEnable(INT_UART6);
        UARTIntEnable(UART6_BASE, UART_INT_RX | UART_INT_TX);
        UARTTxIntModeSet(UART6_BASE, UART_TXINT_MODE_EOT);
    
        // Initialise GPIO pin.
        SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOH);
        while (!SysCtlPeripheralReady(SYSCTL_PERIPH_GPIOH)) {
        }
        GPIOPinTypeGPIOOutput(GPIO_PORTH_BASE, ENBIT);
        // Send message.
        uint8_t data[8] = { 0x01, 0x04, 0x00, 0x00, 0x00, 0x05, 0x30, 0x09 };
    #if TIMERS
        MAP_SysCtlPeripheralEnable(SYSCTL_PERIPH_TIMER0);
        MAP_TimerConfigure(TIMER0_BASE, TIMER_CFG_PERIODIC);
        MAP_TimerLoadSet(TIMER0_BASE, TIMER_A, ui32SysClock); // 1 second for no good reason
        MAP_TimerEnable(TIMER0_BASE, TIMER_A);
        time_before = TimerValueGet(TIMER0_BASE, TIMER_A);
    #endif // TIMERS
        GPIOPinWrite(GPIO_PORTH_BASE, ENBIT, 0xFF);
        int i;
        for (i = 0; i < 8; ++i) {
            UARTCharPutNonBlocking(UART6_BASE, data[i]);
        }
    
        while (1) {
        }
    }
    

    [Edit: Fixed typo.]

  • Hi, sorry for the late reply... again. Unfortunately, I don't see any differences apart from porting to your board. I'm thinking maybe this could be a hardware issue since it works as expected in your case.

  • My thought was that you could try this code on your board (re-define BMC=0) to see if you get the same results. If you do, that points one direction and if you don't that points in a different direction.

    I still haven't figured out how your original code succeeds without enabling Port P (SYSCTL_PERIPH_GPIOP). When I tried it the program crashed immediately in the subsequent GPIOPinConfigure(). (That isn't the symptom you described, but maybe there's a clue there.)

  • So, I think I've figured it out.

    You were right, my code shouldn't have been working without enabling Port P, and it wasn't. The code I initially posted wasn't quite complete; it was a cleaned-up version of the whole thing, because I didn't realize the issue could be anywhere other than in the interrupt setup. I did test what I posted, but only within my full project, so I didn't notice that on its own it would fail for a different reason. In reality I'm using a Kentec K350QVG display connected to the board, and Port P is enabled as part of the display initialization, so even though I never enabled it explicitly, it worked as expected.

    Another thing I originally had in the code was a printf() call, containing some text and a newline character, directly after sending the data. I'm not quite sure how exactly it affected the UART communication, but once I deleted the newline character the problem went away. The string no longer appears in the console, so I guess the output is only flushed on a newline, and that flush somehow interfered with the UART communication.

    So now it sends the data correctly and I get some data in return. I'm going to test it a bit more the next time I get my hands on a scope, but it seems to be working correctly now, apart from one thing: I'm supposed to receive 15 bytes of data but only get the first 8. Solid progress nonetheless.

    Anyway, sorry for the time you spent on this, I should've posted the whole thing in the first place. Thank you though, you've really helped me.

    By the way, if you have any idea why I'm only receiving 8 bytes out of 15, I'd appreciate a hint. I haven't looked into it much yet, but I'm a beginner so any advice is appreciated.

  • I'm glad you got it working.

    The code you posted only sends 8 bytes. How are the other 7 sent?

  • Yes, I'm sending 8, but the sensor should send back 15. Sorry if it was unclear.

  • The Rx interrupt triggers at the default FIFO level of 4/8 (8 of the 16 FIFO bytes) unless you set it otherwise, so I suspect the other 7 bytes are stuck in the FIFO. [Ref TRM (SLAU723A) Table 26-12 (RXIFLSEL).]

    Try setting the Rx FIFO trigger level lower, maybe something like:

    > UARTFIFOLevelSet(UART6_BASE, UART_FIFO_TX4_8, UART_FIFO_RX1_8); // Tx trigger 4/8=8 bytes, Rx trigger 1/8=2 bytes

    You'll probably still have to poll for the last (odd-numbered) byte, but this should demonstrate whether it's the FIFO you're fighting.

    [Edit: You might also get some use out of the Receive Time-Out feature. I haven't used it but it appears to trigger after ~3 bytes of idle time. [Ref TRM Sec 26.3.9].]
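    For what it's worth, wiring that in might look something like this (a sketch against the listing above, not tested on hardware; UART_INT_RT is the driverlib flag for the receive time-out interrupt):

```c
/* Enable the receive time-out interrupt alongside RX and TX. The
 * time-out fires after the RX line has been idle for 32 bit periods
 * while data below the FIFO trigger level is still waiting. */
UARTIntEnable(UART6_BASE, UART_INT_RX | UART_INT_RT | UART_INT_TX);

/* In UART6_IRQHandler(), drain the FIFO on either trigger: */
if (ui32Status & (UART_INT_RX | UART_INT_RT)) {
    while (UARTCharsAvail(UART6_BASE)) {
        buffer[buffer_position++] = (unsigned char)UARTCharGet(UART6_BASE);
    }
}
```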

  • Hi, I changed the FIFO trigger level, and it worked exactly as you said. The last byte was still stuck in FIFO, but luckily the receive time-out interrupt solved that for me.

    Moving on with my project, though, I realized I hadn't considered that the receive interrupt wouldn't trigger on every single byte. That's a problem for me, because I need to check the delay between incoming bytes for error checking (it can be 1.5 character times at most). My plan was to use a timer loaded with the 1.5-character time and reset it each time I received a byte.

    I solved that problem by disabling the FIFO completely, as the TRM Sec 26.3.8 states that with FIFO disabled, it just acts as a 1-byte holding register, so essentially a FIFO with a length of 1 byte. Now I get an interrupt on every single incoming byte.
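    In case it helps anyone else, the change boiled down to something like this (a sketch; as far as I can tell UARTConfigSetExpClk() ends by re-enabling the FIFOs, so the disable has to come after it):

```c
/* After UARTConfigSetExpClk(...), turn the FIFOs off so the UART
 * falls back to a 1-byte holding register and UART_INT_RX fires
 * for every received byte (TRM Sec 26.3.8). */
UARTFIFODisable(UART6_BASE);
UARTIntEnable(UART6_BASE, UART_INT_RX | UART_INT_TX);
```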

    With that, I believe that this thread can be closed. Thank you for all your help.