MSP432E401Y: I2C transmit to TMP117 under uDMA control

Part Number: MSP432E401Y
Other Parts Discussed in Thread: TMP117

This is a continuation of a previous thread that was put on the back burner nine months ago; I'm finally getting around to looking at it again.  My system uses an MSP432E401Y to control and read data from a number of different devices over SPI and UART interfaces.  I currently read from the SPI devices under CPU control, but the UART is slow and I can't afford to stall the CPU during the transfer, so I use the uDMA to move the data from the sensor into internal memory.

I need to add a temperature sensor to the system and I'm considering the TMP117, which has an I2C interface.  I've purchased the TMP117 EVM and have blue-wired it onto my MSP432 board for testing and code development.  The I2C bus is also too slow to perform the transfers via the CPU, so I'd like to use the uDMA here as well.  I've created a very simple project with the goal of taking baby steps toward a fully functioning solution.  Here is the code for the main() routine:

#include <ti/devices/msp432e4/driverlib/driverlib.h>
#include "common.h"

void main(void)
{
    uint32_t g_ui32SysClock;                   // The system clock frequency
    int32_t status;

    tempReady = false;

    g_ui32SysClock = init_CLK();               // configure system clock
    init_GPIO();                               // configure the device pins
    int_disable();

    status = init_temp_I2C(g_ui32SysClock);    // configure temperature sensor I2C interface
    status = init_temperature();               // initialize temperature driver parameters
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX

My first step was to write configuration and read/write functions using the CPU.  First the init_temp_I2C() function enables the I2C peripheral, then init_temperature() configures the TMP117 and starts the data capture.  When the TEMP_DRDY interrupt fires, the tempReadyFxn() ISR reads the data.  Finally the tempCallBack() callback sets the tempReady semaphore when the I2C transfer is complete.  Using this code I can successfully capture data from the TMP117.
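
For context, here is a minimal sketch of the kind of setup init_temp_I2C() performs (simplified; the real function is in the attached project, and I'm assuming the I2C5 SCL/SDA pin muxing is handled in init_GPIO()):

int32_t init_temp_I2C(uint32_t sysClock)
{
    /* Enable the I2C5 master peripheral and wait for it to be ready
     * (the I2C5 SCL/SDA pin muxing is done in init_GPIO()) */
    MAP_SysCtlPeripheralEnable(SYSCTL_PERIPH_I2C5);
    while(!(MAP_SysCtlPeripheralReady(SYSCTL_PERIPH_I2C5))){}

    /* Initialize the master clock; false selects 100 kbps standard mode */
    MAP_I2CMasterInitExpClk(I2C5_BASE, sysClock, false);

    /* All transactions target the TMP117 (7-bit address 0x48, ADD0 tied to GND) */
    MAP_I2CMasterSlaveAddrSet(I2C5_BASE, 0x48, false);

    return 0;
}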

My next step is to use the uDMA to capture the data after the I2C, TMP117 and uDMA are initialized by the CPU. I have used the i2c_master_dma_fifo and i2c_mastermode_fifodma_transfer examples from the Resource Explorer as guides.

Here is a diagram explaining the data transfer protocol for the TMP117:

The grey boxes show when the master (MSP432) drives the SDA line and the white boxes when the TMP117 drives the line.

My data transfer is simpler than what is shown because I am always reading from the same register address, so I don't need to send the register address with every transfer.  Instead the MSP432 only needs to send a single byte over SDA (the slave address with the read bit) and the TMP117 replies with two bytes.  Because so little data is transferred I don't think I need to use the FIFO.  The TMP117 is sampling at a 500 ms rate, but the MSP432 is only recording temperature data at a 1 s rate, so I don't mind if I overwrite data that hasn't been recorded yet.
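
For reference, the CPU-driven version of that two-byte read looks roughly like this (a sketch; my working code is interrupt-driven through tempReadyFxn() and tempCallBack(), so the polling here is only for clarity):

/* Read the 16-bit TEMP_RESULT register.  The TMP117 pointer register is
 * already left pointing at 0x00, so only the read address byte is sent. */
uint16_t read_temp_result(void)
{
    uint8_t msb, lsb;

    MAP_I2CMasterSlaveAddrSet(I2C5_BASE, 0x48, true);   /* true = receive */

    /* START + slave address (R), then clock in the MSB with ACK */
    MAP_I2CMasterControl(I2C5_BASE, I2C_MASTER_CMD_BURST_RECEIVE_START);
    while(MAP_I2CMasterBusy(I2C5_BASE)){}
    msb = (uint8_t)MAP_I2CMasterDataGet(I2C5_BASE);

    /* Clock in the LSB with NACK, then STOP */
    MAP_I2CMasterControl(I2C5_BASE, I2C_MASTER_CMD_BURST_RECEIVE_FINISH);
    while(MAP_I2CMasterBusy(I2C5_BASE)){}
    lsb = (uint8_t)MAP_I2CMasterDataGet(I2C5_BASE);

    return ((uint16_t)msb << 8) | lsb;
}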

The attached project file shows the current state of my testing.  In this very simple code I've used init_temp_I2C() and init_temperature() to initialize the I2C peripheral and the TMP117.  Next I attempt to enable and configure the uDMA peripheral and send a single byte out the I2C bus.  The intent of the function ConfigureuDMATX() is to map uDMA channel 23 to I2C5 and initialize a single-byte transfer from the memory pointed to by &sendMasterTxData to the I2C5 master data register, which I then expect to transmit on I2C5SDA.

void ConfigureuDMATX(void)
{
    MAP_SysCtlPeripheralEnable(SYSCTL_PERIPH_UDMA);
    while(!(MAP_SysCtlPeripheralReady(SYSCTL_PERIPH_UDMA))){}

    MAP_uDMAEnable();
    MAP_uDMAControlBaseSet(pui8ControlTable);

    MAP_uDMAChannelAssign(UDMA_CH23_I2C5TX);
    MAP_uDMAChannelAttributeDisable(UDMA_CH23_I2C5TX, UDMA_ATTR_USEBURST |
                                                      UDMA_ATTR_ALTSELECT |
                                                      UDMA_ATTR_HIGH_PRIORITY |
                                                      UDMA_ATTR_REQMASK);
    MAP_uDMAChannelControlSet(UDMA_CH23_I2C5TX | UDMA_PRI_SELECT,
                              UDMA_SIZE_8 | UDMA_SRC_INC_NONE | UDMA_DST_INC_NONE |
                              UDMA_ARB_1);
    MAP_uDMAChannelTransferSet(UDMA_CH23_I2C5TX | UDMA_PRI_SELECT,
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX

Finally I call MAP_uDMAChannelRequest(UDMA_CH23_I2C5TX); to execute the transfer.

Unfortunately I don't see the transfer on the I2C bus after MAP_uDMAChannelRequest() is executed.

I would appreciate help getting this to work.  I don't know if I've incorrectly configured the uDMA or if I've missed some critical initialization code.

Thanks,

-phil

i2c_dma_test.zip

  • Please see the following sections in the MSP432E4 TRM.

    • 19.3.5
    • 19.3.5.2
    • 19.3.6.2

    I recommend you also take a look at the "i2c_mastermode_fifodma_transfer" example found in the MSP432E4 SDK.

    What about the MAP_I2CTxFIFOConfigSet() and MAP_I2CMasterIntEnableEx() functions?

    My advice is that you first get the example code working on your device with I2C1, then change the I2C channel and DMA channel.
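
    For reference, the FIFO/DMA hookup in that example looks roughly like the following, adapted to I2C5 (a sketch from memory; treat the trigger levels as assumptions):

    /* Route the I2C5 master TX/RX FIFOs to the uDMA and enable the
     * DMA-done interrupts so software knows when a transfer completes */
    MAP_I2CTxFIFOConfigSet(I2C5_BASE, I2C_FIFO_CFG_TX_MASTER_DMA | I2C_FIFO_CFG_TX_TRIG_1);
    MAP_I2CRxFIFOConfigSet(I2C5_BASE, I2C_FIFO_CFG_RX_MASTER_DMA | I2C_FIFO_CFG_RX_TRIG_1);
    MAP_I2CMasterIntEnableEx(I2C5_BASE, I2C_MASTER_INT_TX_DMA_DONE |
                                        I2C_MASTER_INT_RX_DMA_DONE);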

  • Eason,

    As I stated in my question, i2c_mastermode_fifodma_transfer is one of the two examples I am referring to for guidance; please see my fourth paragraph above.  As I also stated, I'm not using the FIFO.  My I2C interface is already working, as I describe above.  My board does not have the I2C1 pins available and I only have a single board, so I cannot run the example code.

  • Yes, I know. I checked your code setup against the example code; the only difference I can observe lies in the FIFO setting.

    Since you already have I2C working and you can't enable I2C1, can you first add the FIFO setting to your code on I2C5 to see if it helps? That way you only need small changes relative to our example code.

  • Thanks, I will try using the xmit FIFO and report back.

  • I'll wait for your good news.

  • I've started from the beginning with the i2c_mastermode_fifodma_transfer example rather than trying to make my own custom routines.  It's looking promising but it only works if I set a breakpoint while running the DMA RX functions.  If I let it run without breakpoints then the I2C RX transfers get truncated.

    I'm attempting to simplify and clean up the code, and then I'll report back here.

  • Eason,

    I've made some hopeful progress.  I started with the i2c_mastermode_fifodma_transfer.c file and modified it to meet my needs.  The biggest change was to remove the FSM to make the code flow easier to follow.

    To capture data from the TMP117 I first have to send a configuration command to register address 0x01.  Then I read data from address 0x00.  Finally, when the TMP117 generates a data ready interrupt, I need to do a truncated I2C read from address 0x00: one which only sends the read slave address, after which the TMP117 responds with two bytes.

    In my modified code (attached) I first set up the I2C and uDMA for master transmit. A function, send_config(), then uses an I2C master FIFO burst send to write the configuration word to address 0x01 (CONFIGURATION), followed by an I2C master FIFO single receive from address 0x00 (TEMP_RESULT) to read back the first temperature data word.  That data word is stored in getMasterRxData[] as expected.
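
    Stripped of error handling, the transmit half of send_config() looks roughly like this (a sketch of the attached code; configWord stands in for my actual configuration value, and loading the FIFO from the CPU rather than from the TX uDMA channel is a simplification):

    /* Write a 16-bit value to the TMP117 CONFIGURATION register (0x01)
     * through the I2C5 master TX FIFO */
    MAP_I2CMasterSlaveAddrSet(I2C5_BASE, SLAVE_ADDRESS, false);     /* false = transmit */
    MAP_I2CFIFODataPut(I2C5_BASE, CONFIGURATION);                   /* register pointer */
    MAP_I2CFIFODataPut(I2C5_BASE, (uint8_t)(configWord >> 8));      /* data MSB */
    MAP_I2CFIFODataPut(I2C5_BASE, (uint8_t)(configWord & 0xFF));    /* data LSB */
    MAP_I2CMasterBurstLengthSet(I2C5_BASE, 3);
    MAP_I2CMasterControl(I2C5_BASE, I2C_MASTER_CMD_FIFO_SINGLE_SEND);
    while(MAP_I2CMasterBusy(I2C5_BASE)){}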

    After send_config() completes I enable the GPIO P3 interrupt, which is the TMP117 data ready input.  In the GPIOP3_IRQHandler() ISR I execute a simple I2C Master FIFO single receive to read data from the TMP117.
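
    Roughly, the ISR just clears the pin interrupt and kicks off the two-byte FIFO read, with the already-configured RX uDMA channel expected to drain the RX FIFO into getMasterRxData[] (a simplified sketch of the attached code):

    void GPIOP3_IRQHandler(void)
    {
        /* Clear the TMP117 data-ready interrupt on port P, pin 3 */
        MAP_GPIOIntClear(GPIO_PORTP_BASE, GPIO_INT_PIN_3);

        /* Start a two-byte FIFO read of TEMP_RESULT from the TMP117 */
        MAP_I2CMasterSlaveAddrSet(I2C5_BASE, SLAVE_ADDRESS, true);
        MAP_I2CMasterBurstLengthSet(I2C5_BASE, setMasterBytesLength);
        MAP_I2CMasterControl(I2C5_BASE, I2C_MASTER_CMD_FIFO_SINGLE_RECEIVE);
    }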

    Using a logic analyzer I can see the expected activity on the I2C bus.  The image below shows what happens.  The activity on SCL and SDA before T=0 is from send_config().  The subsequent SCL and SDA activity is from the execution of GPIOP3_IRQHandler().  The GPIO P3 interrupts happen at 500 ms intervals, as expected, and the MSP432 processes the first four correctly.

    The fifth GPIO P3 interrupt is not completely executed.  The reason is that the I2C RX FIFO (eight entries deep) fills up after the fourth interrupt, so when the I2C reads the ninth data byte it has no room to put it and execution stalls.

    Here is a capture of the send_config() activity; the data 0x0BE7 is stored in getMasterRxData[]:

    And here is the first P3 interrupt activity:

    And here is the truncated fifth P3 interrupt activity:

    My question is, why does the DMA stop moving data from the I2C RX FIFO to getMasterRxData[]?  I think that's the final piece of the puzzle I need to get this working.

     

    i2c_mastermode_fifodma_transfer.c
    /* DriverLib Includes */
    #include <ti/devices/msp432e4/driverlib/driverlib.h>

    /* Standard Includes */

    /* Defines for I2C bus parameters */
    #define SLAVE_ADDRESS   0x48
    #define I2C_NUM_DATA    2
    #define CONFIGURATION   0x01
    #define TEMP_RESULT     0x00

    /* Variables for I2C data and state machine */
    volatile uint8_t sendMasterTxData[I2C_NUM_DATA];
    uint8_t getMasterRxData[I2C_NUM_DATA] = {0};
    const uint8_t setMasterBytesLength = I2C_NUM_DATA;

    /* The control table used by the uDMA controller. This table must be aligned
     * to a 1024 byte boundary. */
    #if defined(__ICCARM__)
    XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX

    I've answered my own question.  I didn't realize I needed to rewrite the uDMA channel control word after every transfer.  The overhead of that operation was unexpected.  I need to minimize the load of the I2C transfer on the CPU, and it looks like I can use the I2C receive FIFO to hold data until I can service it.  The FIFO service code takes less time than rewriting the uDMA control word, so I've decided to stop using the uDMA in favor of the FIFO.
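
    For anyone following along, "rewriting the control word" means re-issuing the channel transfer setup before every read, because in BASIC mode the uDMA channel stops once its transfer completes. Roughly (a sketch; the UDMA_CH22_I2C5RX channel and the I2C_O_FIFODATA offset are my reading of the headers, so verify them against udma.h and hw_i2c.h):

    /* Re-arm the I2C5 RX channel for the next two-byte transfer, then
     * re-enable it.  The source is the I2C5 FIFO data register. */
    MAP_uDMAChannelTransferSet(UDMA_CH22_I2C5RX | UDMA_PRI_SELECT,
                               UDMA_MODE_BASIC,
                               (void *)(I2C5_BASE + I2C_O_FIFODATA),
                               (void *)getMasterRxData,
                               setMasterBytesLength);
    MAP_uDMAChannelEnable(UDMA_CH22_I2C5RX);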

    I'm going to close this ticket as a result.