Hi All,
I know that the topic of I2C for the MSP430 has been discussed a few times, but I would like to ask a short and straight question about it. I started an example project in CCS v5 for I2C receiving. The question is: why, when the RX interrupt is triggered for the first time, is the transmission already at the stage of receiving the 6th bit of the 2nd byte? I would like to have this interrupt after each byte received and then decide whether or not to send the ACK signal. Maybe the example project doesn't show how to do it? How can I control the receive transaction on the I2C bus exactly?
Damian Gowor said: "why when the RX interrupt is triggered first time the transmission is at the stage of receiving 6th bit in 2nd byte?"

Likely it is not when it is triggered, but when you handle it.
When a byte is received, the USCI will immediately set the RXIFG bit, which triggers an interrupt. However, the USCI continues to receive without delay. It then stops receiving at the 7th bit of the next byte if the previous one hasn't been read from RXBUF yet.
So what you observe is likely an artifact of how you observe it: when you hit the breakpoint at the beginning of your ISR, code execution stops (at that moment you are probably somewhere in the ACK cycle of the first byte), but the USCI continues to receive the first bits of the next byte before it stops. If you toggle a port pin at the start of your ISR and look at it with the same scope you use to capture the I2C signals, you'll see that the interrupt happens much earlier, around the ACK bit or the first bit of the next byte (depending on I2C clock speed, MSP clock speed and ISR latency).
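To illustrate the point about reading RXBUF promptly, here is a minimal sketch of such an RX handler. It is only an illustration under assumptions: a 2xx-family device (e.g. MSP430G2553) with USCI_B0 configured as I2C master receiver; on 5xx/6xx parts the flag and vector names differ (UCB0IFG/UCB0IV).

#include <msp430.h>

volatile unsigned char rx_data;

// On 2xx parts the USCI_B0 I2C data interrupts (UCB0RXIFG/UCB0TXIFG)
// are serviced through the shared USCIAB0TX vector.
#pragma vector = USCIAB0TX_VECTOR
__interrupt void USCIAB0TX_ISR(void)
{
    if (IFG2 & UCB0RXIFG)
    {
        rx_data = UCB0RXBUF;   // reading RXBUF clears UCB0RXIFG and releases SCL
                               // if the USCI was already holding the bus
    }
}

By the time this ISR runs, the USCI has typically already started clocking in the following byte, which is exactly the behavior described above.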
Thanks for the answer, but I think we are missing the important point here. It is not really important where the breakpoint stops relative to the ACK bit of the transmission. The important thing is that the peripheral started to clock the line and receive the next byte before you did anything inside the interrupt routine. If I want to get only 1 byte from an I2C device, how can I do it? The I2C device I am reading from increments its internal memory pointer on each byte read. So in this case, when you want to read one byte, the MSP430 forces a read of the second byte as well and increments the I2C device's memory pointer. Is there any way to stop the I2C clocking right after receiving 1 byte, so that the second byte is not read, and to send the STOP signal immediately?
Damian Gowor said: "The important thing is that the peripheral started to clock the line and receive next byte before you did something inside the interrupt routine."

Yes, that's the meaning of double-buffering: not waiting until the first event has been handled, so handling of an event and performing the next operation can run concurrently.
And you're right: it is impossible to do interrupt-driven reception of one byte only. You'll have to use busy-waiting instead (see the sketch at the end of this post):
Wait until UCTXSTT clears (with interrupts disabled, so you won't miss the moment), then immediately set UCTXSTP, and you'll only receive one byte.
There is no interrupt trigger for "start byte sent" that would let you set the stop bit during reception of the first byte. And it couldn't work reliably anyway: even if you got such an interrupt flag fast enough, the CPU could have interrupts temporarily blocked or be executing a different ISR, so you'd again be too late.
There would have to be a 'receive one byte only' flag in the USCI config to handle this case reliably and without busy-waiting.
The old USART did have this. But then the usage of I2C was so complex for things like "write address, then read register" that it was barely usable at all, except for large block transfers.
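A minimal sketch of the busy-waiting approach described above (my own illustration, not from a TI example). It assumes USCI_B0 on a 2xx-family device, already configured as I2C master with the slave address in UCB0I2CSA; register and flag names differ slightly on other families, and NACK/error handling is omitted.

#include <msp430.h>

unsigned char i2c_receive_single_byte(void)
{
    unsigned char data;

    __disable_interrupt();           // nothing must delay the STT -> STP sequence

    UCB0CTL1 &= ~UCTR;               // receiver mode
    UCB0CTL1 |= UCTXSTT;             // generate START, send slave address

    while (UCB0CTL1 & UCTXSTT);      // busy-wait until the address has been sent
    UCB0CTL1 |= UCTXSTP;             // request STOP right away -> only one byte is clocked in

    while (!(IFG2 & UCB0RXIFG));     // wait for the single data byte
    data = UCB0RXBUF;                // reading RXBUF clears the flag

    while (UCB0CTL1 & UCTXSTP);      // wait until the STOP condition is on the bus

    __enable_interrupt();
    return data;
}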
Thanks for the answer.
I have come across a few microcontrollers in my lifetime and this is the strangest I2C implementation I have seen. Unfortunately, I don't see any benefits in it and I am really disappointed with how TI solved it. In my opinion the best I2C implementation was done in Atmel's AVRs, where you can do everything, and fast. But anyway, thanks.
I used (and still use) the AVRs too, including I2C. I agree that the AVR I2C is easy to use. However, the MSP approach has some advantages when you look at it from the 'low power' point of view: the double-buffering allows maximum throughput even with a slow CPU clock. Without this mechanism you'd get "transfer - CPU action - transfer - CPU action" and so on; the way the USCI does it (not perfectly, agreed) allows both to happen simultaneously. Even though this has most benefits for UART or SPI operation, the I2C mode fits into the already existing hardware concept and structure. If only there were a 'maximum bytes to receive' counter that automatically generated the stop condition (0 = plain software control).
Hi Jens,
Thank you for your answer.
What if the transmission itself takes more current than the processor core? That is the situation I have right now: the I2C device draws more power when operating than the MSP430 does. I am trying to put it into sleep mode, but because of the way the MSP430's I2C works, the transmission takes more time. I have to change addresses when writing to it, and I cannot write the sequence without reading a different register. So in the end I have to read 2 extra bytes to achieve what I want, and the time for reading those two bytes multiplied by the current consumption of the I2C device is much higher than the MSP430's current consumption multiplied by all the time it spends in the interrupt routine. I hope I have explained it clearly.
I agree that in some designs it will take less power, but that is not the case in my situation.
Damian Gowor said: "What if the transmission itself takes more current than processors core?"

In the case of I2C, it usually does, as the bus is open-drain with pull-up resistors, so significant current flows whenever somebody pulls a line low. However, reducing the smaller part of the current consumption is still better than having both currents. The effect is just not that large.
Damian Gowor said: "Hope I have explained it clearly."

You did. Well, there is no 'one size fits all' solution.
You can still do a busy-waiting transfer and just read the one byte you need. You wait for the STT bit to clear and then set stop immediately, causing only one byte to be received. When MCLK is slow (e.g. 1 MHz with a 100 kHz I2C clock), it's only 10 MCLK cycles per I2C bit: roughly 100 cycles for sending the start byte and another 90 for receiving the data byte. Considering the interrupt entry and exit time, the DCO wakeup, etc., you won't waste much current on busy-waiting, while saving a lot of current by keeping the bus activity short. A sketch of such a transfer follows below.
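For illustration, a rough sketch of such a busy-waiting transfer for the use case above (write the register pointer, then a repeated start and a single-byte read). It is only a sketch under assumptions: a 2xx-family USCI_B0 already configured as I2C master, slave address in UCB0I2CSA, no NACK/error handling; the approximate cycle counts in the comments assume 1 MHz MCLK and a 100 kHz I2C clock.

#include <msp430.h>

unsigned char i2c_read_register(unsigned char reg)
{
    unsigned char data;

    UCB0CTL1 |= UCTR | UCTXSTT;      // transmitter mode, generate START + slave address
    while (!(IFG2 & UCB0TXIFG));     // wait until TXBUF is ready for the register pointer
    UCB0TXBUF = reg;                 // send the register pointer byte
    while (!(IFG2 & UCB0TXIFG));     // wait until it has moved to the shift register

    UCB0CTL1 &= ~UCTR;               // switch to receiver mode
    UCB0CTL1 |= UCTXSTT;             // repeated START (~100 MCLK cycles for start + address)
    while (UCB0CTL1 & UCTXSTT);      // busy-wait until the address has been sent
    UCB0CTL1 |= UCTXSTP;             // STOP right after the first data byte (~90 cycles per byte)

    while (!(IFG2 & UCB0RXIFG));     // wait for the single data byte
    data = UCB0RXBUF;

    while (UCB0CTL1 & UCTXSTP);      // wait until STOP has been sent
    return data;
}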