I2C loop. I want to read one byte from a slave, but I understand I have to read a minimum of 2.
while (1) { i2cread(0); }
My I2C is working after I added a delay after sending the restart to the slave, but why is that delay required? Before I added it, I would miss every other byte; with the delay in place, I catch all the data.
void i2cread(char registry)                 // nothing was returned, so changed to void
{
    while (UCB0CTL1 & UCTXSTP);             // wait for any pending stop to finish
    UCB0CTLW0 |= UCTR + UCTXSTT;            // I2C TX, start condition
    while ((UCB0IFG & UCTXIFG0) == 0);
    UCB0TXBUF = registry;
    while (UCB0CTL1 & UCTXSTT);
    UCB0CTLW0 &= ~UCTR;                     // receiver mode
    UCB0CTLW0 |= UCTXSTT;                   // send restart
    __delay_cycles(100000);                 // without a delay here, I only pick up every other byte
    while (UCB0CTL1 & UCTXSTT);
    UCB0CTL1 |= UCTXSTP;
}

///////////// Interrupts //////////////////////////
#pragma vector = USCI_B0_VECTOR
__interrupt void USCIB0_ISR(void)
{
    switch (__even_in_range(UCB0IV, 0x1E))
    {
        case 0x04:                          // NACKIFG
            UCB0CTL1 |= UCTXSTP;
            break;
        case 0x16:                          // RXIFG0
            P3OUT ^= BIT0;
            *bufferptr = UCB0RXBUF;
            bufferptr++;
            break;
        default:
            break;
    }
}
Also, this seems odd: I was probing the SDA line along with my RX interrupt (P3.0 toggled on every RX interrupt). My MSP430 keeps entering the ISR, which tells me UCB0RXIFG never stays cleared until the stop condition is sent. See below for a better explanation: the top trace is SDA, the bottom trace is P3.0 (toggled on every RX interrupt). After the start and restart, I keep re-entering my ISR during that long delay I added. The interrupts stop only after I send the stop condition.
Please help me clear this up. I have it working, but I don't fully understand the problem and would like to know why it works the way it does.