Dear Sirs,
We are having an issue with our I2C implementation while reading data from an external ADC.
The sequence is as follows (a simplified code sketch follows the list):
1) Set I2C Slave Address
2) Set Mode to I2C Master
3) Set Direction to I2C Transmitter
4) Set Start
5) Send the Address
6) Check I2C_TX_INT in i2cREG1->STR
7) Read i2cREG1->DRR to clear the flag
8) Set Start
9) Set Direction to I2C Receiver
10) Receive 2 bytes from the ADC
11) Set Stop
12) Receive last byte (3rd)
13) Clear SCD
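For reference, here is a polling-based sketch of that sequence. The register accesses (i2cREG1->STR, i2cREG1->DRR) are the ones we actually use; the i2cSet*/i2cSendByte/i2cReceiveByte helper names follow the TI HALCoGen i2c.h. The i2cSetCount() calls, the slave address 0x48 and the 0x00 command byte are placeholders and not part of the numbered sequence; timeouts and error handling are omitted.

#include "i2c.h"   /* HALCoGen I2C driver header (i2cREG1, I2C_* status flags) */

#define ADC_I2C_ADDR 0x48U   /* placeholder: 7-bit slave address of our ADC */

/* Polling-based sketch of the sequence above; timeouts/error handling omitted. */
static void adcRead3Bytes(uint8 data[3])
{
    /* 1) - 3): slave address, master mode, transmit direction */
    i2cSetSlaveAdd(i2cREG1, ADC_I2C_ADDR);
    i2cSetMode(i2cREG1, I2C_MASTER);
    i2cSetDirection(i2cREG1, I2C_TRANSMITTER);

    /* 4) - 5): start condition, then send the address byte */
    i2cSetCount(i2cREG1, 1U);                    /* assumption, see above */
    i2cSetStart(i2cREG1);
    i2cSendByte(i2cREG1, 0x00U);                 /* placeholder address/command byte */

    /* 6) - 7): wait for I2C_TX_INT in STR, then read DRR to clear the flag */
    while ((i2cREG1->STR & (uint32)I2C_TX_INT) == 0U)
    {
    }
    (void)i2cREG1->DRR;

    /* 8) - 9): repeated start, switch to receive direction */
    i2cSetCount(i2cREG1, 3U);                    /* assumption, see above */
    i2cSetStart(i2cREG1);
    i2cSetDirection(i2cREG1, I2C_RECEIVER);

    /* 10): receive the first two bytes from the ADC */
    data[0] = i2cReceiveByte(i2cREG1);
    data[1] = i2cReceiveByte(i2cREG1);

    /* 11) - 12): stop condition, then the last (3rd) byte */
    i2cSetStop(i2cREG1);
    data[2] = i2cReceiveByte(i2cREG1);

    /* 13): wait for the stop condition, then clear SCD */
    while ((i2cREG1->STR & (uint32)I2C_SCD_INT) == 0U)
    {
    }
    i2cClearSCD(i2cREG1);
}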
This algorithm works fine for many hours, but sometimes, in step 10, the Rx flag (cleared in step 7) is set before the actual byte has arrived at the I2C module, resulting in a reading of 0x00.
So instead of reading byte1-byte2-byte3 we read 0x00-byte1-byte2, which gives a bad ADC reading.
I've tried reordering steps 7)-9) so that the flag is cleared just before entering step 10), but the result is the same (see the sketch below).
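To illustrate, this is roughly the variant I tried (same assumptions and helper names as the sketch above):

/* Variant of steps 7) - 10): repeated start first, then clear any stale
 * Rx flag immediately before waiting for the first received byte. */
static uint8 adcReadFirstByte(void)
{
    i2cSetCount(i2cREG1, 3U);                    /* assumption, as above */
    i2cSetStart(i2cREG1);
    i2cSetDirection(i2cREG1, I2C_RECEIVER);

    (void)i2cREG1->DRR;                          /* step 7 moved here: clear the Rx flag */

    while ((i2cREG1->STR & (uint32)I2C_RX_INT) == 0U)
    {
    }
    return (uint8)i2cREG1->DRR;                  /* still occasionally returns 0x00 */
}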
So it seems like the I2C module is setting the Rx flag erroneously.
Any clue on why this may happen and how to solve it?
Thank you