What I initially thought was a glitch in data collected through an ADS1278 ADC now, after much testing, appears to be a one-bit shift in the output data, and I cannot explain why.
Test setup: I am using the ADS1278EVM-PDK. The MMB0 is used only to power the ADC board; the data header is not connected between the boards. I am reading data over the SPI interface into a microcontroller on a different board. The ADC is using the on-board reference and an off-board clock, with the buffers bypassed. I am feeding it a single-ended input with AIN- tied to VREFP (though I have tried other input configurations, and that doesn't seem related). I am primarily using high-resolution mode but have tested all 4 modes and all 3 SPI formats. CLK and SCLK are not synchronous.
Failure condition: I am able to read data without any issues except under the following condition: the differential input voltage exceeds Vref/2 (codes 0x400000 to 0x7FFFFF) on the first enabled channel. It never happens when the differential input voltage is < Vref/2, and it only happens when this voltage appears on the first enabled channel for a given data frame (ch1 in TDM:fixed, the first enabled channel in TDM:dynamic, all channels in discrete format). This behavior is extremely consistent (it never happens without this condition). The frequency of bad samples seems to increase with sampling rate (from a few times per second to hundreds). I have tested many combinations of SPI clock and ADC clock speed, as well as all 4 ADC operating modes. The problem is present in all cases given the above condition, but seems more likely to occur at high sampling rates. I have been primarily testing with 6 or 12 MHz clocks.
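To show why Vref/2 corresponds to the 0x400000 code boundary, here is the ideal transfer function I am assuming (a sketch only: it assumes the nominal +VREF = positive full scale = 0x7FFFFF mapping, a 2.5 V on-board reference, and ignores offset/gain error; `ideal_code` is my own helper name):

```c
#include <stdint.h>

/* Ideal ADS1278 transfer function (sketch): +VREF maps to code
 * 0x7FFFFF, so the failing region starts where the differential
 * input crosses VREF/2 and the code crosses 0x400000 (i.e. the
 * MSB of a positive code becomes 1... wait, bit 22 sets; the
 * top half of the positive range). Real devices add offset and
 * gain error, so the boundary is approximate. */
static int32_t ideal_code(double vdiff, double vref)
{
    double code = (vdiff / vref) * (double)0x7FFFFF;
    if (code > (double)0x7FFFFF)  code = (double)0x7FFFFF;   /* clamp to +FS  */
    if (code < -(double)0x800000) code = -(double)0x800000;  /* clamp to -FS  */
    return (int32_t)code;
}
```

With a 2.5 V reference, an input below 1.25 V produces codes under 0x400000 (no errors in my testing), while inputs above it land in the failing 0x400000–0x7FFFFF range.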
Observed behavior: When the problem occurs, the entire contents of the data register appear to be shifted left one bit, with the exception of the first bit (the high bit of the first channel), which is always 0. All channels are shifted left one bit, but only the first has its top bit forced to 0, regardless of the "shifted" value.
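To make the pattern concrete, this is how I would model the corruption I am seeing (my interpretation only: `corrupt_frame` is my own illustration, and the assumption that each channel's LSB takes the MSB of the next channel, i.e. the whole frame shifts as one 192-bit word, is a guess on my part):

```c
#include <stdint.h>

#define NCH 24  /* bits per channel */
#define NCHANNELS 8

/* Model of the observed corruption: treat the 8 x 24-bit frame as
 * one 192-bit word, shift it left one bit (each channel's LSB is
 * filled from the next channel's MSB), then force the very first
 * bit (high bit of the first channel) to 0, as observed. */
static void corrupt_frame(const uint32_t in[NCHANNELS], uint32_t out[NCHANNELS])
{
    for (int i = 0; i < NCHANNELS; i++) {
        /* carry in the MSB of the following channel, 0 for the last */
        uint32_t carry = (i + 1 < NCHANNELS) ? ((in[i + 1] >> 23) & 1u) : 0u;
        out[i] = ((in[i] << 1) & 0xFFFFFFu) | carry;
    }
    out[0] &= 0x7FFFFFu;  /* high bit of first channel always reads 0 */
}
```

For example, a true first-channel code of 0x500000 (top bit set, i.e. input > Vref/2) would read back as 0x200000-ish: shifted left, with the top bit cleared.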
Conclusions: I have observed the SPI lines with a logic analyzer for a few of the bad samples, and the value on the lines matches what the microcontroller reads. I initially investigated this as an error in my implementation of the microcontroller SPI interface, but the behavior I've seen doesn't agree with that: it depends on the input voltage level and on which channel that voltage appears. A microcontroller reading a series of bytes over SPI should be indifferent to both.
If there is an ADC timing requirement I am violating, I have not been able to determine it. There is a 1 t_CLK minimum between DRDY falling and the first SCLK rising edge; my microcontroller averages around 2 us to respond and isn't capable of responding in < t_CLK (83 ns), so it can't be violating that. There is a note in the datasheet about keeping SCLK/CLK at a power-of-two ratio for "best performance" (though "best performance" doesn't seem to be defined). Changing that ratio doesn't seem to change the rate of errors. The clocks are also not synchronous; I didn't see that stated as a requirement anywhere, and I haven't been able to test it yet.
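The timing-margin arithmetic above can be sketched as follows (just the period calculation for my 6 and 12 MHz test clocks; `t_clk_ns` is my own helper):

```c
/* DRDY-to-first-SCLK margin: the datasheet requires >= 1 t_CLK
 * between DRDY falling and the first SCLK rising edge. At
 * f_CLK = 12 MHz, t_CLK is ~83 ns; my MCU takes ~2000 ns to
 * start clocking, so the minimum is met with roughly 24x margin
 * (and ~12x margin at 6 MHz, where t_CLK is ~167 ns). */
static double t_clk_ns(double f_clk_hz)
{
    return 1e9 / f_clk_hz;
}
```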
Any insight into what I am doing wrong here would be greatly appreciated.