Other Parts Discussed in Thread: C2000WARE
Tool/software:
Hello,
I'm stuck on an issue with the F28377S where my ADC readings appear to have lost the bottom 3 bits of resolution. Every value resolves to 0x___8 or 0x___0, and when I sweep the input voltage to the circuit, the result only steps up or down in increments of 16 or 32 (e.g., 0x0118 -> 0x0128 -> 0x0148).
The ADC is configured in 12-bit, single-ended signal mode, and I used the C2000Ware function to set it.
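For reference, my setup is essentially the stock C2000Ware helper call from the F2837xS support files (this is a trimmed sketch, not my exact project code, so treat the surrounding register writes as illustrative):

```c
#include "F28x_Project.h"   // C2000Ware F2837xS device support

// Sketch of my ADC-A configuration, patterned on the C2000Ware examples.
void ConfigureAdcA(void)
{
    EALLOW;
    AdcaRegs.ADCCTL2.bit.PRESCALE = 6;    // ADCCLK = SYSCLK / 4

    // C2000Ware helper: sets resolution/signal mode and loads trims
    AdcSetMode(ADC_ADCA, ADC_RESOLUTION_12BIT, ADC_SIGNALMODE_SINGLE);

    AdcaRegs.ADCCTL1.bit.INTPULSEPOS = 1; // interrupt pulse at end of conversion
    AdcaRegs.ADCCTL1.bit.ADCPWDNZ = 1;    // power up the ADC
    EDIS;

    DELAY_US(1000);                       // allow the ADC time to power up
}
```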
I have tried slowing down the ADCCLK and maximizing the acquisition window for each sample, but there has been no change. If this were a matter of settling time, I would still expect at least some noise in the lower bits, even if the readings were inaccurate.
Additionally, the readout has been "flickering" between significantly different values: for instance, with the input signal in a steady state, the ADC result jumps between 0x0A58 and 0x0A78 without apparently landing on any value in between.
In some ways it acts as if the value were left-shifted in the result register, but the magnitudes are close to what I expect - just missing precision - so a shifted result doesn't seem to fit either.
I thought at first this was an artifact of watching the registers in Code Composer Studio, but I copied the results into debug variables and calculated the deltas between readings in software, with the same outcome.
I do not have any of the ADC post-processing blocks (PPBs) active.
Are there any configuration options I've missed that would explain this truncation? I've read that accuracy can be lost to an improperly designed input circuit, but in that case I would again expect some noise in the lower bits rather than bits stuck at zero.
Any other suggestions are also appreciated.