Other Parts Discussed in Thread: LM3S8971, EK-TM4C1294XL, TIDA-00909
It is taking an unreasonable 400us between ADC conversion cycles on 3 specific AINx channels, required for CADC to discharge between samples. The CADC charge voltage will not fall below some arbitrary threshold even though the AINx input level drops well below 1/4 VREFP amplitude (<80mV). Hence a 400us interval between triggered conversions is required for CADC to slowly discharge back to that first-established threshold, which is still not correct. The OPAMP output is considered settled to 0.5% of final value in 9.6us, so it is the ADC that is misbehaving in these conversions.
The OPAMPs are coupled through 4.87k series resistance into single-ended AINx inputs and present <1nF load to each AINx input. Seemingly the OPAMP should easily discharge CADC during push/pull cycles, yet it does not. What is going wrong such that CADC remains charged to an arbitrary threshold, yet the FIFOs return 0V results after sampling stops? The conversion interrupt occurs for the 3 AINx channels only at 400us trigger intervals via GPTM one-shot timing. Seemingly plenty of time exists for CADC to discharge after the AINx voltages fall below 1/4 VREFP. The application even forces the sample variable to 0x0 after each MIN/MAX evaluation in every interrupt cycle. The variables and FIFO values are being drained and tested for under/overflow conditions. The application's handling of FIFO results is not the problem; the hardware is misbehaving around CADC for whatever reason.
Why does CADC require so much time to discharge, and what can be done in the ADC configuration to correct this behavior? Setting the sample-and-hold width (NSH) to 0x4 or any other value has no effect on reducing the arbitrary CADC threshold, which obviously floats the FIFO value higher than the AINx input voltage in this case.
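For reference, this is how the per-step sample-and-hold width is being set through TivaWare's `ADCSequenceStepConfigure()` with the `ADC_CTL_SHOLD_x` flags rather than writing the raw register field. A minimal sketch; ADC0, sequencer 0, channel AIN0 and the timer trigger here are placeholders, not the project's actual assignments:

```c
/* Sketch only: lengthen the sample-and-hold window on one step.
 * ADC0 / SS0 / CH0 / timer trigger are illustrative placeholders. */
#include <stdint.h>
#include <stdbool.h>
#include "inc/hw_memmap.h"
#include "driverlib/adc.h"
#include "driverlib/sysctl.h"

void adc_setup(void)
{
    SysCtlPeripheralEnable(SYSCTL_PERIPH_ADC0);

    /* Timer-triggered sequencer, highest priority. */
    ADCSequenceConfigure(ADC0_BASE, 0, ADC_TRIGGER_TIMER, 0);

    /* ADC_CTL_SHOLD_16 stretches sampling to 16 ADC clocks, giving the
     * source more time to charge/discharge CADC each conversion. */
    ADCSequenceStepConfigure(ADC0_BASE, 0, 0,
                             ADC_CTL_CH0 | ADC_CTL_SHOLD_16 |
                             ADC_CTL_IE | ADC_CTL_END);

    ADCSequenceEnable(ADC0_BASE, 0);
}
```

Even with the hold width stretched this way, the observed CADC threshold behavior does not change.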
Scope captures show CADC remaining charged, only settling after >400us: