TM4C123GH6PM: ADC Questions

Part Number: TM4C123GH6PM

I have some general questions about ADCs in TM4C. 

1. What are the differences between single-ended and differential input configurations?

2. What exactly are the sample sequencers and step sizes? And how are they related to the hardware oversampling?

From my understanding, the sample sequencers capture data, and the maximum number of samples each sequencer can capture is different, i.e. SS3 = 1, SS2 = SS1 = 4, and SS0 = 8 samples. Also, for example, if hardware oversampling is set to x4, would SS0 capture 32 samples in total (8 samples times 4)?

3. The datasheet says "Most of the ADC control logic runs at the ADC clock rate of 16 MHz. The internal ADC divider is configured for 16-MHz operation automatically by hardware when the system XTAL is selected with the PLL." Does this mean that if I select a system clock frequency of 80 MHz, the ADC clock rate will automatically be scaled down to 16 MHz?

  • Never mind on the first question. 

  • A follow-up question: What's the point of having 12 input channels when there are only 8 sample sequencers in total? Is it so that we have more pin options for ADC inputs?

  • for example, if hardware oversampling is set to x4, would SS0 capture 32 samples in total (8 samples times 4)?

    Let's say SS0 is configured with two steps to sample AIN0 and AIN1, and the other six steps are not used. With 4x oversampling, AIN0 is sampled four times and the average of those four samples is stored in the FIFO. Next, AIN1 is sampled four times and the average of those four samples is stored in the next FIFO location. Since only two steps are configured for SS0, an interrupt is generated once the averaging for AIN1 finishes. Hope this is clear.
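
    If it helps, here is a minimal sketch of that two-step configuration using the TivaWare DriverLib API. It assumes ADC0, a processor (software) trigger, and the AIN0/PE3 and AIN1/PE2 pin mapping of the TM4C123GH6PM; adapt the trigger source and pins to your application.

    #include <stdint.h>
    #include <stdbool.h>
    #include "inc/hw_memmap.h"
    #include "driverlib/sysctl.h"
    #include "driverlib/gpio.h"
    #include "driverlib/adc.h"

    void ADC0_SS0_Init(void)
    {
        SysCtlPeripheralEnable(SYSCTL_PERIPH_ADC0);
        SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOE);
        while (!SysCtlPeripheralReady(SYSCTL_PERIPH_ADC0))
        {
        }

        /* AIN0 = PE3, AIN1 = PE2 on the TM4C123GH6PM */
        GPIOPinTypeADC(GPIO_PORTE_BASE, GPIO_PIN_3 | GPIO_PIN_2);

        /* Each step result is the average of 4 conversions */
        ADCHardwareOversampleConfigure(ADC0_BASE, 4);

        /* SS0, software trigger, highest priority */
        ADCSequenceConfigure(ADC0_BASE, 0, ADC_TRIGGER_PROCESSOR, 0);

        /* Only two of the eight SS0 steps are used; step 1 ends the sequence */
        ADCSequenceStepConfigure(ADC0_BASE, 0, 0, ADC_CTL_CH0);
        ADCSequenceStepConfigure(ADC0_BASE, 0, 1,
                                 ADC_CTL_CH1 | ADC_CTL_IE | ADC_CTL_END);

        ADCSequenceEnable(ADC0_BASE, 0);
        ADCIntClear(ADC0_BASE, 0);
    }

    /* Trigger SS0 and read the two averaged results; pui32Results must hold 2 values */
    int32_t ADC0_SS0_Read(uint32_t *pui32Results)
    {
        ADCProcessorTrigger(ADC0_BASE, 0);
        while (!ADCIntStatus(ADC0_BASE, 0, false))
        {
        }
        ADCIntClear(ADC0_BASE, 0);
        return ADCSequenceDataGet(ADC0_BASE, 0, pui32Results);
    }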

    3. The datasheet says "Most of the ADC control logic runs at the ADC clock rate of 16 MHz. The internal ADC divider is configured for 16-MHz operation automatically by hardware when the system XTAL is selected with the PLL." Does this mean that if I select a system clock frequency of 80 MHz, the ADC clock rate will automatically be scaled down to 16 MHz?

    That is correct.
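
    For reference, a minimal sketch of an 80 MHz clock setup, assuming the usual 16 MHz main crystal and the TivaWare SysCtlClockSet() call. No ADC clock configuration is needed; hardware keeps the ADC clock at 16 MHz.

    #include <stdint.h>
    #include <stdbool.h>
    #include "driverlib/sysctl.h"

    void Clock_Init80MHz(void)
    {
        /* 400 MHz PLL / 2 / 2.5 = 80 MHz system clock; the ADC clock stays at 16 MHz */
        SysCtlClockSet(SYSCTL_SYSDIV_2_5 | SYSCTL_USE_PLL |
                       SYSCTL_XTAL_16MHZ | SYSCTL_OSC_MAIN);
    }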

    A follow-up question: What's the point of having 12 input channels when there are only 8 sample sequencers in total? Is it so that we have more pin options for ADC inputs?

    For example, you can allocate 8 channels to SS0 and 4 channels to SS1. With this allocation, all 12 channels can be sampled, not by one sequencer but by two. A sketch of this allocation is shown below.
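
    Here is a minimal sketch of that allocation with the TivaWare DriverLib API, assuming ADC0 and software triggers; enabling the ADC peripheral and configuring the GPIO pin muxing for the 12 analog pins is omitted for brevity.

    #include <stdint.h>
    #include <stdbool.h>
    #include "inc/hw_memmap.h"
    #include "driverlib/adc.h"

    void ADC0_AllChannels_Init(void)
    {
        ADCSequenceConfigure(ADC0_BASE, 0, ADC_TRIGGER_PROCESSOR, 0);
        ADCSequenceConfigure(ADC0_BASE, 1, ADC_TRIGGER_PROCESSOR, 1);

        /* SS0: AIN0..AIN7 in its 8 steps */
        ADCSequenceStepConfigure(ADC0_BASE, 0, 0, ADC_CTL_CH0);
        ADCSequenceStepConfigure(ADC0_BASE, 0, 1, ADC_CTL_CH1);
        ADCSequenceStepConfigure(ADC0_BASE, 0, 2, ADC_CTL_CH2);
        ADCSequenceStepConfigure(ADC0_BASE, 0, 3, ADC_CTL_CH3);
        ADCSequenceStepConfigure(ADC0_BASE, 0, 4, ADC_CTL_CH4);
        ADCSequenceStepConfigure(ADC0_BASE, 0, 5, ADC_CTL_CH5);
        ADCSequenceStepConfigure(ADC0_BASE, 0, 6, ADC_CTL_CH6);
        ADCSequenceStepConfigure(ADC0_BASE, 0, 7,
                                 ADC_CTL_CH7 | ADC_CTL_IE | ADC_CTL_END);

        /* SS1: AIN8..AIN11 in its 4 steps */
        ADCSequenceStepConfigure(ADC0_BASE, 1, 0, ADC_CTL_CH8);
        ADCSequenceStepConfigure(ADC0_BASE, 1, 1, ADC_CTL_CH9);
        ADCSequenceStepConfigure(ADC0_BASE, 1, 2, ADC_CTL_CH10);
        ADCSequenceStepConfigure(ADC0_BASE, 1, 3,
                                 ADC_CTL_CH11 | ADC_CTL_IE | ADC_CTL_END);

        ADCSequenceEnable(ADC0_BASE, 0);
        ADCSequenceEnable(ADC0_BASE, 1);
    }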

  • Does SS0 take the most time to collect data since it has the largest FIFO?

  • Hi,

    It depends on how many channels you are trying to sample using SS0. Suppose you have 4 channels and you allocate one channel to each sequencer; in other words, one channel for SS0, one for SS1, one for SS2, and one for SS3. The time it takes to sample a channel is then the same for every sequencer. Of course, SS0 will take the longest to finish its conversions and generate an interrupt if you allocate 8 channels to SS0, compared to only one channel on SS3.
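
    If it helps, here is a minimal sketch of the one-channel-per-sequencer case described above, using the TivaWare DriverLib API with an assumed AIN0..AIN3 to SS0..SS3 mapping and software triggers; with this setup each sequencer completes its single conversion in the same amount of time.

    #include <stdint.h>
    #include <stdbool.h>
    #include "inc/hw_memmap.h"
    #include "driverlib/adc.h"

    void ADC0_OnePerSequencer_Init(void)
    {
        /* Assumed mapping for illustration: AIN0 -> SS0, AIN1 -> SS1, AIN2 -> SS2, AIN3 -> SS3 */
        static const uint32_t ui32Channel[4] =
        {
            ADC_CTL_CH0, ADC_CTL_CH1, ADC_CTL_CH2, ADC_CTL_CH3
        };
        uint32_t ui32Seq;

        for (ui32Seq = 0; ui32Seq < 4; ui32Seq++)
        {
            ADCSequenceConfigure(ADC0_BASE, ui32Seq, ADC_TRIGGER_PROCESSOR, ui32Seq);
            /* A single step: sample the channel, interrupt, end of sequence */
            ADCSequenceStepConfigure(ADC0_BASE, ui32Seq, 0,
                                     ui32Channel[ui32Seq] | ADC_CTL_IE | ADC_CTL_END);
            ADCSequenceEnable(ADC0_BASE, ui32Seq);
        }
    }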

  • That makes sense. Thank you!