TM4C123GH6PM: Tiva C Series ADC Modules

Part Number: TM4C123GH6PM


Hello,

I'm looking into using the internal ADC modules on the TM4C123GH6PM controller. However, after taking some time to read the datasheet and forum posts, I'm still not sure how the ADCs are configured by default.

The datasheet states that the maximum sampling rate is 1Msps. Later on I read that this is achievable by setting the ADC sample phases on both modules appropriately. So what is the sampling rate of a single ADC module by default? Is the ADC resolution affected by the sampling rate (as is the case with many ADCs), or is it always the maximum of 12 bits? I couldn't find this information in the datasheet.

For the purposes of my application, I only need one ADC channel (single-ended) with maximum (64 sample) averaging and maximum (12 bit) resolution.

Many thanks,

Ksawery

  • Hello Ksawery,

    The datasheet isn't really where you want to be reading up on how to configure the ADC. You should read the TivaWare Driverlib User's Guide and review our example code projects.

    The User's Guide is: https://www.ti.com/lit/pdf/spmu299

    Examples in TivaWare can be found under [Install Path]\TivaWare_C_Series-2.1.4.178\examples\peripherals\adc

    I can also share an example for ADC with uDMA if that's of interest.

    And for the highest sampling rate, you can see this E2E post - it is for the TM4C129x, so the ADC module and clocking are different (which is why the speeds are higher), but it will give you a solid idea of how to put together a similar setup for the TM4C123x: 

    On the topic of resolution, the device always tries to deliver all 12 bits, but I don't think we have measured what the ENOB would be at those speeds; the ENOB is based on the SNR, and that is your real-world result. It's possible that the performance is more like 10-11 bits once ENOB is factored in.

    The sampling rate is determined by how you configure the module; you need to set up the ADC clock or use a timer as the trigger source.
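
    For example, a timer-triggered setup looks roughly like the sketch below. Treat it as an untested outline: I'm assuming sample sequencer 3, AIN7 (PD0) and a 1 kHz rate purely for illustration, and the usual SysCtlPeripheralEnable() calls for ADC0 and the GPIO port are assumed to be done elsewhere.

    //rough sketch: periodic timer as the ADC trigger source
    SysCtlPeripheralEnable(SYSCTL_PERIPH_TIMER0);
    TimerConfigure(TIMER0_BASE, TIMER_CFG_PERIODIC);
    TimerLoadSet(TIMER0_BASE, TIMER_A, SysCtlClockGet() / 1000 - 1);                         //1 kHz trigger rate (illustrative)
    TimerControlTrigger(TIMER0_BASE, TIMER_A, true);                                         //let the timer trigger the ADC

    ADCSequenceConfigure(ADC0_BASE, 3, ADC_TRIGGER_TIMER, 0);                                //sequencer 3, triggered by the timer
    ADCSequenceStepConfigure(ADC0_BASE, 3, 0, ADC_CTL_CH7 | ADC_CTL_IE | ADC_CTL_END);       //single step on AIN7 (PD0)
    ADCSequenceEnable(ADC0_BASE, 3);

    TimerEnable(TIMER0_BASE, TIMER_A);                                                       //start sampling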

  • Hello, thank you for your message and apologies for my late reply.

    I'm more concerned with achieving 12 bits of resolution than with the sampling rate. With the default configuration, what would be the sampling rate and resolution of a single ADC module?

    Ralph Jacobi said:
    On the topic of resolution, the device always tries to deliver all 12 bits, but I don't think we have measured what the ENOB would be at those speeds; the ENOB is based on the SNR, and that is your real-world result. It's possible that the performance is more like 10-11 bits once ENOB is factored in.

    So the ENOB is affected by the sampling rate? Is the 12-bit resolution guaranteed at the default ADC configuration?

    I will be looking into implementing this tomorrow, just wanted to make sure I'm on the right track.

    Best regards,

    Ksawery

  • Hello Ksawery,

    Ksawery said:
    So the ENOB is affected by the sampling rate? Is the 12-bit resolution guaranteed at the default ADC configuration?

    The ENOB is determined by the Signal-to-Noise and Distortion Ratio (SNDR). The datasheet specs are set with the ADC clock at 16 MHz, and the specified SNDR is 60 dB.

    The formula to calculate the ENOB is given on page 1390 of the datasheet: ENOB = (SNDR - 1.76) / 6.02

    Based on this, ENOB = (60 - 1.76) / 6.02 = 9.67 bits.

    The SNR tends to degrade as the sample frequency is increased, but there are ways to offset that or even improve on the results. For example, oversampling can be used to help improve the ENOB.

    As far as how much you'd need to increase the oversampling for 12 bit ENOB with the ADC, I wasn't able to pull up any clear resources on that. There is a lot of information around on the web about ADC operation theory, ENOB, Sampling Rate, Oversampling etc. - if you want to search around a bit and then ask some questions based on what you find (ideally providing links) then I can give you my thoughts on how they may apply for TM4C.
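
    As a rough, theory-only starting point (this is the ideal oversampling relation from general ADC theory, not a characterized figure for this device), each factor of 4 in oversampling can add at most one bit of effective resolution:

    ENOB gain ≈ 0.5 * log2(N)

    Going from roughly 9.7 bits to 12 bits would therefore need N on the order of 4^(12 - 9.7), which is roughly 25, so the 32x or 64x hardware averaging settings are where you would look. Keep in mind that the averaged result is still a 12-bit word, so 12 bits is the ceiling, and real noise sources usually keep you below the ideal gain.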

    Depending how strict your requirements are, you may also need to consider an external ADC.

  • Mixed Signal, ever expanding, (MCU as Kitchen Sink) Greetings,

    First - my team wishes to applaud the vendor agent for his (courageous) mention: "Consider the use/employ of a dedicated, external ADC" should optimal ADC results be desired...   Well done - and such was the (strong) direction from my, ivy-covered, 3 letter, east-coast engineering school - decades ago!   And - still holds substantially true - even today.

    In further support of the "external ADC" - my small team employs a broad mix of ARM Cortex MCUs: (M0, M3, M4, & M7) from multiple vendors.   (it should be well known that, "No one vendor can - at all times - provide the optimal solution!")    Both our investors & key clients "demand" that we choose & present a variety of "candidate MCUs" (often supplied upon our firm's custom designed/developed "Multi-MCU" eval board) - to "Speed, Ease & Enhance" such competing MCU evaluation.

    Our findings - rarely can (any) selected, (12 bit ADC specified) ARM Cortex M4 MCU provide, "10 bit accuracy."   The past predicted "Mixed-Signal" penalty - imposed upon an ADC implementation (while somewhat reduced) surely continues...    Consider too the case when/where the input signal must be "gained up" prior to being presented to the MCU.   This further degrades the "mV/ADC count" - does it not?

    Note that "higher resolution" (external) ADCs are often chosen (both) for their "increased accuracy" but also for their "Increased dynamic signal range."

    The '123 MCU in question here includes separate VDDA & GNDA pins.    Both should be individually treated/isolated for best ADC results.    (Note that the '123 LPad ties these to the "noisy" VDD & GND - lowering ADC performance!    Yet enabling (much desired) lowered cost...)

    Never mentioned here (but for our group) - "Development w/an External ADC tremendously "Speeds & Enhances" future (i.e. insured) MCU Migration!    (as the "ADC Learning Curve" is (then) ZERO - and the (external ADC/support components') pcb footprints & routings are (easily) Cut/Pasted!)    Investors & Venture firms LOVE this!   "Re-Usability"  is "always" highly valued!

    The MCU's "manifest destiny" (expand to include all functions) causes necessary compromise in performance - which should be recognized...

  • Thank you for your help. Based on your suggestions and the examples provided with the TivaWare library, I've configured the ADC as follows:

    //ADC configuration
    SysCtlPeripheralEnable(SYSCTL_PERIPH_ADC0);
    SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOD);                                             //enable the GPIO port used for the analog input
    GPIOPinTypeADC(GPIO_PORTD_BASE, GPIO_PIN_0);
    ADCSequenceConfigure(ADC0_BASE, 3, ADC_TRIGGER_PROCESSOR, 0);                            //configure sample sequencer (single-sample)
    ADCSequenceStepConfigure(ADC0_BASE, 3, 0, ADC_CTL_CH7 | ADC_CTL_IE | ADC_CTL_END);       //sample channel 7 (PD0) and generate interrupt
    ADCHardwareOversampleConfigure(ADC0_BASE, 64);                                           //64-sample hardware averaging
    ADCSequenceEnable(ADC0_BASE, 3);                                                         //enable sequencer
    ADCIntClear(ADC0_BASE, 3);                                                               //clear any stale interrupt status
    ADCIntEnable(ADC0_BASE, 3);                                                              //enable the sequencer 3 interrupt in the ADC
    IntEnable(INT_ADC0SS3);                                                                  //enable the sequencer 3 interrupt in the NVIC

    I've configured the ADC interrupt handler as follows:

    void ADCIntHandler(void)
    {
        ADCIntClear(ADC0_BASE, 3);
        bADCActive = FALSE;
        bADCReady = TRUE;
    }

    The ADC trigger and data-get functions are called in the main() while(1) loop as follows (for specific reasons, I can't place these function calls in the interrupt handler):

    if (!bADCActive && !bADCReady)
    {
        ADCProcessorTrigger(ADC0_BASE, 3); //trigger ADC sample sequence
        bADCActive = TRUE;
    }
    if (bADCReady)
    {
        ADCSequenceDataGet(ADC0_BASE, 3, &ulADCReading);   //get ADC reading
        usMBInputReg[0] = (USHORT) ulADCReading;
        bADCReady = FALSE;
    }

    Would this implementation work well? Or would it be better to use the following polling code instead of the interrupt handler, as shown in the TM4C123G LaunchPad Workshop examples?

    ADCIntClear(ADC0_BASE, 3);
    ADCProcessorTrigger(ADC0_BASE, 3);
    
    while(!ADCIntStatus(ADC0_BASE, 3, false)) {}
    
    ADCSequenceDataGet(ADC0_BASE, 3, &ulADCReading);

    This solution seems much simpler and should be sufficient for my application, given that the sampling rate is 1MSPS.

    Many thanks,

    Ksawery

  • Hello Ksawery,

    If you are not going to do anything but set a flag in the ISR, then wouldn't it make more sense to just do a polling operation? It isn't clear to me whether the flag is meant to let you skip other tasks and prioritize the ADC, though.
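
    One small thing to double-check in the posted code: since bADCActive and bADCReady are shared between the main loop and the ISR, they should be declared volatile so the compiler doesn't optimize the flag checks away. For example, assuming a plain C99 bool from stdbool.h (adjust to whatever boolean type you are already using):

    volatile bool bADCActive = false;                       //set in the main loop, cleared in the ISR
    volatile bool bADCReady = false;                        //set in the ISR, cleared in the main loop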

    If you want to be more efficient with handling the ADC data, you could use the uDMA. We have an example for that if you are interested; I can share it with you (it will be in the upcoming TivaWare release that is planned to be out in March).
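
    In the meantime, the general shape of an ADC + uDMA setup with driverlib looks roughly like the sketch below. Treat it as an untested outline rather than the official example: the alignment pragma is for the TI compiler (GCC would use __attribute__((aligned(1024)))), and the channel define (UDMA_CHANNEL_ADC3 for ADC0 sequencer 3), the FIFO offset (ADC_O_SSFIFO3 from inc/hw_adc.h) and the buffer size are assumptions to double-check against your TivaWare version.

    //uDMA control table, must be 1024-byte aligned
    #pragma DATA_ALIGN(ui8ControlTable, 1024)
    static uint8_t ui8ControlTable[1024];
    static uint32_t ui32ADCBuffer[64];                                                       //destination buffer in RAM

    SysCtlPeripheralEnable(SYSCTL_PERIPH_UDMA);
    uDMAEnable();
    uDMAControlBaseSet(ui8ControlTable);

    //configure the channel for ADC0 sequencer 3: 32-bit copies from the SS3 FIFO to RAM
    uDMAChannelAttributeDisable(UDMA_CHANNEL_ADC3, UDMA_ATTR_ALL);
    uDMAChannelControlSet(UDMA_CHANNEL_ADC3 | UDMA_PRI_SELECT,
                          UDMA_SIZE_32 | UDMA_SRC_INC_NONE | UDMA_DST_INC_32 | UDMA_ARB_1);
    uDMAChannelTransferSet(UDMA_CHANNEL_ADC3 | UDMA_PRI_SELECT, UDMA_MODE_BASIC,
                           (void *)(ADC0_BASE + ADC_O_SSFIFO3), ui32ADCBuffer, 64);          //SS3 FIFO -> buffer
    uDMAChannelEnable(UDMA_CHANNEL_ADC3);

    ADCSequenceDMAEnable(ADC0_BASE, 3);                                                      //enable uDMA requests for sequencer 3

    Once a transfer completes you would re-arm the channel with another uDMAChannelTransferSet() call from the ADC interrupt handler, but the official example will show that part properly.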