
EK-TM4C123GXL ADC will not trigger or set interrupt flag in RIS or ISC register

Other Parts Discussed in Thread: EK-TM4C123GXL, TM4C123GH6PM

Hi All,

This is driving me crazy. I've seen people mention errata notes on this in a few threads, but I can't really get my head around the workaround.

(Ref. Tiva errata note ADC#03; it sounds like my issue anyway!)

I am trying to use an EK-TM4C123GXL to do some simple single-shot ADC conversions. I have done this on numerous other TI boards before with no issue, using both direct register access and TivaWare/StellarisWare API calls.

I have boiled my code down to the simplest form I can think of and it still gets jammed at the ADCIntStatus() call.

My code is as follows (pretty much a textbook ADC example):

ADC0 using PE1 (AIN2) on sample sequencer 3:

/* Set the system clock frequency */
SysCtlClockSet(SYSCTL_SYSDIV_2 | SYSCTL_USE_OSC | SYSCTL_OSC_MAIN | SYSCTL_XTAL_16MHZ);

/* Global interrupt enable */
IntMasterEnable();

/* Clock gate enable for ADC0 */
SysCtlPeripheralEnable(SYSCTL_PERIPH_ADC0);

/* Using PE1, so we have to clock gate the PORTE peripheral */
SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOE);

/* Set PE1 as an ADC input pin */
GPIOPinTypeADC(GPIO_PORTE_BASE, GPIO_PIN_1);

ADCSequenceConfigure(ADC0_BASE, 3, ADC_TRIGGER_PROCESSOR, 0);

int i = 0;
ADCSequenceDisable(ADC0_BASE, 3);   /* Good idea to disable before step config */

for(i = 0; i < 150; i++);

ADCSequenceStepConfigure(ADC0_BASE, 3, 0,
                         ADC_CTL_IE | ADC_CTL_END | ADC_CTL_CH2);

ADCSequenceEnable(ADC0_BASE, 3);

for(i = 0; i < 150; i++);

/* Enable ADC sequence interrupt */
ADCIntEnable(ADC0_BASE, ADC_INT_SS3);

/* Enable interrupt source */
ADCIntEnableEx(ADC0_BASE, ADC_INT_SS3);   /* Don't think this is strictly needed */

/* Enable the NVIC to recognise the interrupts */
SysCtlIntEnable(INT_ADC0SS3);

/* Trigger the ADC */
ADCProcessorTrigger(ADC0_BASE, 3);

while(!ADCIntStatus(ADC0_BASE, 3, false))
{
    /* Loop until the ADC conversion completes and raises the raw interrupt */
}

uint32_t ui32Value;
ADCSequenceDataGet(ADC0_BASE, 3, &ui32Value);

while(1)
{
    /* Loop forever */
}

But it always hangs at the ADCIntStatus() call.

I can only assume I am doing something very silly, but I can't see it. Any ideas? All help appreciated.

Notes:

> I am using CCS.

> I have tried most of the above with direct register accesses and get the same result. I have also tried changing the sequencer and input pin, with the same result.

Am I correct in saying that, in the ADC section of the microcontroller data sheet where the signals are listed (see page 802 of the TM4C123GH6PM data sheet), the AINn number (where 'n' is a number) is the channel number you use in the step configuration call?

e.g. PE1 is AIN2 in the data sheet, so we use ADC_CTL_CH2? See below:

ADCSequenceStepConfigure(ADC0_BASE, 3, 0, ADC_CTL_IE | ADC_CTL_END | ADC_CTL_CH2);

Thanks. I'm sure this is a silly issue, as I've said.

Jay





  • Hello Jason,

    The ADC requires that the system clock be at least as fast as the 16 MHz conversion clock, whereas in your code the system clock is 8 MHz.
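
    A minimal sketch of a clock setup that meets that requirement (either call alone would do; the PLL divider shown is just one common choice, not something specified in this thread):

    /* Option 1: run the system clock straight from the 16 MHz main crystal */
    SysCtlClockSet(SYSCTL_SYSDIV_1 | SYSCTL_USE_OSC | SYSCTL_OSC_MAIN | SYSCTL_XTAL_16MHZ);

    /* Option 2: use the PLL for a 40 MHz system clock */
    SysCtlClockSet(SYSCTL_SYSDIV_5 | SYSCTL_USE_PLL | SYSCTL_OSC_MAIN | SYSCTL_XTAL_16MHZ);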

    Regards

    Amit

  • Jason Long said:
    This is driving me crazy

    Me too!

    Agree w/Amit's mention of 16MHz as minimum system clock.

    And - while you claim code as, "pretty much textbook" - this code is "not" in my textbook!

    a) ADCSequenceConfigure(ADC0_BASE, 3, ADC_TRIGGER_PROCESSOR, 0);

    int i=0;
    ADCSequenceDisable(ADC0_BASE, 3);//Good idea to disable before step config

    Here you first Sequence Configure - then Sequence Disable!  That's not in any (proper) textbook - it's backwards!

    b) You later, ADCIntEnable(ADC0_BASE,ADC_INT_SS3);

    Now we still use StellarisWare - and find, ADCIntEnable(ADC0_BASE, 3); to work

    There may be more - those come across via quick/dirty code read.

    In general - I don't like your code sequencing.  Suggest this (works well, long time - for us)

    IntDisable

    ADCIntDisable

    ADCSequenceDisable

    ADCSequenceConfig

    ADCSequenceStepConfig

    ADCSequenceEnable

    ADCIntEnable

    IntEnable

    Where/when we use ADCProcessorTrigger it usually follows ADCSequenceEnable.
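
    A minimal sketch of that ordering for SS3 on AIN2 (illustration only, reusing the pin/sequencer choices from your post rather than our exact code):

    IntDisable(INT_ADC0SS3);             /* keep the NVIC quiet while reconfiguring */
    ADCIntDisable(ADC0_BASE, 3);         /* mask the SS3 interrupt at the ADC */
    ADCSequenceDisable(ADC0_BASE, 3);    /* disable the sequencer before touching it */

    ADCSequenceConfigure(ADC0_BASE, 3, ADC_TRIGGER_PROCESSOR, 0);
    ADCSequenceStepConfigure(ADC0_BASE, 3, 0,
                             ADC_CTL_IE | ADC_CTL_END | ADC_CTL_CH2);

    ADCSequenceEnable(ADC0_BASE, 3);     /* re-enable the sequencer */
    ADCIntEnable(ADC0_BASE, 3);          /* unmask the SS3 interrupt at the ADC */
    IntEnable(INT_ADC0SS3);              /* then enable it at the NVIC */

    ADCProcessorTrigger(ADC0_BASE, 3);   /* trigger only after the sequencer is enabled */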

    Believe this will get you much closer - perhaps escape you from, "krazy."  (you may wish to heave past/reference "textbook" - while you're at it...)

  • Jason Long said:
    ADCIntEnableEx(ADC0_BASE,ADC_INT_SS3);//Don't think is strictly needed

    There is no ADCIntEnableEx() in the driver library. Verify this.

    - kel

  • Thanks Amit, that did the trick.

    Am I correct in saying that I could change the ADC conversion clock in the ADC clock configuration register (the "ADCCC" register in the data sheet)? So instead of bringing my system clock up to 16 MHz, I could bring the ADC conversion clock down to 8 MHz? Just out of interest!

    -cb1

    Thanks so much for the reply. While I have confirmed the code works both where I originally placed the ADCSequenceDisable() and where your comment suggests, I find your "sequence of events" much more logical. (And I assume it is better practice in general?)

    -kel

    The ADCIntEnableEx() function is listed in the TivaWare Peripheral Driver Library user's guide on page 33, where its description is as follows:

    "This function enables the indicated ADC interrupt sources. Only the sources that are enabled
    can be reflected to the processor interrupt; disabled sources have no effect on the processor. "

    I have pasted the function's implementation below to save you searching, if you are interested.

    (From adc.c)

    void
    ADCIntEnableEx(uint32_t ui32Base, uint32_t ui32IntFlags)
    {
        //
        // Check the arguments.
        //
        ASSERT((ui32Base == ADC0_BASE) || (ui32Base == ADC1_BASE));
    
        //
        // Enable the requested interrupts.
        //
        HWREG(ui32Base + ADC_O_IM) |= ui32IntFlags;
    }
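
    As a usage note (my own illustration, not from the guide): ADCIntEnable() takes a sequencer number, while ADCIntEnableEx() takes an ADC_INT_* mask, so for this case the two calls would look like:

    ADCIntEnable(ADC0_BASE, 3);              /* per-sequencer form: sequencer number */
    ADCIntEnableEx(ADC0_BASE, ADC_INT_SS3);  /* mask form: ADC_INT_* interrupt flags */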

    Again, it boiled down to my silliness. The inevitable conclusion!

    Thanks for all the help, guys! Apologies for the issue's rudimentary nature.

    Jay 

  • Hello Jason,

    It would not be possible to use the ADCCC register in the manner suggested.

    The ADCCC register allows two options:

    1. The PIOSC at a fixed 16 MHz.

    2. The PLL VCO clock pre-divided to 16 MHz, or the MOSC (at least 16 MHz).

    Since the ADC functional clock must remain at 16 MHz, the system clock requirement stays fixed at 16 MHz. If, however, you can use a lower-frequency crystal, the two must still have the same relation.

    Please do note that all electrical parametric data is specified for 16 MHz, so a lower crystal frequency (though it should not) may have some effect.
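
    For reference, later TivaWare releases expose this register through ADCClockConfigSet(); a minimal sketch of selecting the fixed 16 MHz PIOSC as the conversion clock might look like the following (illustration only; check that your TivaWare version provides this call and that the chosen source is valid for the TM4C123):

    /* Select the 16 MHz PIOSC as the ADC conversion clock, full rate, divide-by-1 */
    ADCClockConfigSet(ADC0_BASE, ADC_CLOCK_SRC_PIOSC | ADC_CLOCK_RATE_FULL, 1);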

    Regards

    Amit

  • Amit, my thanks for clearing that up.

    Regards,

    Jay