I've been trying to set up a pin to monitor battery voltage on the CC2541 by measuring the partial voltage created by a voltage divider. Pretty straightforward, and it seems to be working great, except that I can't figure out how to correlate the ADC readings back to the measured voltages in a way that makes sense. First, just to make sure I'm not missing anything obvious in my code:
Initialization (called once on powerup)
P0SEL &= ~BIT0; //GPIO
P0DIR &= ~BIT0; //Input
P0INP |= BIT0; //3-state (no pull-up/pull-down)
Conversion (called when measuring)
uint16 adc_result = 0;
APCFG |= APCFG_APCFG0; //Configure P0_0 to ADC
ADCCON3 = ADCCON3_EREF_AVDD | ADCCON3_EDIV_512 | ADCCON3_ECH_AIN0; //Start single conversion: AVDD5 reference, 512 decimation rate (12-bit effective resolution), channel AIN0 (P0_0)
while( !(ADCCON1 & ADCCON1_EOC) ); //wait for conversion to finish
APCFG &= ~APCFG_APCFG0; //Unconfigure P0_0 as ADC input
adc_result = (ADCH << 8); //Shift high result
adc_result |= ADCL; //OR to low result
return (adc_result >> 2); //Drop the two LSBs and return
Originally I made the APCFG call only once, in the initialization routine, but after looking into the HAL code I mimicked their function by switching it on and back off for each conversion. This didn't make any difference.
The numbers I get back are close, and they track the measured voltages very linearly; I just don't know how to work back from them to actual voltages. What I have been assuming is that for a single-ended, single conversion, I could get back to the voltage with:
(ADC_Result / Full_Scale_Range) * Reference_Voltage
I know ADC_Result from the CC2541, and I can measure the Reference_Voltage (3 V). If I assume the Full_Scale_Range is about 7400, everything lines up perfectly, but 7400 seems like a random number, so either my calculation is wrong or I'm messing up the conversion. In other testing with the internal 1.24 V reference it appears the ADC caps out at 0x1FFF, which would put the full-scale range at 8192. That makes sense and is sorta kinda close to 7400, but not close enough to leave me confident in the result.
Thanks