
MSP430F5172: ADC input behavior at lower voltages

Part Number: MSP430F5172

I'm using an MSP430F5172 uC and reading a battery voltage at 500ms intervals.  Here's the circuit below for sampling the battery voltage:

When EN_VBATT_READ goes high, the voltage divider is active.  The divider ratio is 0.7848, so a max input voltage of VBATT = 3.2V results in an output voltage of ~2.5V at VBATT_DIV.  The MSP430 is set up with VREF = 2.5V, so a 3.2V input gives a full-scale 10-bit output of 0x3FF.  When EN_VBATT_READ goes low, the divider is off and no current is consumed.

Cap C32 is there to act as a LPF, but also to help hold up the voltage during the ADC sampling time.  The relatively low resistor values in the divider give a low source impedance into the ADC.

At VBATT input voltages between 3.2V and 2.1V, the ADC readings are spot on, within 1 or 2 counts.  Nice.  The scope shot below shows the resistive divider turning on (YELLOW trace is VBATT_DIV).  There is a settling delay while C32 charges up; then the green trace goes high (start of ADC conversion), then the green trace goes low (end of ADC conversion).

As VBATT goes lower, the VBATT_DIV waveform gets weird.  By weird I mean it doesn't reach the steady-state value I expect, and when the conversion is complete, there is a little hump.  VBATT for the scope shot below is 2.0V:

And at 1.9V, the VBATT_DIV waveform is not stable at all and has a big dip:

The end result is that at the low end of the VBATT input range, the readings are far off from what is expected.  My guess is that the ADC is loading the analog input pin more than I expected.  The datasheet states the ADC input impedance is anywhere from 36k to 96k ohms plus 3.5pF, which is much higher than the ~887 ohm source impedance of the divider, so I don't see how that could be the case.

Also, I don't understand why this effect wouldn't be seen at higher input voltages too.  I've tried all combinations of sampling time and conversion clock; in the end it doesn't really matter.  Lower input voltages still give this bumpy waveform on the VBATT_DIV analog input pin.

Supply voltage to the uC is stable at 3.3V throughout all of these waveforms.  Probing AVCC and DVCC shows a stable, noise-free 3.3V.

MCLK is externally generated 16MHz and unconditionally stable also.

Relevant code that sets up the analog input pin P3.7 and does the ADC reads:

    // P3.7 -> VBATT_DIV / A6 ADC channel
    P3DIR &= ~(BIT7);
    P3SEL |=   BIT7;
    P3REN &= ~(BIT7);
    // Unlock port mapping controller
    PMAPKEYID = PMAPKEY;
    // Prohibit any further remapping to occur
    PMAPCTL &= ~(PMAPRECFG);
    // Set P3.7 as A6 ANALOG input pin
    P3MAP7 = 31;
    // Re-lock port mapping controller
    PMAPKEYID = 0x00;

    ADC10CTL0 = 0x00;                         // Reset reg contents
    ADC10CTL0 |= ADC10SHT_4 + ADC10ON;        //  S&H=64 ADC clks, ADC10ON

    ADC10CTL1 = 0x00;                         // Reset reg contents

    // ADCCLK = MCLK; SAMPCON sourced from sampling timer; input clock / 8
    ADC10CTL1 |= ADC10SSEL_2 + ADC10SHP + ADC10DIV_7;

    ADC10CTL2 = 0x00;                    // Reset reg contents
    ADC10CTL2 |= ADC10RES;               // 10-bit conversion results

    ADC10MCTL0 = 0x00;
    ADC10MCTL0 |= ADC10SREF_1;           // Select V(R+)=VREF+ and V(R-)=AVSS
    ADC10MCTL0 |= ADC10INCH_6;           // A6 ( P3.7 ) ADC input select

    ADC10IE |= ADC10IE0;                 // Enable ADC conv complete interrupt

inline void takeAdcRead_ch6(void)
{
    // Enable the VBATT divider
    // The line below pulls EN_VBATT_READ high
    P3OUT |= BIT4;

    // Wait at least 75us for C32
    // filter cap to charge to sample value
    __delay_cycles(2400);

    // Sampling and conversion start
    // ADC ISR triggers when result is ready
    P1OUT |= BIT5;                       // This is the green trace pulled high
    ADC10CTL0 |= ADC10SC + ADC10ENC;
}

// ADC10 interrupt service routine
#pragma vector = ADC10_VECTOR
__interrupt void ADC10_ISR(void)
{

    switch(__even_in_range(ADC10IV,12))
    {
        case  0: break;                          // No interrupt
        case  2: break;                          // conversion result overflow
        case  4: break;                          // conversion time overflow
        case  6: break;                          // ADC10HI
        case  8: break;                          // ADC10LO
        case 10: break;                          // ADC10IN
        case 12:                                 // Conversion done, ADC10IFG0 has been set,
                                                 // ADC10MEM0 is ready

            P1OUT &= ~BIT5;       // This is the green trace being pulled low


            // Store new result
            adc_ch6_result = ADC10MEM0;

            // Turn off resistive divider, we are done with current ADC sample frame
            // This line pulls EN_VBATT_READ low
            P3OUT &= ~(BIT4);

            // Tell main() we have a new adc sample to process
            sysFlags.adcSampleReady = 1;
            break;

        default:
            break;
    }
}


Any ideas what might be the matter?  Why does the ADC seem to be loading the analog input pin when it is on and the voltage is lower than 2.1V?

  • I'm not an EE, but my EE keeps reminding me that as a battery approaches depletion its output resistance increases, so voltage droop happens even at lower currents.

    As I recall, a Li-Ion at 2.1V is pretty much depleted.

  • As VBatt decreases, so does the gate-to-source voltage on Q2B, which means an increasing drain-to-source resistance.

  • I mentioned this above, but perhaps it was not clear: the supply to the uC is a stable 3.3V provided by a TPS610994.  Probing the DVCC and AVCC pins of the uC shows a stable, glitch-free 3.3V rail with minimal noise.  In this test setup, VBATT is actually just connected to a bench-top power supply, which I vary downward from 3.0V in 0.1V steps.  The ADC returns correct values for the divided-down VBATT input (which is actually the bench power supply right now) until about 2.1V, at which point the VBATT_DIV waveform becomes distorted as seen in the yellow scope shots above.

  • AVCC and DVCC are not connected to the source of Q2B. VBatt is. If the gate is pulled down to ground then Vgs is -VBatt.

    Sure Q2A gets nice steady gate drive, but Q2B does not. As VBatt decreases its on resistance increases.

  • You're right ... Vth of this PMOS is -2.6V max, -1V min ... so I think you hit the mark on this one.  Thanks for pointing out this amateur error of mine.  
