Hello
I am converting the ratiometric voltage outputs of a VTI SCA103T inclinometer to digital using an ADS1252. Each output of the inclinometer ranges from 0 to 5 volts. When the inclinometer is tilted to its maximum in one direction, output 1 reads 5 volts and output 2 reads 0 volts; tilted fully the other way, the two voltages swap. When level, they both read around 2.5 volts, so the ADC should effectively compute 2.5 - 2.5 = 0. The ADC is supposed to take the difference of the two outputs and convert it to digital, so at zero degrees of inclination I should see zero volts (the offset value) come out of the ADC.

With a multimeter I measure the two inclinometer outputs at 2.36 and 2.42 volts, which is close enough to zero difference for right now. These go into the V+ and V- inputs of the ADS1252. The ADC is giving me -2.26 volts out, not the roughly 0.06 volts my multimeter readings would imply. What is going on here? Vref on the ADS1252 is tied to VDD, which is 5 volts, and the ADC is hooked up as shown in the data sheet. I am not worried about noise yet because I cannot even get close to the right value. This is the second ADS1252 chip I have tried, and both show the same issue: the ADC does not seem to be converting the difference between the two inputs, and instead gives me a seemingly random value.
To convert the digital value from the ADC, which I have been treating as offset two's complement, I subtract the offset of (2^24)/2 from the raw ADC value, multiply by VperLSB (5 V / 2^24), and then convert the result to decimal for display (this is sketched in code below). It seems the bits coming out of the ADC are wrong. One place things could go wrong is that I am using a 32-bit microprocessor, so the 24 bits from the ADC are padded with zeros at the MSB end when the value is stored; however, as far as I can tell this should not affect my calculations. The only other thing I can think of is a statement I saw on page 7 while re-reading the data sheet last night:
"The bipolar input range is from -4.096V to +4.096V when the reference input voltage equals +4.096V; the bipolar range is with respect to -Vin, and not with respect to GND."
I had ignored it previously because I am not driving the inputs with a bipolar signal. Now I am wondering whether, since I am only using half of the range and the bipolar range is centered on a zero differential input while my inputs are centered at 2.5 V, I need to do some additional manipulation of the output to make it correlate with the actual voltages coming out of the inclinometer. Any ideas?
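To make the question concrete, here is a rough C sketch of the two interpretations: the conversion I am doing now, and the alternative I am wondering about based on that data-sheet sentence. The names and the exact Vref value are only illustrative; my real code runs on the 32-bit micro and is not written exactly like this.

```c
#include <stdint.h>
#include <stdio.h>

#define VREF      4.79         /* measured reference voltage in volts (see below) */
#define FULLSCALE 16777216.0   /* 2^24 codes from the 24-bit ADS1252 */

/* What I do now: treat the 24-bit word (zero-padded into a 32-bit variable)
 * as offset binary, i.e. subtract 2^23, then scale by Vref / 2^24. */
double convert_offset_binary(uint32_t raw24)
{
    int32_t code = (int32_t)raw24 - (1L << 23);
    return code * (VREF / FULLSCALE);
}

/* What I think the data-sheet sentence might imply instead: the output is two's
 * complement over a bipolar range of -Vref to +Vref, so the 24-bit word should
 * be sign-extended and scaled by 2*Vref / 2^24. */
double convert_twos_complement(uint32_t raw24)
{
    int32_t code = (int32_t)raw24;
    if (code & 0x800000)              /* sign-extend bit 23 of the 24-bit word */
        code -= 0x1000000;
    return code * (2.0 * VREF / FULLSCALE);
}

int main(void)
{
    /* Mid-scale code 0x800000 means 0 V in the first reading
     * but negative full scale (-Vref) in the second. */
    printf("%f V vs %f V\n",
           convert_offset_binary(0x800000),      /* 0.000000 V */
           convert_twos_complement(0x800000));   /* -4.790000 V */
    return 0;
}
```

Neither interpretation seems to get me anywhere near the few tens of millivolts my meter implies for the raw reading I list further down, which is part of why I am stuck.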
Steps Needed to Recreate the Problem:
Here is a sample of the inputs to and outputs from the ADC and my program:
Vref = 4.79 V
VperLSB := %00110100100110010100011110101110
which is 4.79V/2^24 in IEEE-754 single-precision format
VperLSB in decimal = 2.855062 E-07
At an angle close to zero degrees:
V+ = 2.37 VDC
V- = 2.38 VDC
Raw ADC Value: 00000000000001101101000101100110
ADC Value - Offset: 11111111100001101101000101100110
Vout = -2.26743 V
*I converted the ADC value to a voltage by multiplying the signed decimal equivalent of (ADC Value - Offset), as calculated above, by VperLSB.
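For what it is worth, this small stand-alone check (plain C, with Vref taken as the 4.79 V measured above) reproduces the -2.26743 V figure from the raw word above, and also shows what that same word would work out to under the two's-complement reading I speculated about earlier:

```c
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    const double vref = 4.79;       /* measured Vref from above */
    uint32_t raw = 0x0006D166;      /* the zero-padded raw ADC value above (= 446,822) */

    /* My current conversion: subtract 2^23, scale by Vref / 2^24. */
    int32_t offs = (int32_t)raw - (1L << 23);                            /* = -7,941,786 */
    printf("offset binary    : %.5f V\n", offs * (vref / 16777216.0));   /* -2.26743 */

    /* Alternative reading: 24-bit two's complement, scaled by 2*Vref / 2^24. */
    int32_t code = (int32_t)raw;
    if (code & 0x800000)
        code -= 0x1000000;                                               /* sign-extend bit 23 */
    printf("two's complement : %.5f V\n", code * (2.0 * vref / 16777216.0)); /* 0.25514 */

    return 0;
}
```

So my arithmetic at least matches the numbers above; the question is whether I am interpreting the 24-bit word correctly, or whether something upstream of the ADC is wrong.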
As one can see, the Vout value is far too large. I tried to post a picture of the schematic, but my computer is not allowing it, and for some reason I cannot copy and paste it directly in here.
Hopefully someone more knowledgeable than I am in this area can help me.
Thanks
Jonathan