
ADS1232: Can not get Offset Calibration working

Part Number: ADS1232

Hi, I am attempting to measure distances around 2cm with an accuracy of, ideally, 0.1mm. I started by getting a Sharp GP2Y0A51SK0F which produces readable (analog) voltages at distances of 2 to 15cm.

I can use the range 2 to 3 (or maybe 4) cm for my purposes. Now I want maximum accuracy, so I turned to a 24-bit ADC, as the Arduino's microcontroller only has 10-bit ADC pins. I am going to use the TI ADS1232.

I was happy to measure from 0 to 4cm, even though the first 2cm would be useless, as the voltage peaks around 1.4cm and only becomes linear enough to estimate easily at 2cm.

Then I noticed that the TI chip (bless them) has "Offset calibration". Sorry to display my ignorance, but as this is not a stored value but an empirically measured offset, does that mean that during start-up I should temporarily put a solid object in front of the sensor at exactly 2cm distance so it can note the voltage? Will it then allow for that and hence be more accurate over the range I need?

Apologies again for my ignorance. If I find a Wikipedia entry on offset calibration and it all makes sense, I will get straight back and cancel this noob question :)

  • Hi Stephen,

    Welcome to the E2E forum! The ADS1232 offset is not a system offset, but rather a device offset calibration.  When the calibration is made, the device places an internal short within the device and sets an internal register to correct for offset by subtracting the value from the conversion result for all future conversions.

    The offset calibration can be issued at any time to correct for temperature drift effects of the ADS1232.  The offset calibration should be done after power-up, after a change of gain, and at any other time the user desires.  The calibration is initiated by sending a minimum of 26 SCLKs in a row.  If you are using a microcontroller's SPI peripheral, clocks are usually sent in byte increments, so in this case transmitting 4 bytes (32 SCLKs) will initiate the calibration cycle.  Valid conversion data will be available when DOUT/DRDY transitions from high to low.
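
    The 4-byte trigger above can be sketched in C. This is a minimal, hypothetical illustration: sclk_high()/sclk_low() are stand-ins for whatever actually drives the SCLK pin (on an Arduino they would wrap digitalWrite()), and here they simply count pulses so the logic can be checked.

    ```c
    #include <assert.h>

    /* Hypothetical pin hooks -- stand-ins for real GPIO writes.
     * Here sclk_high() just counts rising edges for illustration. */
    static int sclk_pulses = 0;
    static void sclk_high(void) { sclk_pulses++; }
    static void sclk_low(void)  { /* drive SCLK pin low */ }

    /* The ADS1232 begins an offset calibration after it sees at least
     * 26 SCLKs; sending 4 whole bytes (32 clocks) is convenient when a
     * byte-oriented SPI peripheral generates the clocks. */
    static void ads1232_start_offset_cal(void)
    {
        for (int i = 0; i < 4 * 8; i++) {  /* 4 bytes x 8 bits = 32 SCLKs */
            sclk_high();
            sclk_low();
        }
    }
    ```

    After issuing the clocks, the code would wait for DOUT/DRDY to fall before reading the next (calibrated) conversion.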

    Best regards,

    Bob B 

  • Hi Bob, many thanks for the explanation of the calibration. I will certainly ensure this happens on start up.

    Based on the equation, I anticipate crazy accuracy without adjusting the voltage, so I will get it working now and see how it pans out.

    Thanks again, Steve

  • Having said I was happy... the idea of optimising the sensor does appeal, and it is a good learning experience as well.

    I note the ADS1232 has a programmable gain amplifier, but the options are a bit limited: ±2.5V, ±1.25V, ±39mV, or ±19.5mV (gains of 1, 2, 64, and 128).

    For the Sharp GP2Y0A51SK0F, between distances of 2 and 4cm I have voltages from (approx.) 2.1V to 1.6V, which is a range of 0.4V.

    I was hoping there would be a way to opt for a reference voltage and use 2.1V. If I need to produce it at every start-up, I can do so easily by connecting the Sharp sensor with an object exactly 2cm away from it. Is there such a thing? I know the humble Arduino can do this with its built-in analog pins (only a 10-bit ADC)...

    Thanks in advance, Steve

  • I now notice (it helps to read the docs) that there are indeed two pins, VREFP and VREFN. I immediately thought: great, I can use the Sharp sensor at distances of 2cm and 4cm. But they would need their voltages simultaneously, so can you suggest any solution? Also, it looks like VREFN must be 1.5V lower than VREFP, whereas I need 0.8V between them. (I think I said 0.4V before, but that was over a 1cm range, which is too tight for the general case for my project.) If I have to use 1.5V then I am doubling the interval between discrete values, as you will appreciate.
    I could obviously record these values beforehand, but there seems to be no way to do it in software (?)

    Apologies again for my lack of experience in these matters. I am learning albeit slowly.

  • Hi Stephen,

    The Sharp sensor you wish to use shows the output as an analog voltage relative to ground. This is a single-ended measurement and you will not be able to use gain of more than 2 with the ADS1232. The ADC uses a differential input, so in this single-ended case AINN would connect to ground and AINP would connect to Vo of the sensor.

    The analog input range (the actual voltage at the input pins) at a gain of 1 or 2 using the ADS1232 is from GND-300mV to AVDD+300mV, while at a gain of 64 or 128 the input range is reduced to between GND+1.5V to AVDD-1.5V (see common-mode input range in the table on page 3 of the datasheet). The actual measurement range is based on the reference voltage where the full-scale range (FSR) is +/- 0.5*VREF. As one input is fixed (AINN=GND), then only 1/2 of the FSR will be used (as AINN can never be greater than AINP).
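
    As a worked example of that scaling (assuming the values discussed here: VREF = 5V, gain = 1, so FSR = ±2.5V across the 24-bit two's-complement output code):

    ```c
    #include <assert.h>
    #include <math.h>

    /* Convert a 24-bit two's-complement ADS1232 code to volts, assuming
     * VREF = 5 V and gain = 1, so full-scale is +/-0.5 * VREF = +/-2.5 V.
     * With AINN tied to GND only the positive half of the range is used. */
    double ads1232_code_to_volts(long code)
    {
        const double vref = 5.0;
        const double gain = 1.0;
        return (double)code * (0.5 * vref / gain) / 8388608.0; /* 2^23 */
    }
    ```

    One LSB at this setting is about 2.5V / 2^23 ≈ 0.3µV, although the noise-free resolution will be considerably coarser than that.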

    As the sensor output can go to almost 2.5V relative to ground, you would need to use a gain of 1 with a 5V reference while using the ADS1232. Although your interest is a difference of around 400mV, the ADC can only measure the output voltage of the sensor. You would need to determine the distance by calculation. Often the easiest method is to create a piece-wise linear lookup table, as the output of the sensor is non-linear. The lookup table can be thought of as a switch statement with a series of cases in C.

    For example, if the voltage is between two values where the slope of the output voltage is mostly linear (like from 2cm to 4cm) you would calculate the distance using one particular formula when the output voltages are between 2.1V and 1.3V. For other pieces of the output curve you would break it up in small enough chunks where the line is mostly linear.
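
    A piece-wise linear table of that kind might look like the following sketch. The breakpoint values here are hypothetical placeholders, not datasheet numbers; in practice they would come from a bench calibration of the actual sensor.

    ```c
    #include <assert.h>
    #include <math.h>
    #include <stddef.h>

    /* Hypothetical (voltage, distance) breakpoints for the roughly linear
     * 2 cm - 4 cm region of the sensor curve; replace with measured data. */
    typedef struct { double volts; double cm; } breakpoint_t;

    static const breakpoint_t table[] = {
        { 2.10, 2.0 },
        { 1.80, 2.7 },
        { 1.55, 3.3 },
        { 1.30, 4.0 },
    };

    /* Linearly interpolate between the two breakpoints bracketing v.
     * Voltages outside the table clamp to the nearest end point. */
    double distance_from_volts(double v)
    {
        size_t n = sizeof table / sizeof table[0];
        if (v >= table[0].volts)   return table[0].cm;
        if (v <= table[n-1].volts) return table[n-1].cm;
        for (size_t i = 0; i + 1 < n; i++) {
            if (v <= table[i].volts && v >= table[i+1].volts) {
                double t = (table[i].volts - v)
                         / (table[i].volts - table[i+1].volts);
                return table[i].cm + t * (table[i+1].cm - table[i].cm);
            }
        }
        return table[n-1].cm; /* not reached */
    }
    ```

    More (and smaller) segments give a better fit where the curve bends sharply; a switch statement over voltage ranges amounts to the same idea, with one formula per case.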

    As to calibration, I would first try a single-point calibration, as the diode characteristics should follow the same curve with only the offset of the Sharp device needing correction. So you could place the device at the 2cm point, measure the voltage, and then correct any difference in software by subtracting the voltage difference from the conversion result prior to determining the distance from the lookup table. Let's say the output voltage at 2cm should be 2.1V, but the ADC is measuring 2.12V. You would then need to subtract the difference of 20mV (2.12 − 2.10) from all subsequent measurements, then go to the lookup table to calculate the correct distance.
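
    That single-point correction is only a few lines of code. A minimal sketch, assuming a nominal 2.1V output at the 2cm reference point (the function names and the 2.1V constant are illustrative, not from the datasheet):

    ```c
    #include <assert.h>
    #include <math.h>

    static double offset_v = 0.0;  /* measured-minus-expected error */

    /* Capture the offset at a known 2 cm reference distance. */
    void calibrate_at_2cm(double measured_v)
    {
        const double expected_v = 2.10;  /* assumed nominal output at 2 cm */
        offset_v = measured_v - expected_v;
    }

    /* Apply the stored correction to every subsequent reading. */
    double corrected_volts(double raw_v)
    {
        return raw_v - offset_v;
    }
    ```

    With the numbers above, calibrate_at_2cm(2.12) stores a +20mV offset, and corrected_volts() then subtracts it from every reading before the lookup-table step.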

    Best regards,
    Bob B