Hi, I am attempting to measure distances of around 2 cm with an accuracy of, ideally, 0.1 mm. I started with a Sharp GP2Y0A51SK0F, which produces readable (analog) voltages at distances of 2 to 15 cm.
The range from 2 to 3 (or maybe 4) cm is enough for my purposes. Since I want maximum accuracy and the Arduino's microcontroller only has 10-bit ADC pins, I turned to a 24-bit ADC: I am going to use the TI ADS1232.
I was happy to measure from 0 to 4 cm even though the first 2 cm would be useless: the output voltage peaks at around 1.4 cm and only becomes linear enough to estimate easily from 2 cm onwards.
Then I noticed that the TI chip (bless them) has "offset calibration". Sorry to display my ignorance, but since this is not a stored value but an empirically measured offset, does that mean that during start-up I should temporarily put a solid object in front of the sensor at exactly 2 cm so the chip can note the voltage, allow for it, and hence be more accurate over the range I need?
Apologies again for my ignorance. If I find a Wikipedia entry on offset calibration and it all makes sense, I will come straight back and withdraw this noob question :)