I need to calibrate my ADC readings.
Suppose I read 1.4 V on the ADC input, but the actual voltage was 1.5 V.
What should the calibration factor be? The options I'm considering:
1. A gain factor: 1.5/1.4 ≈ 1.071. Whenever I read an ADC voltage, I multiply it by 1.071.
2. Or an offset: 1.5 - 1.4 = 0.1 V. Whenever I measure a voltage, I add 0.1 V to get the correct value.
3. Or I measure at the two extremes of the range, say a minimum of 0 V and a maximum of 2.5 V.
Measuring these gives readings x and y respectively. How do I derive the mathematical relation between the readings and the actual voltages?
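For option 3, fitting a straight line through the two points gives actual = gain * measured + offset, with gain = (2.5 - 0)/(y - x) and offset = 0 - gain * x. This corrects both gain and offset error at once, which options 1 and 2 can only do individually. A minimal sketch, where the reference points 0 V and 2.5 V come from the question but the example readings (0.05 and 2.38) are made-up values for illustration:

```python
def make_calibration(x, y, lo=0.0, hi=2.5):
    """Return a function mapping a raw ADC reading to a calibrated voltage.

    Fits actual = gain * measured + offset through the two calibration
    points (reading x at actual lo, reading y at actual hi).
    """
    gain = (hi - lo) / (y - x)   # slope of the calibration line
    offset = lo - gain * x       # intercept, so reading x maps back to lo
    return lambda measured: gain * measured + offset

# Hypothetical readings: actual 0 V read as 0.05, actual 2.5 V read as 2.38
cal = make_calibration(0.05, 2.38)
print(cal(0.05))  # ~0.0  (low calibration point recovered)
print(cal(2.38))  # ~2.5  (high calibration point recovered)
print(cal(1.4))   # a mid-range reading corrected by the same line
```

The single-point options are special cases of this: option 1 assumes offset = 0 (pure gain error), option 2 assumes gain = 1 (pure offset error). With only one measurement you can't tell which assumption is right, which is why the two-point method is the usual approach.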