I need help determining the accuracy of a measurement system
I’m having trouble wrapping my head around the accuracy of my measurement design. I’ve attached the schematic. There are 4 analog inputs (CH1± … CH4±). Using relays and resistor dividers, they get multiplexed, attenuated, and amplified by an instrumentation amplifier with a gain of 30 and an offset of 1.5Vdc. The signal is then fed into a 14-bit ADC.
For the purpose of this discussion, let’s just focus on CH1±. On the 1st sheet you’ll see some gain and offset calculations for each of the channels prior to the ADC. In these calculations the term ‘rect’ is synonymous with ‘CH1±’ and ‘channel 1’. You’ll see that for ‘channel 1’ I’ve calculated a gain of 0.006 ±0.86% with an offset of 1.5V ±6mV.
The ADC is on the 2nd sheet. Again you’ll see some calculations, including a copy of those from the 1st sheet. In the bottom right of the 2nd sheet there are equations determining the ‘channel 1’ input range and the weight of each ADC bit with respect to the ‘channel 1’ input. Based on an ADC FSR of 3V and working backwards, you’ll see an input range of ±250Vdc and a bit weight of 30.5mV for ‘channel 1’. You’ll also see how I convert the ADC reading back to the input voltage.
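In case it helps, here’s a quick sanity check of the sheet-2 numbers in code. The gain, offset, FSR, and bit count are the values from my sheets; everything else is derived from them:

```python
# Sanity check of the 'channel 1' numbers from sheet 2.
GAIN = 0.006        # channel 1 gain: V at ADC input per V at channel input
OFFSET = 1.5        # offset added ahead of the ADC, V
FSR = 3.0           # ADC full-scale range, V
BITS = 14           # ADC resolution

lsb_adc = FSR / 2**BITS       # ADC LSB at the ADC pin, ~183 uV
lsb_input = lsb_adc / GAIN    # bit weight referred to channel 1, ~30.5 mV
v_in_max = (FSR / 2) / GAIN   # half the FSR (thanks to the 1.5 V offset)
                              # referred to the input: +/-250 V

def code_to_input(code):
    """Convert a raw ADC code (0..16383) back to the channel 1 input voltage."""
    v_adc = code * lsb_adc           # voltage seen at the ADC pin
    return (v_adc - OFFSET) / GAIN   # remove the offset, then undo the gain

print(lsb_input)              # ~0.0305 V, i.e. 30.5 mV per bit
print(v_in_max)               # 250.0 V
print(code_to_input(2**13))   # mid-scale code -> 0.0 V at the input
```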
I’m confused about how to relate the gain and offset accuracies of the signal path ahead of the ADC, AND the gain and offset accuracies of the ADC itself, back to the ‘channel 1’ input. Can you help? Isn’t the accuracy of a circuit like this usually specified as “percent of reading ± offset”?
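To show where I’m stuck, here’s my tentative attempt at combining them. The channel numbers are from sheet 1; the ADC gain and offset error values below are just placeholders I made up, not real datasheet numbers:

```python
# Tentative error combination -- is this the right approach?
GAIN = 0.006            # channel 1 gain, V/V
chan_gain_err = 0.0086  # +/-0.86% channel gain tolerance (sheet 1)
chan_off_err = 0.006    # +/-6 mV channel offset error, at the ADC input (sheet 1)

adc_gain_err = 0.001    # PLACEHOLDER: +/-0.1% ADC gain error
adc_off_err = 0.0005    # PLACEHOLDER: +/-0.5 mV ADC offset error, at the ADC input

# Gain (percent-of-reading) errors multiply through the chain,
# so for uncorrelated sources I root-sum-square them:
total_gain_err = (chan_gain_err**2 + adc_gain_err**2) ** 0.5

# Both offset errors appear at the ADC input, so I RSS them there,
# then divide by the channel gain to refer them back to the +/-250 V input:
total_off_err_input = ((chan_off_err**2 + adc_off_err**2) ** 0.5) / GAIN

print(f"+/-{total_gain_err:.2%} of reading, +/-{total_off_err_input:.2f} V offset")
```

With the placeholder ADC numbers that works out to roughly ±0.87% of reading ±1 V referred to the input, which is the “percent of reading ± offset” form I was expecting. Does the RSS treatment and the referral back through the gain look right?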
Thanks!