
In the original firmware the actual calculation was:
Vdisplay = (ADC * Calibration Factor) / 256
If we insert some typical values, then the ADC value for 13.8V will be around 2629, and the
default calibration value is 135, and thus:
Vdisplay = (2629 * 135) / 256
         = 354915 / 256
         = 1386.386
which, when divided by 100 displays as:
13.86V
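The original fixed-point arithmetic can be sketched in C as follows; the function name and types are illustrative assumptions, not symbols from the firmware source:

```c
#include <stdint.h>

/* Original-style calculation (illustrative name, not the real firmware
   symbol): adc is the raw ADC reading, cal is the 8-bit calibration
   factor, and the result is the displayed voltage in hundredths of a
   volt (the firmware divides by 100 before display). */
uint32_t display_centivolts_old(uint32_t adc, uint32_t cal)
{
    /* For adc = 2629 and cal = 135: (2629 * 135) / 256 = 1386 -> 13.86V */
    return (adc * cal) / 256u;
}
```

Because the divisor is only 256, a single step of the calibration factor moves the result by adc / 256 counts, which is the coarse step examined next.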
If the calibration factor were increased to 136, then the raw value would become:
1396.656
which would display as:
13.96V
In other words, the quantisation step introduced by the calibration factor is 103mV (a step of 1 in the factor changes the raw value by 2629 / 256 ≈ 10.27 hundredths of a volt), despite the inherent resolution of the measurement system being 5.25mV (13.8V spread across 2629 ADC counts).
If the measurement system’s calibration factor and calculation is changed, the overall
accuracy and resolution can be significantly improved.
The new calculation algorithm is:
Vdisplay = (ADC * Calibration Factor) / 10,000,000
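A matching sketch of the new arithmetic follows. The names are again illustrative, and the example calibration value of 5272727 is an assumption chosen so that 2629 counts display as 13.86V (the document does not state a new default). The product can exceed 32 bits, so a 64-bit intermediate is required:

```c
#include <stdint.h>

/* New-style calculation (illustrative name). The much larger divisor
   means each +/-1 step of the calibration factor changes the result by
   only adc / 10,000,000 centivolts, so the calibration step is no
   longer the limiting factor on resolution. */
uint32_t display_centivolts_new(uint32_t adc, uint32_t cal)
{
    /* e.g. adc = 2629, cal = 5272727 (assumed value): the product is
       13,861,999,283, which overflows uint32_t, hence the 64-bit cast;
       dividing by 10,000,000 gives 1386 -> 13.86V */
    return (uint32_t)(((uint64_t)adc * cal) / 10000000u);
}
```

With this scheme an adjacent calibration value produces an almost identical result, in contrast to the 103mV jump seen with the original divisor of 256.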
If we perform a worst-case analysis, then the lowest output voltage from the potential divider
will be when R27 is at its highest tolerance limit, and R28 at its lowest. In this case, the
nominal 13.8V input voltage will be scaled down to:
13.8 * 9900 / (9900 + 33330)
= 13.8 * 9900 / 43230
= 3.16031V
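The divider arithmetic above can be checked with a short sketch; the resistor roles (R27 on top, R28 on the bottom) and the tolerance-limit values of 33330R and 9900R are taken from the text:

```c
#include <math.h>

/* Potential divider output: r_top (R27) sits between the input and the
   tap, r_bottom (R28) between the tap and ground. */
double divider_out(double vin, double r_top, double r_bottom)
{
    /* divider_out(13.8, 33330.0, 9900.0) gives roughly 3.1603V,
       the worst-case low reading for a 13.8V input. */
    return vin * r_bottom / (r_top + r_bottom);
}
```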
Equally, the highest output voltage will occur when R28 is at its highest tolerance limit and
R27 at its lowest. In this case, the nominal 13.8V input voltage will be scaled down to: