Appendix A
Specifications
Offset temperature coefficient
Pregain.............................................±15 µV/°C
Postgain ...........................................±100 µV/°C
Gain temperature coefficient ..................±40 ppm/°C
Explanation of Analog Input Specifications
Relative accuracy is a measure of the linearity of an ADC. However,
relative accuracy is a tighter specification than a nonlinearity
specification.
Relative accuracy indicates the maximum deviation from a straight line for
the analog-input-to-digital-output transfer curve. If an ADC has been
calibrated perfectly, this straight line is the ideal transfer function, and the
relative accuracy specification indicates the worst deviation from the ideal
that the ADC permits.
A relative accuracy specification of ±1 LSB is roughly equivalent to, but
not the same as, a ±0.5 LSB nonlinearity or integral nonlinearity
specification because relative accuracy encompasses both nonlinearity and
variable quantization uncertainty, a quantity often mistakenly assumed to
be exactly ±0.5 LSB. Although quantization uncertainty is ideally
±0.5 LSB, it can be different for each possible digital code and is actually
the analog width of each code. Thus, it is more specific to use relative
accuracy as a measure of linearity than it is to use what is normally called
nonlinearity, because relative accuracy ensures that the sum of
quantization uncertainty and A/D conversion error does not exceed a
given amount.
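For illustration only, the following sketch shows how relative accuracy could be estimated from a measured transfer curve. The read_adc_code() helper, the 12-bit resolution, and the 0 to 10 V range are hypothetical and are not taken from the PCI-1200 specifications. Because the measured integer codes are compared against the ideal straight line itself, the result includes both nonlinearity and the per-code quantization uncertainty described above.

    import numpy as np

    N_BITS = 12                          # hypothetical resolution
    V_MIN, V_MAX = 0.0, 10.0             # hypothetical input range
    LSB = (V_MAX - V_MIN) / 2**N_BITS    # ideal analog width of one code

    def relative_accuracy_lsb(read_adc_code, points=4096):
        # Sweep the input range and record the digital output at each level.
        volts = np.linspace(V_MIN, V_MAX - LSB, points)
        measured = np.array([read_adc_code(v) for v in volts])
        # The ideal straight line, expressed in (fractional) codes.
        ideal_line = (volts - V_MIN) / LSB
        # Worst deviation of the transfer curve from the line, in LSBs.
        return np.max(np.abs(measured - ideal_line))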
Integral nonlinearity (INL) in an ADC is an often ill-defined specification
that is supposed to indicate a converter’s overall A/D transfer linearity. The
manufacturer of the ADC chip National Instruments uses on the PCI-1200
specifies its integral nonlinearity by stating that the analog center of any
code will not deviate from a straight line by more than ±1 LSB. This
specification is misleading because, although a particularly wide code’s
center may be found within ±1 LSB of the ideal, one of its edges may be
well beyond ±1.5 LSB; thus, the ADC would have a relative accuracy of
that amount. National Instruments tests its boards to ensure that they meet
all three linearity specifications defined in this appendix.
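One way the code-center definition quoted above could be checked is sketched below, assuming the analog transition levels between adjacent codes have already been measured. The transitions array, the endpoint-fit straight line, and the function name are assumptions made for illustration; they are not the chip manufacturer's or National Instruments' test procedure.

    import numpy as np

    def inl_lsb(transitions, lsb):
        # transitions[k] is the measured analog level at which the output
        # code changes from k to k + 1; a code's analog center lies midway
        # between its two transitions.
        centers = (transitions[:-1] + transitions[1:]) / 2.0
        codes = np.arange(len(centers))
        # Straight line through the first and last measured code centers.
        ideal = centers[0] + (centers[-1] - centers[0]) * codes / codes[-1]
        # Deviation of each code center from the line, in LSBs.
        return (centers - ideal) / lsb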
Differential nonlinearity (DNL) is a measure of deviation of code widths
from their theoretical value of 1 LSB. The width of a given code is the size
of the range of analog values that can be input to produce that code, ideally
1 LSB. A specification of ±1 LSB differential nonlinearity ensures that no
code has a width of 0 LSBs (that is, no missing codes) and that no code
width exceeds 2 LSBs.
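Using the same hypothetical transition measurements as in the INL sketch above, differential nonlinearity follows directly from the code widths: a value of -1 LSB would correspond to a missing code, and a ±1 LSB bound keeps every width between 0 and 2 LSBs, as described above.

    import numpy as np

    def dnl_lsb(transitions, lsb):
        # The analog width of each code is the spacing between consecutive
        # transitions; DNL is that width minus the ideal 1 LSB, in LSBs.
        widths = np.diff(transitions)
        return widths / lsb - 1.0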