© National Instruments Corporation
AT-MIO-16 User Manual
Chapter 4
Calibration Procedures
This chapter discusses the calibration procedures for the AT-MIO-16 analog input and analog
output circuitry.
The AT-MIO-16 is calibrated at the factory before shipment. To maintain the 12-bit accuracy of
the AT-MIO-16 analog input and analog output circuitry, check your board's analog input against
a precise voltage source. Recalibrate the board only if this check shows it is out of calibration.
The AT-MIO-16 is factory calibrated in its factory-default configuration:
•  DIFF analog input mode
•  -10 to +10 V analog input range (bipolar)
•  -10 to +10 V analog output range (bipolar with internal reference selected)
Whenever you change your board configuration, recalibrate your AT-MIO-16 board.
Calibration Equipment Requirements
For best measurement results, the AT-MIO-16 needs to be calibrated so that its measurement
accuracy is within ±0.012% of its input range (±1/2 LSB). According to standard practice, the
equipment you use to calibrate the AT-MIO-16 should be 10 times as accurate, that is, have
±0.001% rated accuracy. Practically speaking, calibration equipment with four times the
accuracy of the item under calibration is generally considered acceptable. Four times the
accuracy of the AT-MIO-16 is 0.003%. You need the following equipment to calibrate the
AT-MIO-16 board:
•  For analog input calibration, you need a precision variable DC voltage source (usually a
   calibrator) with these features:
   –  Accuracy: ±0.001% standard, ±0.003% sufficient
   –  Range: greater than ±10 V
   –  Resolution: 100 µV in the ±10 V range (5 1/2 digits)
•  For analog output calibration, you need a voltmeter with these features:
   –  Accuracy: ±0.001% standard, ±0.003% sufficient
   –  Range: greater than ±10 V
   –  Resolution: 100 µV in the ±10 V range (5 1/2 digits)
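The ±0.012% (±1/2 LSB) figure and the 10x and 4x calibrator accuracies quoted above follow
directly from the board's 12-bit resolution over a 20 V span. As a quick sanity check (a sketch,
assuming the factory-default bipolar -10 to +10 V range), the arithmetic can be reproduced as:

```python
# Verify the accuracy figures for a 12-bit converter on a -10 to +10 V range.
span_volts = 20.0      # -10 V to +10 V, i.e. a 20 V span
codes = 2 ** 12        # 4096 steps for a 12-bit converter

lsb = span_volts / codes                        # size of one LSB in volts
half_lsb_pct = 0.5 * lsb / span_volts * 100.0   # +/-1/2 LSB as % of range

print(f"1 LSB      = {lsb * 1e3:.4f} mV")             # 4.8828 mV
print(f"1/2 LSB    = {half_lsb_pct:.4f} % of range")  # 0.0122 %
print(f"10x better = {half_lsb_pct / 10:.4f} %")      # 0.0012 %
print(f"4x better  = {half_lsb_pct / 4:.4f} %")       # 0.0031 %
```

The 1/2 LSB result (about 0.0122%) rounds to the ±0.012% specification, and dividing it by 10
and by 4 yields the ±0.001% standard and ±0.003% sufficient calibrator accuracies listed above.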