Table 22-2. Calibration Reference Voltages (1)

CAL_EN | BRIDGE_EN | HILO | S1 | S2 | S3 | S4 | S5 | Reference Voltage
   1   |     0     |  0   |  1 |  0 |  1 |  0 |  0 | (ADREFHI × R1 + ADREFLO × R2) / (R1 + R2)
   1   |     0     |  1   |  0 |  1 |  0 |  1 |  0 | (ADREFLO × R1 + ADREFHI × R2) / (R1 + R2)
   1   |     1     |  0   |  0 |  1 |  1 |  0 |  0 | ADREFLO
   1   |     1     |  1   |  1 |  0 |  0 |  1 |  0 | ADREFHI
   0   |     X     |  X   |  0 |  0 |  0 |  0 |  1 | VIN

(1) The state of the switches in this table assumes that self-test mode is not enabled.
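For example, if the two bridge resistors are equal (R1 = R2), the first two rows both reduce to the same mid-point voltage, (ADREFHI + ADREFLO) / 2, so any deviation of the converted result from the ideal mid-scale code is directly attributable to offset error.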
When CAL_ST (ADCALCR.16) is set, a calibration conversion is started. The voltage source selected via
the bits BRIDGE_EN and HILO is converted once (single conversion mode) and the digital result is
returned to the calibration and correction register, ADCALR, where it can be read by the CPU. The
CAL_ST bit acts as a flag and must be polled by the CPU. It is held set during the conversion process and
automatically clears to indicate the end of the reference voltage conversion.
NOTE:
No Interrupt for end of calibration
The ADC does not generate an interrupt to signal the end of the calibration conversion. The
application must poll the CAL_ST bit to determine the end of the calibration conversion.
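The start-and-poll protocol can be sketched in C as follows. This is a minimal sketch, not device-verified code: the base address and register offsets below are placeholders that must be taken from the device-specific data sheet; only the bit positions (CAL_EN = ADCALCR.0, HILO = ADCALCR.8, BRIDGE_EN = ADCALCR.9, CAL_ST = ADCALCR.16) come from this section.

#include <stdint.h>

#define ADC_BASE   0xFFF7C000u                                  /* assumption: device-specific base */
#define ADCALCR    (*(volatile uint32_t *)(ADC_BASE + 0x64u))   /* hypothetical offset */
#define ADCALR     (*(volatile uint32_t *)(ADC_BASE + 0x68u))   /* hypothetical offset */

#define CAL_EN     (1u << 0)    /* ADCALCR.0  */
#define HILO       (1u << 8)    /* ADCALCR.8  */
#define BRIDGE_EN  (1u << 9)    /* ADCALCR.9  */
#define CAL_ST     (1u << 16)   /* ADCALCR.16 */

/* Convert one calibration reference, selected by source_bits (a
 * combination of BRIDGE_EN and HILO), and return the raw result. */
static uint32_t cal_convert(uint32_t source_bits)
{
    ADCALCR = CAL_EN | source_bits;   /* enable calibration, select source */
    ADCALCR |= CAL_ST;                /* start the calibration conversion  */
    while (ADCALCR & CAL_ST) {
        ;                             /* no end-of-calibration interrupt:
                                         the CPU must poll CAL_ST          */
    }
    return ADCALR;                    /* digital result of the reference   */
}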
After the CAL_ST bit is set by the application program, it can only be cleared by the ADC core at the end of the ongoing conversion. If the calibration conversion is interrupted (the CAL_EN bit is cleared), the CAL_ST bit is held at 1 until a new calibration conversion has been started and completed. Setting the CAL_ST bit while calibration is disabled (CAL_EN = 0) has no effect; however, in this situation, setting CAL_EN immediately starts a calibration conversion. When the calibration conversion is interrupted by disabling the ADC (ADC_EN = 0 while CAL_EN = 1 and CAL_ST = 1), a new conversion is automatically restarted as soon as the ADC enable bit is set again (ADC_EN = 1).
22.2.6.1.2 Calibration and Offset Error Correction Sequences
The number of measurements and the sources to measure for an ADC calibration are application dependent. The CAL_ST bit must be set for each calibration source to be measured. While calibration mode is enabled, any of the available calibration sources can be converted according to the BRIDGE_EN and HILO bits (see Table 22-2). The digital results of the calibration measurements should be read from
ADCALR by the application after each reference conversion so that a correction value can be computed
and written back into ADCALR.
When the application has the necessary calibration data, it should compute the offset error correction
value and load it into the calibration and correction register, ADCALR. After the CAL_EN bit is cleared,
normal conversion mode restarts, continuing from where it was frozen, but with the addition of self-
correction data.
In normal mode, the self-correction system adds the correction value stored in ADCALR to each digital
result before it is written to the respective group’s FIFO.
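As a worked example, assuming a 12-bit converter where the mid-point reference ideally yields code 0x800: if the calibration conversions of the bridge voltage average to 0x805, the offset error is +5 LSBs, and the application writes the 2s complement of that error (-5) into ADCALR so that the self-correction logic subtracts 5 LSBs from every subsequent result. The width of the field holding this value is device-specific.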
The basic calibration routine is as follows (a code sketch of the complete sequence follows the list):
1. Enable calibration via CAL_EN (ADCALCR.0).
2. Select the voltage source via BRIDGE_EN and HILO (ADCALCR.9:8).
3. Start the conversion with CAL_ST (ADCALCR.16).
4. Wait for CAL_ST to go to 0.
5. Get the results from ADCALR and save to memory.
6. Loop to step 2 until the calibration conversion data is collected for the desired reference voltages.
7. Compute the error correction value using calibration data saved in memory.
8. Load the ADCALR register with the 2s complement of the computed error correction value.
9. Disable calibration mode.
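The sketch below walks through these nine steps, reusing the hypothetical cal_convert() helper and register macros from the earlier fragment. Measuring only the two bridge voltages, assuming R1 = R2, and using 0x800 as the ideal mid-scale code are illustrative application choices, not requirements of the hardware.

/* Offset calibration following the nine-step routine above. */
static void adc_offset_calibrate(void)
{
    uint32_t measured;
    int32_t  error;

    /* Steps 1-6: cal_convert() sets CAL_EN, selects the source, starts
       the conversion, and polls CAL_ST. With R1 = R2, both bridge
       orientations ideally read mid-scale, so the two results are
       averaged directly instead of being staged in memory. */
    measured = (cal_convert(0u) + cal_convert(HILO)) / 2u;

    /* Step 7: offset error in LSBs relative to the ideal mid-scale code. */
    error = (int32_t)measured - 0x800;

    /* Step 8: load the 2s complement of the error; in normal mode the
       self-correction logic adds this value to each digital result. */
    ADCALR = (uint32_t)(-error);

    /* Step 9: clear CAL_EN to leave calibration mode; normal conversions
       resume with the correction applied. */
    ADCALCR = 0u;
}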