
Calibration
Introduction: Calibration of Smart Instruments
Analog instruments generally have only one interface that can be calibrated by the user. A zero and span output
calibration is normally performed at the corresponding two input conditions. Zero/Span calibration is very simple to
use, but provides little versatility. If the 0% and 100% input conditions are not available to the user, a calibration can
sometimes be accomplished, but the gain and offset adjustments will likely interact, requiring considerable iteration
to achieve accuracy. In contrast, intelligent instruments have many interfaces that can be calibrated or scaled by the
user, with consequent increased versatility.
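As an illustration of the two-point zero/span adjustment described above, the following sketch (not part of the DLC3010; the function name and values are assumed for illustration only) computes a linear gain and offset from two known input/output pairs:

    # Illustrative only: a generic two-point (zero/span) calibration,
    # assuming a linear instrument response.
    def two_point_calibration(in_low, out_low, in_high, out_high):
        """Return (gain, offset) such that output = gain * input + offset."""
        gain = (out_high - out_low) / (in_high - in_low)
        offset = out_low - gain * in_low
        return gain, offset

    # Example: a 4-20 mA output trimmed against 0% and 100% level inputs.
    gain, offset = two_point_calibration(0.0, 4.0, 100.0, 20.0)
    print(gain, offset)  # 0.16 mA per percent of level, 4.0 mA at zero

When the true 0% and 100% input conditions cannot be produced, the two unknowns cannot be solved from a single pair of readings, which is why the gain and offset adjustments of an analog instrument interact and must be iterated.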
Refer to table 4‐5 for a list of relationships in the DLC3010 that can be calibrated or configured by the user. Note that
not all relationships are listed here.
Table 4‐5. Relationships in the FIELDVUE DLC3010 that can be User Calibrated or Configured
Torque Tube Rate
The scale factor between the internal digital representation of the measured pilot shaft rotation and the physical torque
input to the sensor.
Zero Reference Angle
The angle of pilot shaft rotation associated with the zero buoyancy condition in Interface or Density mode, or the zero
differential buoyancy condition in Level mode. (The zero reference for the input of the PV calculation).
Driver Rod Length
The scale factor (moment arm) between a force input to the sensor driver rod and the torque developed as input to the
torque tube.
Displacer Volume
The scale factor relating the density of the process fluid to the maximum force that can be produced as an input to the
driver rod of the sensor.
SG (Specific Gravity)
The density of the process fluid normalized to the density of water at reference conditions. The scale factor that
transforms displacer volume and measured buoyancy into a level signal normalized to displacer length.
Displacer Length
The scale factor to convert normalized level to level on the displacer in engineering units.
Level Offset
The zero reference for the output of the PV calculation, referred to the location of the bottom of the displacer.
URV (Upper Range Value)
The value of the computed process variable at which a 20 mA output (100% Range) is desired.
LRV (Lower Range Value)
The value of the computed process variable at which a 4 mA output (0% Range) is desired.
D/A Trim
The gain and offset of the D/A converter, which converts the digital output commands into the analog output current.
Instrument Temperature Offset
Bias to improve the accuracy of the ambient temperature measurement used to provide temperature compensation for
the mechanical‐to‐electronic transducer.
Proc Temp Offset
Bias to improve the accuracy of the RTD temperature measurement used to provide compensation for
process‐temperature‐related density changes.
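For a sense of how these relationships chain together in a simple Level-mode application, the sketch below follows the measurement from pilot shaft rotation to loop current. It is only an illustration: the function names, units, numeric values, and the linearized math are assumptions, not the DLC3010's internal algorithm.

    # Illustrative sketch only. Names, units, and the simplified linear math
    # are assumptions used to show how the Table 4-5 relationships relate;
    # this is not the DLC3010 firmware algorithm.

    RHO_WATER = 999.0   # kg/m^3, water density at reference conditions (assumed)
    G = 9.80665         # m/s^2, standard gravity

    def level_pv(shaft_angle_deg, zero_ref_angle_deg, torque_tube_rate,
                 driver_rod_length, displacer_volume, specific_gravity,
                 displacer_length, level_offset):
        """Convert measured pilot shaft rotation into a level PV (meters)."""
        # Rotation relative to the Zero Reference Angle, scaled by the Torque
        # Tube Rate (here assumed N*m per degree), gives the torque input.
        torque = (shaft_angle_deg - zero_ref_angle_deg) * torque_tube_rate
        # Dividing by the Driver Rod Length (moment arm) recovers the buoyancy force.
        buoyancy = torque / driver_rod_length
        # Displacer Volume and SG set the buoyancy of a fully submerged displacer.
        max_buoyancy = RHO_WATER * specific_gravity * G * displacer_volume
        # Normalized level, scaled by Displacer Length and shifted by Level Offset.
        return (buoyancy / max_buoyancy) * displacer_length + level_offset

    def loop_current_mA(pv, lrv, urv, da_gain=1.0, da_offset_mA=0.0):
        """Map the PV onto 4-20 mA using LRV/URV, then apply a simple D/A trim."""
        ideal = 4.0 + 16.0 * (pv - lrv) / (urv - lrv)
        return da_gain * ideal + da_offset_mA

    # Hypothetical numbers: a 0.36 m displacer ranged over its full length.
    pv = level_pv(shaft_angle_deg=2.5, zero_ref_angle_deg=0.0,
                  torque_tube_rate=0.5, driver_rod_length=0.2,
                  displacer_volume=0.001, specific_gravity=1.0,
                  displacer_length=0.36, level_offset=0.0)
    print(loop_current_mA(pv, lrv=0.0, urv=0.36))  # roughly 14.2 mA

The Instrument Temperature Offset and Proc Temp Offset entries would appear in such a chain as simple bias corrections applied to the temperature measurements before compensation; they are omitted here to keep the sketch short.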
These parameters are factory‐set to the most common values for the 249 sensors. Therefore, for the bulk of units sold
in simple level applications, it is possible to accept the defaults and run a Min/Max Calibration. If any of the advanced
features of the instrument are to be used, accurate sensor and test fluid information should generally be entered
before beginning the calibration.
Primary
Guided Calibration
Field Communicator
Configure > Calibration > Primary > Guided Calibration (2-4-1-1)
Guided Calibration recommends an appropriate calibration procedure for use in the field or on the bench based on
your input. Answer questions about your process scenario to reach the calibration recommendation. When feasible,
the appropriate calibration method will be invoked from within the procedure.