O & M Manual
Revision K (1/07)
Part 5 – Calibration
5.1 Overview and Methods
Calibration of the Q45C2 is required to accurately match the sensor
characteristics to the monitor/analyzer. Since the output of the conductivity
sensor does not degrade over time, the sensor typically needs to be calibrated
only at initial installation and then cleaned periodically to maintain proper
system accuracy.
Since the conductivity of a solution is greatly affected by temperature, proper
settings for thermal compensation are critical for accurate operation. Before
calibrating the instrument for the first time, it is important to select the proper
temperature compensation parameters in the configuration menus.
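To illustrate why the compensation settings matter, the sketch below applies the common linear temperature-compensation formula used for many aqueous solutions. This is a generic textbook relationship, not the Q45C2's internal algorithm, and the coefficient value is an assumed example.

```python
# Illustrative linear temperature compensation for conductivity readings.
# ALPHA is an assumed example coefficient (~2% per degC is typical for
# many aqueous solutions); the real instrument setting may differ.

REFERENCE_TEMP_C = 25.0   # conductivity is conventionally reported at 25 degC
ALPHA = 0.02              # assumed linear temperature coefficient (per degC)

def compensate(raw_conductivity_us: float, temp_c: float) -> float:
    """Convert a raw reading taken at temp_c to its 25 degC equivalent."""
    return raw_conductivity_us / (1.0 + ALPHA * (temp_c - REFERENCE_TEMP_C))

# A 1300 uS reading taken at 30 degC corresponds to about 1181.8 uS at 25 degC.
print(round(compensate(1300.0, 30.0), 1))
```

Note how a 5 °C error with no compensation would shift the reported value by roughly 10%, which is why the compensation method must be configured before the first calibration.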
When using conductivity calibration standards for a wet calibration, take care not
to inadvertently contaminate the reference solution; always thoroughly clean the
sensor, rinse it in tap water, and then finish with a rinse in pure or de-ionized
water. In addition, note that calibration solutions below 20 µS can be very
unstable and are accurate only at the labeled reference temperature. Moving the
sensor back and forth between reference solutions of different conductivity
values can quickly contaminate the solutions and render them inaccurate.
The system provides two methods of conductivity calibration: 1-point (wet
calibration) and cell constant. These two methods are significantly different. In
addition, a sensor zero-cal is used on initial installation to set the range zeros for
the sensor used. See Sections 5.11 and 5.12 for brief descriptions of their
uses.
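The cell constant method mentioned above relies on the standard relationship between the conductance a sensor measures and the conductivity it reports: conductivity equals conductance multiplied by the cell constant, a property of the sensor's electrode geometry. The sketch below shows that relationship; the cell constant value is an assumed example, not a Q45C2 specification.

```python
# Illustrative conductance-to-conductivity conversion via cell constant.
# The cell constant K (in 1/cm) depends on the sensor geometry; K = 0.5
# here is an assumed example value.

def conductivity_us_per_cm(conductance_us: float, cell_constant_per_cm: float) -> float:
    """Conductivity (uS/cm) = measured conductance (uS) x cell constant (1/cm)."""
    return conductance_us * cell_constant_per_cm

# With an assumed K = 0.5 /cm, a 200 uS conductance reads as 100 uS/cm.
print(conductivity_us_per_cm(200.0, 0.5))
```

Entering the correct cell constant thus scales every reading across the range, which is why it is an alternative to a wet calibration against a reference solution.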
5.11 1-Point Calibration Explained
The 1-point calibration method is generally known as the "grab sample"
calibration method. In the 1-point calibration method, the sensor may be
removed from the application and placed into a reference solution. It may also
be left in the measurement process and calibrated by reference. The 1-point
calibration adjusts the sensor slope to match the exact calibration point.
Readings beyond that point are then extrapolated from the determined slope of
the calibration line. Since the sensor slope does not degrade over time, frequent
re-calibration is unnecessary. Calibration accuracy can be optimized by
calibrating with a reference solution that is close to the values typically
measured.
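The slope adjustment described above can be sketched as follows, assuming a linear sensor response through zero. The function names and values are illustrative, not the Q45C2's internal routine.

```python
# Sketch of a 1-point (grab-sample) slope calibration, assuming a linear
# sensor response through zero: one reference point fixes the slope, and
# all other readings extrapolate along that slope.

def compute_slope(raw_reading: float, reference_value: float) -> float:
    """Slope factor that maps the raw reading onto the known reference value."""
    if raw_reading <= 0:
        raise ValueError("raw reading must be positive for slope calibration")
    return reference_value / raw_reading

def apply_slope(raw_reading: float, slope: float) -> float:
    """Scale any subsequent raw reading by the calibrated slope."""
    return raw_reading * slope

# Example: the sensor reads 980 uS in a 1000 uS reference solution.
slope = compute_slope(980.0, 1000.0)
# A later raw reading of 490 uS is then reported as 500.0 uS.
print(round(apply_slope(490.0, slope), 1))
```

Calibrating near the typical operating point keeps the extrapolation error small, which is the reasoning behind choosing a reference solution close to the values normally measured.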