Part 5 – Calibration
5.1 Overview and Methods
Calibration of the Q45CT is required to accurately match the sensor
characteristics to the monitor/analyzer. Since the output of the conductivity
sensor does not degrade over time, it is typically only required that the sensor be
calibrated at initial installation and then cleaned periodically to maintain proper
system accuracy.
It is important for the user to establish a periodic cleaning and calibration-check
schedule to keep the system operating at high accuracy.
Since the conductivity of a solution is greatly affected by temperature, proper
settings for thermal compensation are critical for accurate operation. Before
calibrating the instrument for the very first time, it is important to select the proper
operating parameters in the configuration menus for temperature compensation
methods. Also at initial installation, a temperature calibration must be performed
before conductivity can be calibrated.
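As an illustration of why the compensation settings matter, the sketch below applies a
generic linear correction to a 25 °C reference temperature. This is a hypothetical
example only; the coefficient value (2 %/°C) and the simple linear model are
assumptions, and the compensation methods selectable in the Q45CT may differ.

```python
# Generic linear temperature compensation to a 25 deg C reference value.
# NOTE: illustrative sketch only -- the coefficient and the linear model are
# assumptions; the Q45CT's selectable compensation methods may differ.

def compensate_to_25c(measured_cond, temp_c, alpha_per_c=0.02):
    """Return the 25 deg C equivalent of a raw conductivity reading.

    measured_cond -- raw conductivity at the process temperature
    temp_c        -- process temperature in deg C
    alpha_per_c   -- assumed linear coefficient (2 %/deg C is a common default)
    """
    return measured_cond / (1.0 + alpha_per_c * (temp_c - 25.0))

# Example: 1200 uS measured at 35 deg C corresponds to 1000 uS at 25 deg C.
print(round(compensate_to_25c(1200.0, 35.0), 1))
```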
When using conductivity calibration standards for a wet calibration, take care not
to inadvertently contaminate the reference solution; always clean the sensor
thoroughly, rinse it off in tap water, and then finish with a rinse in pure or
de-ionized water. In addition, note that calibration solutions less than 200 μS or
greater than 100 mS can be very unstable. Moving the sensor back and forth
between reference solutions of different conductivity values can quickly
contaminate the solutions and render them inaccurate.
The system provides two methods of conductivity calibration: 1-point (wet
calibration) and cell constant. These two methods are significantly different. In
addition, a sensor zero-cal is used on initial installation to set the range zeros for
the sensor used. See Sections 5.11 through 5.12 for brief descriptions of their
uses.
5.11 1-Point Calibration Explained
The 1-point calibration method is generally known as the "grab sample"
calibration method. In the 1-point calibration method, the sensor may be
removed from the application and placed into a reference solution. It may
also be left in the measurement process and calibrated by reference. The
1-point calibration adjusts the sensor slope to match the exact calibration
point. Readings beyond that point are then extrapolated from the
determined slope of the calibration line. Since the sensor slope does not
degrade over time, frequent re-calibration is unnecessary. Calibration
accuracy can be optimized by calibrating with a reference solution which is
close to the values typically measured.
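As a hypothetical illustration of the slope adjustment described above, the sketch
below derives a gain factor from a single reference reading and applies it to later
measurements. The variable names and the simple proportional model are assumptions
made for illustration only; the analyzer performs this calculation internally.

```python
# Hypothetical sketch of the 1-point (grab sample) slope adjustment.
# NOTE: the analyzer performs this internally; names and the simple
# proportional model are assumptions made for illustration only.

def one_point_slope(raw_reading, reference_value):
    """Slope (gain) factor that maps the raw sensor reading onto the
    known value of the reference solution or grab sample."""
    return reference_value / raw_reading

def corrected_reading(raw_reading, slope):
    """Apply the stored slope to later raw readings; values away from
    the calibration point are extrapolated along the same line."""
    return raw_reading * slope

# Example: sensor reads 980 uS in a 1000 uS reference solution.
slope = one_point_slope(980.0, 1000.0)             # ~1.020
print(round(corrected_reading(980.0, slope), 1))   # 1000.0 at the cal point
print(round(corrected_reading(2500.0, slope), 1))  # ~2551.0, extrapolated
```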