7. CALIBRATION:
7.1 Prepare standard solutions for at least 2 calibration points. If measurements are to be made in the acidic range, select 2 points between pH −2 and 7. If measurements are made in the alkaline range, select 2 points between pH 7 and 16. If the full range is required, calibrate with at least 3 to 5 calibration points between pH −2 and 16.
7.2 The buffer standard must be set during SETUP (see page 6), under BuF.
7.3 Standard solutions come in 3 groups – ISO, NIST and custom standards.
ISO buffer standards are pH 1.68, 4.01, 7.00, 10.01 and 12.45.
NIST buffer standards are pH 1.68, 4.01, 6.86, 9.18 and 12.45.
Custom standards are defined manually for each buffer standard.
7.4 Keep in mind that all buffer standards are specified at 25°C. The calibration value should be set to the exact value of the buffer solution at its current temperature.
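The correction in 7.4 amounts to looking up the buffer's pH at the measured temperature instead of its nominal 25°C value. The Python sketch below is an illustration of that lookup only, not the meter's internal method; the table values are hypothetical and should be replaced with the temperature table supplied with your buffer.

```python
# Illustration only -- not part of the HP 9010 firmware.
# Interpolates a buffer's pH at the measured temperature from the
# temperature table supplied with the buffer (values below are hypothetical).

def buffer_value_at(temp_c, table):
    """Linearly interpolate the buffer pH at temp_c from a {temp_degC: pH} table."""
    points = sorted(table.items())
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    for (t0, ph0), (t1, ph1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return ph0 + (ph1 - ph0) * (temp_c - t0) / (t1 - t0)

# Hypothetical table for a nominal pH 7.00 buffer.
example_table = {10: 7.06, 25: 7.00, 40: 6.97}
print(buffer_value_at(32.5, example_table))  # pH value to enter at 32.5 degC
```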
Calibration with ISO Standard
7.5 Make sure the unit is properly installed and the electrode is connected.
7.6 This meter can automatically recognize the ISO standard buffer solutions. You will need at least 2 buffer solutions. Always begin with the "Offset" calibration.
7.7 Rinse the pH electrode and temperature probe in distilled water, then dip them in the buffer solutions: first the offset buffer (pH 7.00), then each of the slope buffer solutions in turn.
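For context, the "Offset" calibration at pH 7.00 establishes the electrode's zero-point reading, and each slope buffer establishes its response in mV per pH unit; the meter uses those two values to convert raw readings into pH. The Python sketch below illustrates that arithmetic conceptually; it is not the HP 9010's actual firmware, and all readings shown are hypothetical. The ideal electrode slope at 25°C is about −59.16 mV/pH.

```python
# Conceptual illustration of a two-point offset/slope pH calibration.
# Not the HP 9010's internal algorithm; all readings are hypothetical.

IDEAL_SLOPE_MV_PER_PH = -59.16  # theoretical electrode slope at 25 degC

def calibrate(offset_mv, slope_buffer_ph, slope_buffer_mv):
    """Return (offset, slope) from the pH 7.00 reading and one slope buffer reading."""
    offset = offset_mv  # mV the electrode reports in the pH 7.00 buffer
    slope = (slope_buffer_mv - offset_mv) / (slope_buffer_ph - 7.00)  # mV per pH unit
    return offset, slope

def mv_to_ph(sample_mv, offset, slope):
    """Convert a raw electrode reading (mV) to pH using the stored calibration."""
    return 7.00 + (sample_mv - offset) / slope

# Hypothetical readings: 5 mV in the pH 7.00 buffer, 182 mV in the pH 4.01 buffer.
offset, slope = calibrate(5.0, 4.01, 182.0)
print(f"slope efficiency: {100 * slope / IDEAL_SLOPE_MV_PER_PH:.1f}%")
print(f"sample at -112 mV reads pH {mv_to_ph(-112.0, offset, slope):.2f}")
```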
7.8 Press the CAL key for 2 seconds; the display will show 7.00 with CAL blinking, indicating that the meter is in calibration mode.
* Calibration should be performed regularly to ensure accurate measurement; how often depends on the frequency of tests performed. Additional calibration solution should be purchased for future needs.
Always rinse the electrode with distilled water before and after each test. This prevents solution carry-over and cross-contamination. Standard solutions must be kept at the highest purity; otherwise the meter's accuracy could be compromised.