6. Technical Data
6.1 Resolution and Accuracy (Definition)
A distinction must be made between the resolution and the accuracy of a measuring system. The
two parameters are not directly interdependent and may differ from each other.
Resolution
The resolution of a linear measuring system is the smallest displacement of the scanning head
relative to the scale that can still be discerned by the evaluation electronics (display, control).
It depends on (see Chart 1):
• the graduation period of the scale
• the signal interpolation factor (internal or in an auxiliary electronic unit)
• the evaluation mode in the counter
Accuracy
The accuracy of linear measuring systems is specified in accuracy classes.
The extreme error values over any one-meter section of the measured length lie within the specified
accuracy class of ±a µm with respect to their mean value.
For measuring lengths up to 1 m, the tolerance (±a µm) refers to the actual measuring length. The
accuracy applies at a reference temperature of 20 °C.
With exposed linear measuring systems, the definition of the accuracy class applies only to the scale.
This is called scale accuracy.
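The one-meter-section criterion above can be sketched numerically. In the hypothetical helper below, `positions_mm` and `errors_um` represent position errors sampled along the scale; the function name and signature are assumptions for illustration, not part of the product documentation.

```python
def within_accuracy_class(positions_mm, errors_um, a_um):
    """Check the accuracy-class criterion on sampled position errors.

    Within any one-meter (1000 mm) section of the measured length, the
    extreme error values must lie within +/- a_um of their mean value
    over that section.
    """
    for start in positions_mm:
        # errors sampled inside the 1 m window beginning at `start`
        window = [e for p, e in zip(positions_mm, errors_um)
                  if start <= p <= start + 1000.0]
        mean = sum(window) / len(window)
        if max(abs(e - mean) for e in window) > a_um:
            return False
    return True
```

Note that the criterion is relative to the mean error of each section, so a slow, uniform drift of the error curve does not by itself violate the accuracy class.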
Grating period of scale tape: 20 µm
Signal period of sinusoidal signals: 20 µm

Interpolation  | Signal period        | Resolution after evaluation in counter
factor         | after interpolation  | 2-times        | 4-times
none           | 20 µm                | 10 µm          | 5 µm
5-times        | 4 µm                 | 2 µm           | 1 µm
10-times       | 2 µm                 | 1 µm           | 0.5 µm
25-times       | 0.8 µm               | 0.4 µm         | 0.2 µm
50-times       | 0.4 µm               | 0.2 µm         | 0.1 µm
100-times      | 0.2 µm               | 0.1 µm         | 0.05 µm

Chart 1
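The values in Chart 1 reduce to a simple division chain: the signal period equals the grating period, interpolation subdivides it, and the counter's evaluation mode subdivides it once more. A minimal sketch (the function name `resolution_um` is an assumption, not a documented API):

```python
def resolution_um(grating_period_um, interpolation_factor, evaluation_factor):
    """Resolution after evaluation in the counter.

    The signal period of the sinusoidal signals equals the grating
    period of the scale tape; the interpolation factor subdivides that
    period, and the evaluation mode (2-times or 4-times) subdivides it
    once more. Pass interpolation_factor=1 for "none".
    """
    signal_period_um = grating_period_um / interpolation_factor
    return signal_period_um / evaluation_factor

# e.g. 20 um grating, 25-times interpolation, 4-times evaluation -> 0.2 um
```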
KIT L-Series