3.4 Status indicators
A 7-inch display is mounted in the driver's cab so that the driver can see the scene captured by the sensor head.
It provides a real-time image of the scene along with information about the defined alarm zones and the position of the sensor head on the vehicle.
When an object enters an alarm zone, this is indicated on the display by a change in the color of the image frame and by an audible signal from the built-in speaker.
The system is configured with the help of the display.
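The following minimal Python sketch merely illustrates this indication behavior; the color values and the set_frame_color/beep callbacks are assumptions made for the example and are not part of the product software.

```python
def update_indication(object_in_zone: bool, set_frame_color, beep):
    """Illustrative indication logic only; set_frame_color and beep are
    hypothetical callbacks, not the product's real API."""
    if object_in_zone:
        set_frame_color("red")      # image frame changes color on the display
        beep()                      # audible signal from the built-in speaker
    else:
        set_frame_color("neutral")  # normal state: no alarm indication
```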
IMPORTANT!
Kit C is supplied with two displays.
3.5 Functionality
The driver assistance system works according to the principle of stereoscopy: two images of a scene, captured from different angles, are overlaid.
Using suitable algorithms, the overlaid images are combined into a spatial reproduction of the scene, comparable to the way the human eye perceives depth.
Within the resulting 3D detection area, the location of objects in space can be identified and then matched against the configured alarm zones.
If an object is located inside one of the two configured alarm zones, an alarm is output
via the display in the driver's cab and the driver can respond as appropriate.
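To illustrate the idea of matching detected object positions against configured alarm zones, the following Python sketch checks 3D points against simple box-shaped zones. The zone shape, coordinate values, and point format are assumptions made for this example only and do not reflect the actual zone configuration of the device.

```python
from dataclasses import dataclass

@dataclass
class AlarmZone:
    """Axis-aligned box in vehicle coordinates (metres) - illustrative only."""
    name: str
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    z_min: float
    z_max: float

    def contains(self, point):
        x, y, z = point
        return (self.x_min <= x <= self.x_max and
                self.y_min <= y <= self.y_max and
                self.z_min <= z <= self.z_max)

def triggered_zones(points_3d, zones):
    """Return the names of all zones containing at least one measured 3D point."""
    hits = set()
    for p in points_3d:
        for zone in zones:
            if zone.contains(p):
                hits.add(zone.name)
    return hits

# Hypothetical zones: a near "stop" zone and a wider "warning" zone.
zones = [
    AlarmZone("stop",    0.0, 2.0, -1.5, 1.5, 0.3, 2.0),
    AlarmZone("warning", 2.0, 5.0, -2.0, 2.0, 0.3, 2.0),
]
print(triggered_zones([(1.2, 0.4, 1.0), (6.0, 0.0, 1.0)], zones))  # -> {'stop'}
```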
3.5.1 Principle of operation
Stereoscopy is based on a pair of images being reconciled by a single processing unit.
In human beings, each of the two eyes supplies one half of the pair of images, and the brain reconciles them to create a single stereoscopic depth image. The (very slightly) different viewing angles thus make it possible to see spatially and thereby to allocate distances.
This principle can be simulated by technology. In a driver assistance system, it is used to determine how far away a vehicle is from objects in its path. The depth image is taken by the sensor head and then processed by the evaluation unit for threat assessment. During processing, 3D data is assigned to the scene in order to facilitate checking against a virtual spatial zone (alarm zone).
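The distance measurement underlying this principle can be summarized by the standard stereo relation Z = f · B / d (focal length f, baseline B between the two cameras, disparity d). The short Python sketch below illustrates this relation only; the numerical values are invented and do not describe the actual Visionary-B optics.

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Standard stereo relation: Z = f * B / d.

    disparity_px     - horizontal pixel offset of the same object point
                       between the left and right image
    focal_length_px  - camera focal length expressed in pixels
    baseline_m       - distance between the two camera centres in metres
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return focal_length_px * baseline_m / disparity_px

# Invented example values: f = 700 px, B = 0.12 m, d = 28 px  ->  Z = 3.0 m
print(depth_from_disparity(28, 700, 0.12))
```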