5 HP 4x DDR IB Switch Module for c-Class BladeSystems
Overview
The 4x DDR IB switch module is a double-wide switch module for the HP BladeSystem c-Class
enclosure. It is based on the Mellanox 24-port InfiniScale III 4x DDR InfiniBand switch chip.
When an IB mezzanine HCA is plugged into the c-Class server blade, the mezzanine HCA is
connected to the IB switch through the mid-plane in the c-Class enclosure. For more information
on the c-Class enclosure, see the Servers and Workstations Overview, or go to the HP BladeSystem
web page at
http://h71028.www7.hp.com/enterprise/cache/80316-0-0-225-121.aspx. For more
information on the 4x DDR IB Mezzanine HCA, see Section 8.9.2 (page 105).
The 4x DDR IB switch module provides 24 InfiniBand 4x DDR ports with 20 Gb/s port-to-port
connectivity. The ports are arranged as 16 downlinks, which connect up to 16 server blades in
the enclosure, and eight uplinks, which connect to external InfiniBand switches to build an
InfiniBand fabric. All links conform to InfiniBand Trade Association (IBTA) specifications.
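With all 16 downlinks active and all eight uplinks cabled to the external fabric, the enclosure
edge is oversubscribed 2:1 in the worst case. The short Python sketch below is only a worked
illustration of that port arithmetic, assuming every port runs at the same 4x DDR rate:

    # Worst-case oversubscription at the enclosure edge: the traffic that 16
    # fully loaded downlinks can offer versus the capacity of the 8 uplinks.
    # All ports run at the same 4x DDR rate, so the ratio reduces to port counts.
    downlinks = 16   # internal ports to server blades
    uplinks = 8      # external ports toward the rest of the fabric
    print(f"Oversubscription: {downlinks // uplinks}:1")   # prints "Oversubscription: 2:1"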
Voltaire Grid Switch family products include the GridVision fabric and device manager software
stack, which runs on an embedded processor in the internally managed switch. GridVision
provides comprehensive and powerful management capabilities, delivering real-time, proactive
management through the following:
•   Aggregated fabric and resource views
•   Access to a suite of fabric and switch diagnostics
•   Failover management at all levels
•   Provisioning of InfiniBand fabrics and the attached server, networking, and storage resources
These management capabilities can be accessed through the command-line interface (CLI), a
graphical user interface (GUI), or Simple Network Management Protocol (SNMP) managers, or
in-band over IP over InfiniBand (IPoIB).
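For example, any standard SNMP manager can poll the switch once its management address is
known. The following Python sketch uses the third-party pysnmp library with a placeholder
management IP address and read community; neither value comes from this document, so
substitute the values configured for your switch:

    # Minimal SNMP query of the switch module's system description (sysDescr).
    # The address and community string below are placeholders.
    from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                              ContextData, ObjectType, ObjectIdentity, getCmd)

    error_indication, error_status, error_index, var_binds = next(getCmd(
        SnmpEngine(),
        CommunityData('public', mpModel=1),        # SNMPv2c read community (placeholder)
        UdpTransportTarget(('192.0.2.10', 161)),   # placeholder management IP address
        ContextData(),
        ObjectType(ObjectIdentity('SNMPv2-MIB', 'sysDescr', 0)),
    ))

    if error_indication:
        print(error_indication)                    # transport error or timeout
    else:
        for name, value in var_binds:
            print(f"{name} = {value}")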
Voltaire is porting the GridVision software to run on server processors, which enables an
InfiniBand cluster of HP BladeSystem c-Class server blades to be built without an internally
managed rack-mount IB switch. OpenSM is not supported in HP Cluster Platform solutions.
The 4x DDR IB switch module runs at a signal rate of 20 Gb/s and a data rate of 16 Gb/s in
each direction; the difference reflects the 8b/10b encoding used on InfiniBand links. The MPI
ping-pong latency is expected to be around three to four microseconds. Actual performance
depends on the specific configuration.
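The relationship between the 20 Gb/s signal rate and the 16 Gb/s data rate follows directly
from the link parameters. The short calculation below is only a worked illustration of that
arithmetic:

    # Derive the 4x DDR InfiniBand data rate from the signaling parameters.
    lanes = 4                      # "4x" link width
    lane_signal_rate_gbps = 5.0    # DDR signaling rate per lane
    encoding_efficiency = 8 / 10   # 8b/10b encoding: 8 data bits per 10 line bits

    signal_rate_gbps = lanes * lane_signal_rate_gbps          # 20 Gb/s
    data_rate_gbps = signal_rate_gbps * encoding_efficiency   # 16 Gb/s
    print(f"signal rate {signal_rate_gbps:g} Gb/s, data rate {data_rate_gbps:g} Gb/s")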
5.1 Installing the 4x DDR IB Switch Module
The 4x DDR IB switch module is designed to fit into the double-wide switch bays on the c-Class
enclosures. Depending on the mezzanine connectors used for the 4x DDR IB Mezzanine HCA,
the 4x DDR IB switch module must be inserted into switch bays 3 and 4, 5 and 6, or 7 and 8. Refer
to the Servers and Workstations Overview and the 4x DDR IB Switch Module Installation Instructions
for information on how to install the 4x DDR IB switch module in the c-Class BladeSystem
enclosure.
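The relationship between the mezzanine connector that holds the HCA and the switch bays the
module must occupy can be captured in a small lookup table. The mapping shown below
(mezzanine 1 to bays 3 and 4, mezzanine 2 to bays 5 and 6, mezzanine 3 to bays 7 and 8) is an
illustrative assumption based on typical full-height c-Class server blades, not a statement from
this document; always confirm the mapping in the installation instructions:

    # Hypothetical mezzanine-to-switch-bay lookup, for illustration only.
    # Verify the actual mapping in the 4x DDR IB Switch Module Installation
    # Instructions before installing the module.
    MEZZ_TO_BAYS = {
        1: (3, 4),   # assumed: mezzanine connector 1 -> switch bays 3 and 4
        2: (5, 6),   # assumed: mezzanine connector 2 -> switch bays 5 and 6
        3: (7, 8),   # assumed: mezzanine connector 3 -> switch bays 7 and 8
    }

    def required_bays(mezz_slot: int) -> tuple:
        """Return the double-wide switch-bay pair for a given mezzanine slot."""
        return MEZZ_TO_BAYS[mezz_slot]

    print(required_bays(2))   # prints (5, 6)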