3 Cabling Your HP Cluster Platform for HPCS
For HPCS to install and operate correctly, the GigE network must be cabled properly. The head
node must have two connections: one NIC connected to an enterprise (public) network, and
the other NIC connected to an isolated (private) network infrastructure. All compute nodes must
have the same NIC (NIC1 or NIC2) connected to the same isolated (private) network that
includes the head node's private NIC.
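One quick way to confirm which head-node NIC sits on the isolated network is to check whether its address falls in a private (RFC 1918) range. The sketch below is a minimal helper for that check; the sample addresses are assumptions, and isolated cluster networks are typically, but not necessarily, numbered from RFC 1918 space.

```shell
#!/bin/sh
# Classify an IPv4 address as private (RFC 1918) or public, to help
# verify that one head-node NIC is on the enterprise network and the
# other is on the isolated cluster network.
classify_nic() {
  case "$1" in
    10.*|192.168.*|172.1[6-9].*|172.2[0-9].*|172.3[01].*)
      echo "private" ;;
    *)
      echo "public" ;;
  esac
}

classify_nic 192.168.0.10   # typical isolated cluster subnet -> private
classify_nic 15.22.104.7    # enterprise-routable example address -> public
```

Run the helper against the address assigned to each head-node NIC; exactly one should report `private`.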
NOTE:
HP recommends isolating the private network and not connecting any other enterprise
machines to this network.
Most configurations do not require any special network switch configuration or VLANs.
However, some Blade configurations, including c3000 enclosures and mixed BL2x22X and
BL46X configurations with a BL4XX head node, require additional switch configuration and
VLANs to meet the basic public/private requirements stated above.
Due to the vast number of possible configurations, see the standard Cluster Platform cabling
requirements at www.docs.hp.com/en/highperfcomp.html for details on how to properly
configure and cable your specific network configuration.
To cable your cluster system, follow these steps:
1. Locate the appropriate cabling guides at http://www.docs.hp.com/en/highperfcomp.html.
Cables are labeled with the origin and destination ports. Reference the cabling tables to
ensure that you understand the port naming syntax. In the event of missing or damaged
cable labels (or if replacing damaged cables), use the cabling tables for your model and type
of cluster to determine the correct point-to-point cable routing.
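When working through a large cabling table, a small helper that splits a label into its origin and destination ports can reduce transcription errors. The label syntax below (`rack-device-port>rack-device-port`) is purely an assumption for illustration; consult the cabling tables for your cluster model for the real port naming syntax.

```shell
#!/bin/sh
# Hypothetical sketch: split a cable label of the assumed form
# "ORIGIN>DESTINATION" into its two endpoints. Adjust the separator
# and field layout to match your actual cabling tables.
parse_label() {
  origin=${1%%>*}   # everything before the first ">"
  dest=${1#*>}      # everything after the first ">"
  echo "origin=$origin dest=$dest"
}

parse_label "R1-SW1-P12>R2-N04-NIC1"
```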
2. When cabling the cluster, you might see cabling table footnotes that are
operating-environment specific. Follow all cabling instructions that apply to the HP XC
operating environment, which applies the same cabling rules as HPCS.
3. Clusters are delivered preconfigured. For single-rack solutions, connect the cluster to your
local area network (LAN), using the Ethernet switch port or NIC card designated in the
cabling tables.
4. For clusters of more than one rack, you must do the following:
a. Complete the intrarack cabling, as described in the appropriate documentation for
InfiniBand or Gigabit Ethernet.
CAUTION: InfiniBand cables and interconnect ports are highly sensitive and prone to
damage if not handled correctly.
b. Connect the cluster to your LAN, using the Ethernet switch port or NIC card designated
in the cabling tables.
5. Perform a visual check of the fabric, as described in the cluster hardware documentation,
referring to the user guides for the specific interconnect and Ethernet network switches used
in your cluster. (This step involves only verifying that the link status and activity LEDs at
each end of a cable link indicate that the link is good.)
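The LED check can be complemented in software. The sketch below parses `ethtool`-style output (ethtool is a standard Linux utility; its availability on your management host is an assumption, and on Windows-based HPCS nodes you would use the equivalent adapter-status tools instead). It is shown against a canned sample so the logic is clear.

```shell
#!/bin/sh
# Report whether an interface's link is up, based on the
# "Link detected:" line that ethtool prints for an interface.
link_ok() {
  printf '%s\n' "$1" | grep -q "Link detected: yes"
}

sample="Settings for eth1:
	Speed: 1000Mb/s
	Link detected: yes"

if link_ok "$sample"; then
  echo "eth1 link good"
fi
# On a live Linux node you would run, for example:
#   link_ok "$(ethtool eth1)" && echo "eth1 link good"
```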
6. Power up the cluster, as described in the interconnect guide for your model of HP Cluster
Platform.