1. Device Description & Scope of Delivery
Device Description
The AIME T500 workstation series is designed for deep learning development and has been optimized for maximum performance, low-noise operation and the lowest possible energy consumption, without compromising durability and reliability.
All installed components were selected for their
energy efficiency, durability, compatibility and high
performance. They are perfectly balanced with each
other so that there are no performance bottlenecks.
The sophisticated cooling system of the AIME T500 series cools the CPU and all built-in GPUs highly effectively. High-thermal-conduction blocks and the far higher cooling capacity of liquid compared to air keep the operating temperature of all components well below 60°C.
This consistently low operating temperature reduces thermal stress on the GPU and CPU silicon and allows all components to keep operating at their highest performance levels, even under full load in 24/7 scenarios.
The large radiators at the front and on top of the system are cooled by highly durable, noise-reduced fans rated at an MTBF of more than 10 years. This setup keeps the system cooler, more performant, more durable and far less noisy than a collection of many small fans for each component.
The AIME T500 enables you to train deep learning models at up to four times the speed of a single GPU, without the computing power being throttled by heat. This amounts to AI supercomputing performance of over 400 trillion tensor FLOPS, equivalent to the performance of hundreds of conventional servers.
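As an illustration of how such a multi-GPU setup is typically used, the following sketch runs a data-parallel training loop with PyTorch, one of the pre-installed frameworks. It is a minimal, generic example with a synthetic model and synthetic data, not AIME-supplied code; nn.DataParallel splits each batch across all GPUs that are visible to PyTorch.

    # Minimal data-parallel training sketch (generic PyTorch, synthetic model/data).
    import torch
    import torch.nn as nn

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10))
    if torch.cuda.device_count() > 1:      # e.g. up to 4 GPUs in a T504
        model = nn.DataParallel(model)     # replicate the model on every visible GPU
    model = model.to(device)

    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    inputs = torch.randn(256, 512, device=device)            # synthetic batch
    targets = torch.randint(0, 10, (256,), device=device)    # synthetic labels

    for step in range(10):                 # a few illustrative training steps
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()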
AIME workstations are delivered with a pre-installed Linux operating system and are preconfigured with the latest drivers and frameworks such as TensorFlow, Keras, PyTorch and MXNet.
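A quick check from Python can confirm that the installed frameworks see the GPUs and drivers. The following snippet is a generic PyTorch/TensorFlow check, not an AIME-specific tool:

    # Generic sanity check: report framework versions and visible GPUs.
    import torch
    import tensorflow as tf

    print("PyTorch", torch.__version__, "- CUDA available:", torch.cuda.is_available())
    for i in range(torch.cuda.device_count()):
        print("  GPU", i, ":", torch.cuda.get_device_name(i))

    print("TensorFlow", tf.__version__, "- GPUs:", tf.config.list_physical_devices("GPU"))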
The pre-installed AIME ML Container Manager offers a comprehensive software stack, allowing developers to comfortably set up AI projects and switch between frameworks and projects. The libraries and GPU drivers required for the respective deep learning framework are loaded into preconfigured Docker containers, which can be configured and started with simple commands. The most common frameworks, including TensorFlow, Keras, PyTorch and MXNet, are pre-installed and ready to use.
The AIME ML Container Manager makes life easier for developers: they do not have to worry about problems when installing specific framework versions. AIME customers can log in immediately after unpacking and start developing right away with their preferred deep learning framework.
Scope of Delivery
1x AIME Workstation T500 | T502 | T504
1x power cable for the power supply (2x for T504)
(IEC connector to earthed mains plug)
1x WLAN antenna
1x quick start guide with individual access data
1x technical specifications