AIME T600 Workstation
Artificial Intelligence
Machines
1. Device Description & Scope of Delivery
Device Description
Designed for deep learning development, the AIME
T600 workstation series is optimized for the highest
possible performance combined with low-noise
operation and the lowest possible power consumption,
without compromising on longevity and reliability.
All installed components were selected for their
energy efficiency, durability, compatibility and high
performance. They are perfectly balanced with each
other so that there are no performance bottlenecks.
The elaborate cooling system of the AIME T600
covers the CPU with liquid cooling and the GPUs with
a multi-GPU compatible air cooling concept that
supports two whisper-silent triple-fan GPUs or up to
four high-end turbo blower-style fan GPUs.
CPU and GPUs exhaust the hot air directly outside
the case to prevent heat build-up inside the case.
This prevents overheating of the GPU array and
maintains high performance under full load in 24/7
scenarios.
The case is designed for maximum air intake,
supported by powerful, temperature-controlled
high-airflow fans. The CPU is cooled by a closed AIO
water loop to reduce its impact as an additional heat
source. The large radiators at the front and on top of
the system are cooled by highly durable, noise-reduced
fans rated for an MTBF of more than 10 years. This
setup keeps the system cooler, more performant, more
durable and far less noisy than a collection of many
small fans for each component.
The AIME T600 enables you to train deep learning
models at up to four times the speed of a single GPU,
without heat throttling the computing power. This
means AI supercomputing performance of over 2,000
tensor TFLOPS, which is equivalent to the
performance of hundreds of conventional servers.
AIME workstations are delivered with a pre-installed
Linux operating system and are preconfigured with
the latest drivers and frameworks such as
TensorFlow, Keras, PyTorch and MXNet.
The pre-installed AIME ML Container Manager offers
a comprehensive software stack, allowing developers
to comfortably set up AI projects and switch
between frameworks and projects. The libraries and
GPU drivers required for the respective deep learning
framework are loaded in preconfigured Docker
containers, which can be configured and started with
simple commands. The most common frameworks,
e.g. TensorFlow, Keras, PyTorch and MXNet, are
pre-installed and ready to use.
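To illustrate what such a preconfigured container launch automates, the sketch below uses plain Docker commands with the official TensorFlow GPU image. Note that this is a hypothetical illustration only: the AIME ML Container Manager has its own command set, which is not reproduced here, and the image tag and mounted paths are assumptions for the example.

```shell
# Hypothetical illustration only -- NOT the AIME ML Container Manager's
# own syntax. It sketches the kind of Docker invocation the manager
# automates when starting a preconfigured framework container.

# Pull a GPU-enabled TensorFlow image (official Docker Hub image):
docker pull tensorflow/tensorflow:latest-gpu

# Start an interactive container with all GPUs passed through and the
# current project directory mounted as the working directory:
docker run --gpus all -it --rm \
    -v "$PWD":/workspace -w /workspace \
    tensorflow/tensorflow:latest-gpu \
    python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
```

With the Container Manager, a project container like this is set up and opened with a single short command per project, instead of managing images, GPU pass-through and volume mounts by hand.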
The AIME ML Container Manager makes life easier
for developers: they do not have to worry about
problems installing specific framework versions.
AIME customers can log in immediately after
unpacking and start development right away with
their preferred deep learning framework.
Scope of Delivery
1x AIME Workstation T600
1x power cable for power supply (2x for 4x GPU
configuration), IEC connector to a grounded
(protective-contact) plug
1x WLAN antenna
1x quick start guide with individual access data
1x technical specifications