About the HPC system Draco

HPC extension cluster deployed in 2016

The extension cluster DRACO of the HPC system was installed at the MPCDF in May 2016. It is based on Intel 'Haswell' Xeon E5-2698 processors (~880 nodes with 32 cores @ 2.3 GHz each). 106 of the nodes are equipped with accelerator cards (2 x PNY GTX 980 GPUs each).

Most of the compute nodes have 128 GB of main memory; 4 nodes have 512 GB, 1 node has 256 GB, and 4 of the GPU nodes have 256 GB.

In January 2017, the DRACO cluster was expanded by 64 Intel 'Broadwell' nodes that were purchased by the Fritz-Haber Institute. The 'Broadwell' nodes have 40 cores each and a main memory of 256 GB.

In total, there are 30,688 cores with an aggregate main memory of 128 TB and a peak performance of 1.12 PetaFlop/s.

In addition to the compute nodes, there are 4 login nodes and 8 I/O nodes that serve the 1.5 PetaByte of disk storage.

The common interconnect is an InfiniBand FDR14 network (about 56 Gbit/s per 4x link).

The compute nodes and GPU nodes are bundled into 30 domains.
Within one domain, the InfiniBand network has a 'fat tree' topology for highly efficient communication. The InfiniBand connection between domains is much weaker, so batch jobs are restricted to a single domain, i.e. to at most 32 nodes.
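
Because of this restriction, a batch job may request at most 32 nodes. The following is a minimal sketch of what such a job script could look like, assuming a SLURM-like batch system (the scheduler is not named above); the job name, resource values, and executable are illustrative placeholders, not actual Draco settings.

    #!/bin/bash -l
    # Hypothetical job script: stays within one InfiniBand domain (<= 32 nodes).
    #SBATCH --job-name=domain_job
    #SBATCH --nodes=32               # at most 32 nodes, i.e. one domain
    #SBATCH --ntasks-per-node=32     # one MPI rank per 'Haswell' core
    #SBATCH --time=01:00:00          # placeholder wall-clock limit

    # 'my_mpi_program' is a placeholder for the user's MPI executable.
    srun ./my_mpi_program

A request for more than 32 nodes would have to span domains and therefore cannot be satisfied within this topology.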
