HPC in Germany - Who? What? Where?

43 HPC systems, 39 file systems, 5 archives at 21 locations
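
Every entry below follows the same shape: a center with a location and one or more systems, each with a model line, tags, aggregate figures (peak performance, main memory, nodes, CPU cores, accelerators), and attached filesystems and archives. A minimal sketch of that record shape in Python (the field names are illustrative choices of ours, not taken from the site):

    from dataclasses import dataclass, field

    @dataclass
    class Storage:
        name: str            # e.g. "CSC-LOEWE (FhGFS)"
        capacity_tb: int     # capacity in TB, as listed

    @dataclass
    class System:
        model: str                 # manufacturer/model line
        tags: list[str]            # e.g. ["GPU nodes", "MPP system", "Tier 2 HPC system"]
        peak_tflops: int | None    # "Peak Performance"
        memory_tb: int | None      # "Main Memory"
        nodes: int | None
        cpu_cores: int | None
        accelerators: int = 0      # GPGPUs, FPGAs, or vector engines
        filesystems: list[Storage] = field(default_factory=list)
        archives: list[Storage] = field(default_factory=list)

    @dataclass
    class Center:
        name: str       # e.g. "Center for Scientific Computing"
        location: str   # e.g. "Frankfurt am Main"
        systems: list[System] = field(default_factory=list)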

Center for Scientific Computing

Frankfurt am Main
Megware
Tags: GPU nodes, MPP system, Tier 2 HPC system

823 TFlop/s (Peak Performance), 70 TB (Main Memory)

848 Nodes (Unknown QDR-Infiniband, FDR-Infiniband), 18,960 CPU Cores (Intel, AMD)

700 GPGPUs (AMD)

Filesystems
CSC-LOEWE (FhGFS)
764 TB
Clustervision
Tags: MPP system, Tier 3 HPC system

41 TFlop/s (Peak Performance), 18 TB (Main Memory)

358 Nodes (Unknown QDR-Infiniband), 6,456 CPU Cores (AMD)

Filesystems
FUCHS (FhGFS)
600 TB

Deutscher Wetterdienst

Offenbach
Cray XC40
Tags: MPP system, Tier 2 HPC system

1,073 TFlop/s (Peak Performance), 125 TB (Main Memory)

976 Nodes (Cray Aries), 29,952 CPU Cores (Intel)

Cray XC40
Tags: Internal, MPP system, Tier 2 HPC system

1,073 TFlop/s (Peak Performance), 125 TB (Main Memory)

976 Nodes (Cray Aries), 29,952 CPU Cores (Intel)

Deutsches Elektronen-Synchrotron

Hamburg
Tags: Tier 3 HPC system

16 TFlop/s (Peak Performance)

143 Nodes, 2,288 CPU Cores (Intel)

Unknown
Tags: GPU nodes, MPP system, Tier 3 HPC system

402 TB (Main Memory)

764 Nodes, 26,732 CPU Cores (Intel, AMD)

282 GPGPUs (Nvidia)

German Climate Computing Center

Hamburg
HLRE-3 "Mistral" in its final expansion stage
Bull bullx DLC B720/B725, bullx B720
Tags: GPU nodes, MPP system, Tier 2 HPC system

3,590 TFlop/s (Peak Performance), 266 TB (Main Memory)

3,336 Nodes (Unknown FDR-Infiniband, Mellanox FDR-Infiniband), 101,196 CPU Cores (Intel)

42 GPGPUs (Nvidia)

Archives
HPSS - High Performance Storage System (StorageTEK SL8500)
190,000 TB

Competence Center High Performance Computing (CC-HPC)

Kaiserslautern
Dell PowerEdge
Tags: GPU nodes, MPP system, SMP nodes, Tier 3 HPC system

67 TFlop/s (Peak Performance), 14 TB (Main Memory)

198 Nodes (Unknown FDR-Infiniband), 3,224 CPU Cores (Intel)

2 GPGPUs (Nvidia)

Filesystems
Beehive (BeeGFS)
240 TB
Tags: GPU nodes, MPP system, Tier 3 HPC system, Xeon-Phi nodes

35 TFlop/s (Peak Performance), 6 TB (Main Memory)

90 Nodes (Unknown FDR-Infiniband, QDR-Infiniband), 1,584 CPU Cores (Intel)

3 GPGPUs (Nvidia), 2 Many-Core Processors (Intel)

Filesystems
Seislab (BeeGFS)
600 TB
Tags: Tier 3 HPC system

23 TB (Main Memory)

1,450 Nodes, 11,600 CPU Cores (Intel)

Gesellschaft für wissenschaftliche Datenverarbeitung mbH Göttingen

Göttingen
Sysgen GPU Cluster, Compute Cluster, Transtec Compute Cluster, DELTA Computer Compute Cluster, Atos Intel PCSD, Clustervision Compute Cluster
Tags: GPU nodes, MPP system, Tier 3 HPC system

2,883 TFlop/s (Peak Performance), 92 TB (Main Memory)

402 Nodes (Unknown QDR-Infiniband, FDR-Infiniband, Intel Omnipath, FDR-Infiniband), 16,640 CPU Cores (Intel, AMD)

278 GPGPUs (Nvidia), 302 Applications (454 Versions)

Filesystems
Bioinformatics WORK (BeeGFS-Storage-Server)
243 TB
MPS WORK (Storage Server)
140 TB
SCC HOME (StorNext - LAN Client)
1,900 TB
SCC HOME (StorNext - NFS)
900 TB
SCC scratch2 (Storage Server)
110 TB
SCC scratch (MDC) (Storage Server)
1,865 TB
Archives
SCC Archivspeicher (Scalar i6000)
22,000 TB
Tags: MPP system, Tier 2 HPC system

8,261 TFlop/s (Peak Performance), 498 TB (Main Memory)

1,473 Nodes (Intel Omnipath), 116,152 CPU Cores (Intel)

12 GPGPUs (Nvidia)

Filesystems
HLRN-IV Work (EXAScaler)
9,412 TB
HLRN-IV Home (GRIDScaler)
340 TB
Archives
HLRN Archivspeicher (Scalar i6000)
7,037 TB

Hochschulrechenzentrum

Darmstadt
Intel Cascadelake
Tags: GPU nodes, SMP nodes, Tier 2 HPC system

3,148 TFlop/s (Peak Performance), 251 TB (Main Memory)

643 Nodes (HDR100-Infiniband), 61,824 CPU Cores (Intel, AMD)

56 GPGPUs (Nvidia)

Filesystems
GLOBAL (DSS)
4,000 TB

Höchstleistungsrechenzentrum Stuttgart

Stuttgart
HPC system Hawk
Hewlett Packard Enterprise (HPE) Apollo
Tags: Tier 1 HPC system

26,000 TFlop/s (Peak Performance), 1 TB (Main Memory)

5,632 Nodes (Unknown HDR200-Infiniband), 720,896 CPU Cores (AMD)
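
As a quick sanity check, the peak figure above is consistent with the core count. A back-of-the-envelope sketch in Python (the 2.25 GHz clock and 16 double-precision FLOPs per cycle per core are our assumptions for AMD EPYC "Rome", not figures from this listing):

    # Hawk, figures from the entry above
    peak_tflops = 26_000
    cores = 720_896

    per_core_gflops = peak_tflops * 1000 / cores   # ~36.1 GFlop/s per core

    # Assumed Zen 2 parameters: 2.25 GHz x 16 DP FLOPs/cycle = 36 GFlop/s
    assumed_gflops = 2.25 * 16

    print(f"{per_core_gflops:.1f} vs. {assumed_gflops:.1f}")  # 36.1 vs. 36.0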

Cray CS-Storm, CS500
Tags: GPU nodes, Tier 3 HPC system

9 TB (Main Memory)

16 Nodes, 608 CPU Cores (Intel)

64 GPGPUs (Nvidia)

NEC
Tags: GPU system, Tier 3 HPC system

102 TB (Main Memory)

466 Nodes, 13,856 CPU Cores (Intel)

81 GPGPUs (Nvidia, AMD)

IT Center of RWTH Aachen University

Aachen
NEC HPC1812-Rg-2
Tags: GPU nodes, MPP system, SMP nodes, Tier 2 HPC system

678 TFlop/s (Peak Performance), 88 TB (Main Memory)

633 Nodes (Intel Omnipath), 16,152 CPU Cores (Intel)

20 GPGPUs (Nvidia)

Filesystems
HPCWORK (Claix 16) (Lustre)
3,000 TB
HOME / WORK (NFS)
1,500 TB
HPCWORK (Claix 18) (Lustre)
10,000 TB
HPC system CLAIX-2018
Tags: GPU nodes, MPP system, SSD storage, Tier 2 HPC system

4,965 TFlop/s (Peak Performance), 251 TB (Main Memory)

1,307 Nodes (Intel Omnipath), 62,736 CPU Cores (Intel)

96 GPGPUs (Nvidia)

Filesystems
HPCWORK (Claix 16) (Lustre)
3,000 TB
HOME / WORK (NFS)
1,500 TB
HPCWORK (Claix 18) (Lustre)
10,000 TB

Jülich Supercomputing Centre (JSC)

Jülich
The JUWELS supercomputer at the Jülich Supercomputing Centre
Atos BullSequana X1000
Tags: GPU nodes, MPP system, SMP nodes, Tier 1 HPC system

12,000 TFlop/s (Peak Performance), 286 TB (Main Memory)

2,575 Nodes, 123,088 CPU Cores (Intel)

196 GPGPUs (Nvidia)

Zuse-Institut Berlin

Berlin-Dahlem
HPC system Lise
Atos Bull Intel cluster
Tags: MPP system, Tier 2 HPC system

7,907 TFlop/s (Peak Performance), 455 TB (Main Memory)

1,146 Nodes (Intel Omnipath), 110,016 CPU Cores (Intel)

Filesystems
HLRN-IV WORK (DDN EXAScaler)
8,192 TB
HLRN-IV HOME (DDN GRIDScaler)
340 TB
Archives
Archivspeicher (StorageTEK SL8500)
25,000 TB

Leibniz-Rechenzentrum der Bayerischen Akademie der Wissenschaften

Garching bei München
IBM NeXtScale
Tags: MPP system, Tier 1 HPC system, Xeon-Phi nodes

3,580 TFlop/s (Peak Performance), 197 TB (Main Memory)

3,072 Nodes (Unknown FDR-Infiniband), 86,016 CPU Cores (Intel)

Lenovo ThinkSystem SD 650 DWC
Tags: MPP system, SMP nodes, Tier 1 HPC system

26,900 TFlop/s (Peak Performance), 719 TB (Main Memory)

6,480 Nodes (Intel Omnipath), 311,040 CPU Cores (Intel)

Max Planck Computing & Data Facility

Garching
HPC System COBRA
Atos Intel Compute Module HNS2600BPB24, Lenovo
Tags: MPP system, SMP nodes, Tier 2 HPC system

12,720 TFlop/s (Peak Performance), 530 TB (Main Memory)

3,424 Nodes (Intel Omnipath), 136,960 CPU Cores (Intel)

368 GPGPUs (Nvidia)

Paderborn Center for Parallel Computing

Paderborn
Arbor Workstation 2HE C612 INTEL
Tags: FPGA nodes, Tier 3 HPC system

512 GB (Main Memory)

8 Nodes, 32 CPU Cores (Intel)

16 FPGAs (Alpha Data)

Cray CS500
Tags: GPU nodes, MPP system, Tier 3 HPC system

835 TFlop/s (Peak Performance), 53 TB (Main Memory)

274 Nodes (Intel Omnipath), 10,960 CPU Cores (Intel)

18 GPGPUs (Nvidia)

Filesystems
Scratch (Cray ClusterStor)
720 TB
Atos BullSequana XH2000, Nvidia DGX A100
Tags: FPGA nodes, GPU nodes, MPP system, Tier 2 HPC system

7,100 TFlop/s (Peak Performance), 355 TB (Main Memory)

1,121 Nodes (Mellanox HDR100-Infiniband), 143,488 CPU Cores (AMD)

136 GPGPUs (Nvidia), 80 FPGAs (Bittware, AMD XILINX)

Filesystems
Scratch (DDN Exascaler 7990X with NVMe accelerator)
6,000 TB

Regionales Hochschulrechenzentrum Kaiserslautern-Landau (RHRZ)

Kaiserslautern
HPC cluster Elwetritsch II of TU Kaiserslautern (TUK)
Dell PowerEdge R, NEC HPC1812Rh-1, Nvidia DGX-2, DGX-1
Tags: GPU nodes, GPU system, MPP system, SMP nodes, SSD storage, Tier 3 HPC system, Xeon-Phi nodes

53 TB (Main Memory)

489 Nodes (Unknown QDR-Infiniband, Intel Omnipath), 10,520 CPU Cores (Intel, AMD)

56 GPGPUs (Nvidia), 56 Applications (228 Versions)

Filesystems
Elwetritsch (BeeGFS)
1,285 TB
Elwetritsch (NFS)
10 TB

Regionales Rechenzentrum der Universität zu Köln

Köln
Bull bullx S
Tags: MPP system, SMP nodes, Tier 3 HPC system

100 TFlop/s (Peak Performance), 36 TB (Main Memory)

841 Nodes (Unknown QDR-Infiniband), 9,712 CPU Cores (Intel)

Filesystems
CHEOPS (Lustre)
500 TB

Scientific Computing Center

Karlsruhe
bwUniCluster 2.0 Stage 1
Hewlett Packard Enterprise (HPE) ProLiant, Lenovo ThinkSystem

159 TB (Main Memory)

837 Nodes (HDR200-Infiniband), 40,608 CPU Cores (Intel)

196 GPGPUs (Nvidia)

HoreKa
Lenovo ThinkSystem

248 TB (Main Memory)

769 Nodes (HDR200-Infiniband), 58,444 CPU Cores (Intel)

668 GPGPUs (Nvidia)

Steinbuch Centre for Computing

Eggenstein-Leopoldshafen

Zentrum für Datenverarbeitung

Mainz
Megware
Tags: GPU nodes, MPP system, Tier 2 HPC system

379 TFlop/s (Peak Performance), 90 TB (Main Memory)

570 Nodes (Unknown QDR-Infiniband), 35,760 CPU Cores (Intel, AMD)

52 GPGPUs (Nvidia), 8 Many-Core Processors (Intel)

Filesystems
Lustre (Lustre)
7,394 TB
Transtec
Tags: MPP system, Tier 3 HPC system

106 TFlop/s (Peak Performance), 10 TB (Main Memory)

320 Nodes (Unknown QDR-Infiniband), 5,120 CPU Cores (Intel)

Megware MiriQuid, NEC NEC Cluster, Intel NEC Cluster
Tags: MPP system, SSD storage, Tier 2 HPC system

3,125 TFlop/s (Peak Performance), 194 TB (Main Memory)

1,948 Nodes (Intel Omnipath), 52,248 CPU Cores (Intel)

188 GPGPUs (Nvidia, NEC)

Filesystems
Mogon (GPFS)
1,024 TB
Lustre (Lustre)
7,394 TB

Center for Information Services and High Performance Computing

Dresden
IBM AC922, NEC, Hewlett Packard Enterprise (HPE) Superdome Flex
Tags: SMP nodes

786 TFlop/s (Peak Performance), 156 TB (Main Memory)

225 Nodes (Mellanox HDR100-Infiniband), 26,880 CPU Cores (IBM, AMD, Intel)

192 GPGPUs (Nvidia), 415 Applications (848 Versions)

NEC HPC 22S8Ri-4

35 TB (Main Memory)

34 Nodes (HDR200-Infiniband), 1,632 CPU Cores (AMD)

272 GPGPUs (Nvidia)

Erlangen National Center for High Performance Computing (NHR@FAU)

Erlangen
HPC Cluster Emmy @RRZE
NEC LX-2400
Tags: GPU nodes, MPP system, Tier 3 HPC system, Xeon-Phi nodes

232 TFlop/s (Peak Performance), 36 TB (Main Memory)

556 Nodes (Mellanox QDR-Infiniband), 11,088 CPU Cores (Intel)

20 GPGPUs (Nvidia), 44 Applications (131 Versions)

Filesystems
Emmy (Lustre)
430 TB
Global (GPFS/NFS)
50 TB
Saturn (HPE Apollo 4510 Gen10)
450 TB
Titan (HPE Apollo 4510 Gen10)
450 TB
Fundus-Disk (GPFS)
6,000 TB
Atuin (PowerEdge XE7100/XE7440)
1,000 TB
Archives
Fundus-Tape (IBM TS4500)
15,000 TB
Megware Linux-Cluster
Tags: MPP system, Tier 3 HPC system

511 TFlop/s (Peak Performance), 47 TB (Main Memory)

728 Nodes (Intel Omnipath), 14,560 CPU Cores (Intel)

21 Applications (61 Versions)

Filesystems
Global (GPFS/NFS)
50 TB
Meggie (Lustre)
850 TB
Saturn (HPE Apollo 4510 Gen10)
450 TB
Titan (HPE Apollo 4510 Gen10)
450 TB
Fundus-Disk (GPFS)
6,000 TB
Atuin (PowerEdge XE7100/XE7440)
1,000 TB
Archives
Fundus-Tape (IBM TS4500)
15,000 TB
Sysgen GPU-Nodes II, GPU-Nodes III, GPU-Nodes V, Supermicro 2124GQ-NART, Nvidia GPU-Nodes IV, TRADEX SYSTEMS Sp. z o.o. Gigabyte G481-H80
Tags: GPU nodes, GPU system, Tier 3 HPC system

5 TB (Main Memory)

45 Nodes, 1,392 CPU Cores (Intel, AMD)

208 GPGPUs (Nvidia), 3 Applications (8 Versions)

Filesystems
Global (GPFS/NFS)
50 TB
Saturn (HPE Apollo 4510 Gen10)
450 TB
Titan (HPE Apollo 4510 Gen10)
450 TB
Fundus-Disk (GPFS)
6,000 TB
Atuin (PowerEdge XE7100/XE7440)
1,000 TB
Archives
Fundus-Tape (IBM TS4500)
15,000 TB
NEC Tsubasa Accelerator
NEC A300-2
Tags: Internal, Tier 3 HPC system, VE nodes

96 GB (Main Memory)

12 CPU Cores (Intel)

2 GPGPUs (NEC)

Filesystems
Global (GPFS/NFS)
50 TB
Saturn (HPE Apollo 4510 Gen10)
450 TB
Titan (HPE Apollo 4510 Gen10)
450 TB
Fundus-Disk (GPFS)
6,000 TB
Atuin (PowerEdge XE7100/XE7440)
1,000 TB
Archives
Fundus-Tape (IBM TS4500)
15,000 TB
Hewlett Packard Enterprise (HPE) Apollo 70
Tags: ARM nodes, Internal, Tier 3 HPC system

128 GB (Main Memory)

64 CPU Cores (ARM)

Filesystems
Global (GPFS/NFS)
50 TB
Saturn (HPE Apollo 4510 Gen10)
450 TB
Titan (HPE Apollo 4510 Gen10)
450 TB
Fundus-Disk (GPFS)
6,000 TB
Atuin (PowerEdge XE7100/XE7440)
1,000 TB
Archives
Fundus-Tape (IBM TS4500)
15,000 TB
Megware Supermicro X10DRH, RaidMedia Gigabyte H262-Z66

22 TB (Main Memory)

47 Nodes, 2,484 CPU Cores (Intel, AMD)

Filesystems
Global (GPFS/NFS)
50 TB
Saturn (HPE Apollo 4510 Gen10)
450 TB
Titan (HPE Apollo 4510 Gen10)
450 TB
Fundus-Disk (GPFS)
6,000 TB
Atuin (PowerEdge XE7100/XE7440)
1,000 TB
Archives
Fundus-Tape (IBM TS4500)
15,000 TB
Megware GPGPU Cluster, GPU-Cluster
Tags: GPU nodes, GPU system, Tier 2 HPC system, Tier 3 HPC system

65 TB (Main Memory)

70 Nodes (Mellanox HDR200-Infiniband, HDR200-Infiniband), 8,960 CPU Cores (AMD)

560 GPGPUs (Nvidia)

Filesystems
Global (GPFS/NFS)
50 TB
Saturn (HPE Apollo 4510 Gen10)
450 TB
Titan (HPE Apollo 4510 Gen10)
450 TB
Fundus-Disk (GPFS)
6,000 TB
Atuin (PowerEdge XE7100/XE7440)
1,000 TB
Fritz (Lustre)
3,700 TB
Archives
Fundus-Tape (IBM TS4500)
15,000 TB
Megware Linux-Cluster
Tags: MPP system, Tier 2 HPC system, Tier 3 HPC system

242 TB (Main Memory)

944 Nodes (Mellanox HDR100-Infiniband), 67,968 CPU Cores (Intel)

Filesystems
Global (GPFS/NFS)
50 TB
Saturn (HPE Apollo 4510 Gen10)
450 TB
Titan (HPE Apollo 4510 Gen10)
450 TB
Fundus-Disk (GPFS)
6,000 TB
Atuin (PowerEdge XE7100/XE7440)
1,000 TB
Fritz (Lustre)
3,700 TB
Archives
Fundus-Tape (IBM TS4500)
15,000 TB
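
The five archives in this listing (HPSS at DKRZ, the SCC and HLRN Archivspeicher at GWDG, the Archivspeicher at ZIB, and Fundus-Tape at NHR@FAU) can be totalled directly from the capacities above; a minimal sketch in Python:

    # Archive capacities in TB, copied from the entries above
    archives_tb = {
        "HPSS (DKRZ)": 190_000,
        "SCC Archivspeicher (GWDG)": 22_000,
        "HLRN Archivspeicher (GWDG)": 7_037,
        "Archivspeicher (ZIB)": 25_000,
        "Fundus-Tape (NHR@FAU)": 15_000,
    }

    total_pb = sum(archives_tb.values()) / 1_000   # decimal TB -> PB
    print(f"{total_pb:.0f} PB across {len(archives_tb)} archives")  # 259 PB across 5 archives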