
HPC in Germany - Who? What? Where?

41 HPC systems, 26 file systems, 3 archives at 20 locations

Center for Scientific Computing

Frankfurt am Main
Megware
GPU nodes, MPP system, Tier 2 HPC system

823 TFlop/s (Peak Performance), 70 TB (Main Memory)

848 Nodes (Unknown QDR-Infiniband, FDR-Infiniband), 18,960 CPU Cores (Intel, AMD)

700 GPGPUs (AMD)

Filesystems
CSC-LOEWE (FhGFS)
764 TB

Clustervision
MPP system, Tier 3 HPC system

41 TFlop/s (Peak Performance), 18 TB (Main Memory)

358 Nodes (Unknown QDR-Infiniband), 6,456 CPU Cores (AMD)

Filesystems
FUCHS (FhGFS)
600 TB

Deutscher Wetterdienst

Offenbach
Cray XC40
MPP system, Tier 2 HPC system

1,073 TFlop/s (Peak Performance), 125 TB (Main Memory)

976 Nodes (Cray Aries), 29,952 CPU Cores (Intel)

Cray XC40
Internal, MPP system, Tier 2 HPC system

1,073 TFlop/s (Peak Performance), 125 TB (Main Memory)

976 Nodes (Cray Aries), 29,952 CPU Cores (Intel)

Deutsches Elektronen Synchrotron

Hamburg
GPU nodes, MPP system, Tier 3 HPC system

37 TFlop/s (Peak Performance)

43 Nodes, 2,416 CPU Cores (Intel, AMD)

14 GPGPUs (Nvidia)

Tier 3 HPC system

16 TFlop/s (Peak Performance)

143 Nodes, 2,288 CPU Cores (Intel)

German Climate Computing Center

Hamburg
HLRE-3 "Mistral" at full expansion stage
Bull bullx DLC B720/B725, bullx B720
GPU nodes, MPP system, Tier 2 HPC system

3,590 TFlop/s (Peak Performance), 266 TB (Main Memory)

3,336 Nodes (Unknown FDR-Infiniband, Mellanox FDR-Infiniband), 101,196 CPU Cores (Intel)

42 GPGPUs (Nvidia)

Archives
HPSS - High Performance Storage System (StorageTEK SL8500)
190,000 TB

Competence Center High Performance Computing (CC-HPC)

Kaiserslautern
Dell PowerEdge
GPU nodes, MPP system, SMP nodes, Tier 3 HPC system

67 TFlop/s (Peak Performance), 14 TB (Main Memory)

198 Nodes (Unknown FDR-Infiniband), 3,224 CPU Cores (Intel)

2 GPGPUs (Nvidia)

Filesystems
Beehive (BeeGFS)
240 TB

GPU nodes, MPP system, Tier 3 HPC system, Xeon-Phi nodes

35 TFlop/s (Peak Performance), 6 TB (Main Memory)

90 Nodes (Unknown FDR-Infiniband, QDR-Infiniband), 1,584 CPU Cores (Intel)

3 GPGPUs (Nvidia), 2 Many Core Processors (Intel)

Filesystems
Seislab (BeeGFS)
600 TB

Tier 3 HPC system

23 TB (Main Memory)

1,450 Nodes, 11,600 CPU Cores (Intel)

Gesellschaft für wissenschaftliche Datenverarbeitung mbH Göttingen

Göttingen
Clustervision Compute Cluster, Sysgen GPU Cluster, Transtec Compute Cluster, Unknown Delta Computer Compute Cluster
GPU nodes, MPP system, Tier 3 HPC system

401 TFlop/s (Peak Performance), 82 TB (Main Memory)

661 Nodes (Unknown QDR-Infiniband, FDR-Infiniband), 16,536 CPU Cores (Intel, AMD)

86 GPGPUs (Nvidia)

MPP system, Tier 2 HPC system

1,376 TFlop/s (Peak Performance), 95 TB (Main Memory)

448 Nodes, 17,920 CPU Cores (Intel)

Hochschulrechenzentrum

Darmstadt
IBM iDataPlex, NeXtScale
GPU nodes, MPP system, SMP nodes, Tier 2 HPC system, Xeon-Phi nodes

953 TFlop/s (Peak Performance), 75 TB (Main Memory)

1,412 Nodes (Unknown FDR10-Infiniband, FDR-Infiniband), 27,984 CPU Cores (Intel)

94 GPGPUs (Nvidia), 52 Many Core Processors (Intel)

Filesystems
Global (GPFS)
1,536 TB
Global (NFS)
500 TB

Clustervision
MPP system, Tier 3 HPC system

16 TFlop/s (Peak Performance), 3 TB (Main Memory)

32 Nodes (Unknown QDR-Infiniband), 1,536 CPU Cores (AMD)

Filesystems
Global (GPFS)
1,536 TB
Global (NFS)
500 TB

Höchstleistungsrechenzentrum Stuttgart

Stuttgart
Cray XC40
MPP system, Tier 1 HPC system

7,420 TFlop/s (Peak Performance), 987 TB (Main Memory)

7,712 Nodes (Cray Aries), 185,088 CPU Cores (Intel)

IT Center of RWTH Aachen University

Aachen
NEC HPC1812-Rg-2
GPU nodes, MPP system, SMP nodes, Tier 2 HPC system

678 TFlop/s (Peak Performance), 88 TB (Main Memory)

633 Nodes (Intel Omnipath), 16,152 CPU Cores (Intel)

20 GPGPUs (Nvidia)

Filesystems
Global (Lustre)
1,500 TB
Global (NFS)
1,500 TB

CLAIX-2018
GPU nodes, MPP system, SSD storage, Tier 2 HPC system

3,559 TFlop/s (Peak Performance), 251 TB (Main Memory)

1,307 Nodes (Intel Omnipath), 62,736 CPU Cores (Intel)

96 GPGPUs (Nvidia)

Filesystems
Global (Lustre)
1,500 TB
Global (NFS)
1,500 TB

Jülich Supercomputing Centre (JSC)

Jülich
The JUWELS supercomputer at the Jülich Supercomputing Centre
Atos BullSequana X1000
GPU nodes, MPP system, SMP nodes, Tier 1 HPC system

12,000 TFlop/s (Peak Performance), 286 TB (Main Memory)

2,575 Nodes, 123,088 CPU Cores (Intel)

196 GPGPUs (Nvidia)

Konrad-Zuse-Zentrum für Informationstechnik Berlin

Berlin-Dahlem
Cray XC30/XC40
MPP system, Tier 2 HPC system

1,426 TFlop/s (Peak Performance), 120 TB (Main Memory)

1,872 Nodes (Cray Aries), 44,928 CPU Cores (Intel)

Filesystems
Global (Lustre)
3,700 TB
Global (GPFS)
500 TB
Archives
Archive storage (StorageTEK SL8500)
25,000 TB

Cray XC40
SSD storage, Tier 3 HPC system

244 TFlop/s (Peak Performance), 8 TB (Main Memory)

80 Nodes (Cray Aries), 5,440 CPU Cores (Intel)

Filesystems
Global (Lustre)
3,700 TB
Global (GPFS)
500 TB
Cray TDS (Cray XC40 DataWarp I/O Accelerator)
32 TB
Archives
Archive storage (StorageTEK SL8500)
25,000 TB

Leibniz-Rechenzentrum der Bayerischen Akademie der Wissenschaften

Garching bei München
IBM NeXtScale
MPP system, Tier 1 HPC system, Xeon-Phi nodes

3,580 TFlop/s (Peak Performance), 197 TB (Main Memory)

3,072 Nodes (Unknown FDR-Infiniband), 86,016 CPU Cores (Intel)

Lenovo ThinkSystem SD 650 DWC
MPP system, SMP nodes, Tier 1 HPC system

26,900 TFlop/s (Peak Performance), 719 TB (Main Memory)

6,480 Nodes (Intel Omnipath), 311,040 CPU Cores (Intel)

Max Planck Computing & Data Facility

Garching
HPC System COBRA
Atos Intel Compute Module HNS2600BPB24, Lenovo
MPP system, SMP nodes, Tier 2 HPC system

12,720 TFlop/s (Peak Performance), 530 TB (Main Memory)

3,424 Nodes (Intel Omnipath), 136,960 CPU Cores (Intel)

368 GPGPUs (Nvidia)

Paderborn Center for Parallel Computing

Paderborn
Clustervision CV-AIRE5
GPU nodes, MPP system, SMP system, Tier 3 HPC system

240 TFlop/s (Peak Performance), 47 TB (Main Memory)

616 Nodes (Mellanox QDR-Infiniband), 9,920 CPU Cores (Intel)

48 GPGPUs (Nvidia)

Filesystems
OCuLUS (FhGFS)
500 TB

Arbor Workstation 2HE C612 INTEL
FPGA nodes, Tier 3 HPC system

512 GB (Main Memory)

8 Nodes, 32 CPU Cores (Intel)

16 FPGAs (Alpha Data)

Cray CS500
FPGA nodes, MPP system, Tier 2 HPC system

835 TFlop/s (Peak Performance), 52 TB (Main Memory)

272 Nodes (Intel Omnipath), 10,880 CPU Cores (Intel)

32 FPGAs (Bittware)

Filesystems
Scratch (ClusterStor)
720 TB

Regionales Hochschulrechenzentrum Kaiserslautern (RHRK)

Kaiserslautern
Fujitsu PRIMERGY XC250/400
GPU nodes, MPP system, SMP nodes, Tier 3 HPC system, Xeon-Phi nodes

134 TFlop/s (Peak Performance), 17 TB (Main Memory)

319 Nodes (Unknown QDR-Infiniband), 5,624 CPU Cores (Intel)

29 GPGPUs (Nvidia), 6 Many Core Processors (Intel)

Filesystems
Elwetritsch (BeeGFS)
1,285 TB
Elwetritsch (NFS)
10 TB

HPC cluster Elwetritsch II of TU Kaiserslautern
Dell PowerEdge R, NEC HPC1812Rh-1, Nvidia DGX-2
GPU nodes, GPU system, MPP system, SMP nodes, SSD storage, Tier 3 HPC system, Xeon-Phi nodes

52 TB (Main Memory)

486 Nodes (Unknown QDR-Infiniband, Intel Omnipath), 10,400 CPU Cores (Intel, AMD)

32 GPGPUs (Nvidia)

Filesystems
Elwetritsch (BeeGFS)
1,285 TB
Elwetritsch (NFS)
10 TB

Erlangen Regional Computing Center

Erlangen
HPC Cluster Emmy @RRZE
NEC LX-2400
GPU nodes, MPP system, Tier 3 HPC system, Xeon-Phi nodes

232 TFlop/s (Peak Performance), 36 TB (Main Memory)

560 Nodes (Unknown QDR-Infiniband), 11,200 CPU Cores (Intel)

16 GPGPUs (Nvidia), 16 Many Core Processors (Intel)

Filesystems
Emmy (Lustre)
430 TB
Global (NFS)
5 TB
Archives
Vault (GPFS/TSM-HSM)
2,500 TB

Megware Linux-Cluster
MPP system, Tier 3 HPC system

511 TFlop/s (Peak Performance), 47 TB (Main Memory)

728 Nodes (Intel Omnipath), 14,560 CPU Cores (Intel)

Filesystems
Global (NFS)
5 TB
Meggie (Lustre)
850 TB
Archives
Vault (GPFS/TSM-HSM)
2,500 TB

Sysgen GPU-Nodes I, GPU-Nodes II, GPU-Nodes III, GPU-Nodes V, Nvidia GPU-Nodes IV
GPU nodes, Tier 3 HPC system

3 TB (Main Memory)

37 Nodes, 536 CPU Cores (Intel)

134 GPGPUs (Nvidia)

Filesystems
Global (NFS)
5 TB
Archives
Vault (GPFS/TSM-HSM)
2,500 TB

NEC Tsubasa Accelerator
NEC A300-2
Internal, Tier 3 HPC system, VE nodes

96 GB (Main Memory)

12 CPU Cores (Intel)

2 GPGPUs (NEC)

Filesystems
Global (NFS)
5 TB
Archives
Vault (GPFS/TSM-HSM)
2,500 TB

Hewlett Packard Enterprise (HPE) Apollo 70
ARM nodes, Internal, Tier 3 HPC system

128 GB (Main Memory)

64 CPU Cores (ARM)

Filesystems
Global (NFS)
5 TB
Archives
Vault (GPFS/TSM-HSM)
2,500 TB

Regionales Rechenzentrum der Universität zu Köln

Köln
Bull bullx S
MPP system, SMP nodes, Tier 3 HPC system

100 TFlop/s (Peak Performance), 36 TB (Main Memory)

841 Nodes (Unknown QDR-Infiniband), 9,712 CPU Cores (Intel)

Filesystems
CHEOPS (Lustre)
500 TB

Steinbuch Centre for Computing

Eggenstein-Leopoldshafen
Megware MiriQuid (FH I), Lenovo NeXtScale (FH II)
MPP system, SMP nodes, Tier 2 HPC system

1,171 TFlop/s (Peak Performance), 136 TB (Main Memory)

1,701 Nodes (Unknown FDR-Infiniband, EDR-Infiniband), 34,800 CPU Cores (Intel)

84 GPGPUs (Nvidia)

Filesystems
Global (Lustre)
6,553 TB
Global (Lustre)
2,078 TB

Megware bwUniCluster, bwUniCluster-Erweiterung
Internal, MPP system, SMP nodes, Tier 3 HPC system

444 TFlop/s (Peak Performance), 86 TB (Main Memory)

872 Nodes (Unknown FDR-Infiniband), 18,304 CPU Cores (Intel)

Filesystems
Global (Lustre)
6,553 TB
Global (Lustre)
2,078 TB

Zentrum für Datenverarbeitung

Mainz
Megware
GPU nodes, MPP system, Tier 2 HPC system

379 TFlop/s (Peak Performance), 90 TB (Main Memory)

570 Nodes (Unknown QDR-Infiniband), 35,760 CPU Cores (Intel, AMD)

52 GPGPUs (Nvidia), 8 Many Core Processors (Intel)

Filesystems
Lustre (Lustre)
7,394 TB

Transtec
MPP system, Tier 3 HPC system

106 TFlop/s (Peak Performance), 10 TB (Main Memory)

320 Nodes (Unknown QDR-Infiniband), 5,120 CPU Cores (Intel)

Megware MiriQuid, NEC NEC Cluster, Intel NEC Cluster
MPP system, SSD storage, Tier 2 HPC system

3,016 TFlop/s (Peak Performance), 194 TB (Main Memory)

1,948 Nodes (Intel Omnipath), 52,248 CPU Cores (Intel)

188 GPGPUs (Nvidia, NEC)

Filesystems
Mogon (GPFS)
1,024 TB
Lustre (Lustre)
7,394 TB

Center for Information Services and High Performance Computing

Dresden
Bull bullx B500/B515/710, bullx DLC B720/R400, Intel H2312XXLR2/HNS7200APX, IBM AC922
GPU nodes, MPP system, SMP nodes, SSD storage, Tier 2 HPC system, Xeon-Phi nodes

2,087 TFlop/s (Peak Performance), 157 TB (Main Memory)

2,117 Nodes (Unknown FDR-Infiniband, Intel Omnipath), 48,360 CPU Cores (Intel, IBM)

408 GPGPUs (Nvidia)

Filesystems
Global (Lustre)
6,600 TB
Taurus (Lustre)
44 TB

SGI UV 2000
SMP system, Tier 3 HPC system

11 TFlop/s (Peak Performance), 8 TB (Main Memory)

512 CPU Cores (Intel)

Filesystems
Global (Lustre)
6,600 TB
Venus (Lustre)
60 TB