
HPC in Germany - Who? What? Where?

59 HPC systems, 33 file systems, 4 archives at 35 locations

Rechenzentrum

Freiburg im Breisgau
GPU nodes, Tier 3 HPC system

155.00 TiB (Main Memory)

256 Nodes, 38,848 CPU Cores (AMD, Intel)

52 GPGPUs (AMD, Nvidia)


Fachgruppe Physik

Wuppertal
Megware
GPU nodes, Tier 3 HPC system

78.00 TiB (Main Memory)

281 Nodes, 17,968 CPU Cores (AMD, Intel)

40 GPGPUs (Nvidia)


Deutscher Wetterdienst

Offenbach
HPC-System LU
NEC SX Aurora Tsubasa A412, SX Aurora Tsubasa C401
Tier 2 HPC system, VE nodes

16,578 TFlop/s (Peak Performance), 319.00 TiB (Main Memory)

645 Nodes, 54,400 CPU Cores (NEC)


HPC-System LU
NEC SX Aurora Tsubasa A412, SX Aurora Tsubasa C401
Tier 2 HPC system, VE nodes

12,775 TFlop/s (Peak Performance), 246.00 TiB (Main Memory)

497 Nodes, 41,920 CPU Cores (NEC)


Deutsches Elektronen Synchrotron

Hamburg
GPU nodes, Tier 3 HPC system

37 TFlop/s (Peak Performance)

43 Nodes, 2,416 CPU Cores (Intel, AMD)

14 GPGPUs (Nvidia)


Tier 3 HPC system

16 TFlop/s (Peak Performance)

143 Nodes, 2,288 CPU Cores (Intel)


HPC-System Maxwell
Unknown
GPU nodes, MPP system, Tier 3 HPC system

2,321 TFlop/s (Peak Performance), 393.00 TiB (Main Memory)

764 Nodes, 26,668 CPU Cores (Intel, AMD)

282 GPGPUs (Nvidia)


Deutsches Klimarechenzentrum GmbH

Hamburg
HPC-System Levante
Atos BullSequana XH2000
GPU nodes, MPP system, SMP nodes, Tier 2 HPC system

16,600 TFlop/s (Peak Performance), 863.00 TiB (Main Memory)

3,042 Nodes, 389,376 CPU Cores (AMD)

240 GPGPUs (Nvidia)


ZDV

Tübingen
GPU nodes, Tier 3 HPC system

543 TFlop/s (Peak Performance), 41.00 TiB (Main Memory)

302 Nodes, 6,120 CPU Cores (Intel)

124 GPGPUs (Nvidia)


Competence Center High Performance Computing (CC-HPC)

Kaiserslautern
Dell PowerEdge
GPU nodes, MPP system, SMP nodes, Tier 3 HPC system

67 TFlop/s (Peak Performance), 14.00 TiB (Main Memory)

198 Nodes (Unknown FDR-Infiniband), 3,224 CPU Cores (Intel)

2 GPGPUs (Nvidia)


Filesystems
Beehive (BeeGFS): 240 TB
GPU nodes, MPP system, Tier 3 HPC system, Xeon-Phi nodes

35 TFlop/s (Peak Performance), 6.00 TiB (Main Memory)

90 Nodes (Unknown FDR-Infiniband, QDR-Infiniband), 1,584 CPU Cores (Intel)

3 GPGPUs (Nvidia), 2 Many Core Processors (Intel)


Filesystems
Seislab (BeeGFS): 600 TB

Gesellschaft für wissenschaftliche Datenverarbeitung mbH Göttingen

Göttingen
Sysgen GPU Cluster, Compute Cluster, Transtec Compute Cluster, DELTA Computer Compute Cluster, Atos Intel PCSD, Clustervision Compute Cluster
GPU nodes, MPP system, Tier 3 HPC system

2,883 TFlop/s (Peak Performance), 90.00 TiB (Main Memory)

402 Nodes (Unknown QDR-Infiniband, FDR-Infiniband, Intel Omnipath, FDR-Infiniband), 16,640 CPU Cores (Intel, AMD)

278 GPGPUs (Nvidia), 302 Applications (454 Versions)


Filesystems
Bioinformatics WORK (BeeGFS-Storage-Server): 243 TB
MPS WORK (Storage Server): 140 TB
SCC HOME (StorNext - LAN Client): 1900 TB
SCC HOME (StorNext - NFS): 900 TB
SCC scratch2 (Storage Server): 110 TB
SCC scratch (MDC) (Storage Server): 1865 TB
Archives
SCC Archivspeicher (Scalar i6000): 22000 TB
MPP system, Tier 2 HPC system

8,261 TFlop/s (Peak Performance), 487.00 TiB (Main Memory)

1,473 Nodes (Intel Omnipath), 116,152 CPU Cores (Intel)

12 GPGPUs (Nvidia)


Filesystems
HLRN-IV Work (EXAScaler): 9412 TB
HLRN-IV Home (GRIDScaler): 340 TB
Archives
HLRN Archivspeicher (Scalar i6000): 7037 TB
HPC-System Grete
Megware GPU Server System
GPU nodes, GPU system, Tier 2 HPC system

14,760 TFlop/s (Peak Performance), 74.00 TiB (Main Memory)

126 Nodes, 8,632 CPU Cores (Intel, AMD)

504 GPGPUs (Nvidia)


Zentrum für Informations- und Medientechnologie

Düsseldorf
SuperMicro, Unknown, Nvidia DGX A100
GPU nodes

64.00 TiB (Main Memory)

212 Nodes, 6,820 CPU Cores (AMD, Intel, Alpha Data)

230 GPGPUs (Nvidia)


Department of Information Services and Computing

Dresden
In the HZDR data center
NEC
FPGA nodes, GPU nodes, MPP system, SMP nodes, Tier 3 HPC system

6,800 TFlop/s (Peak Performance), 255.00 TiB (Main Memory)

357 Nodes, 29,776 CPU Cores (Intel, AMD)

268 GPGPUs (Nvidia), 2 FPGAs (AMD XILINX)


hessian.AI

Darmstadt
Hewlett Packard Enterprise (HPE) Apollo 6500
GPU nodes

8,830 TFlop/s (Peak Performance), 159.00 TiB (Main Memory)

81 Nodes, 2,784 CPU Cores (AMD)

632 GPGPUs (Nvidia)


Hochschulrechenzentrum

Darmstadt
Intel Cascadelake, Lenovo ThinkSystem SD650 V3
GPU nodes, SMP nodes, Tier 2 HPC system

8,500 TFlop/s (Peak Performance), 549.00 TiB (Main Memory)

1,229 Nodes (HDR100-Infiniband, Mellanox HDR100-Infiniband), 122,816 CPU Cores (Intel, AMD)

84 GPGPUs (Nvidia, Intel)


Filesystems
GLOBAL (DSS): 4000 TB

Höchstleistungsrechenzentrum Stuttgart

Stuttgart
Hewlett Packard Enterprise (HPE)
GPU nodes

24.00 TiB (Main Memory)

24 Nodes, 3,072 CPU Cores (AMD)

192 GPGPUs (Nvidia)


HPC-System vulcan
NEC
GPU system, Tier 3 HPC system, VE nodes

120.00 TiB (Main Memory)

476 Nodes, 14,480 CPU Cores (Intel, NEC)

16 GPGPUs (AMD, Nvidia)


HPC-System Hunter
Hewlett Packard Enterprise (HPE) Cray EX4000
GPU nodes, MPP system, Tier 1 HPC system

48,100 TFlop/s (Peak Performance), 286.00 TiB (Main Memory)

444 Nodes, 34,432 CPU Cores (AMD)

752 GPGPUs (AMD)


IT Center of RWTH Aachen University

Aachen
CLAIX-2023 HPC System
NEC HPC1808Rk-2
GPU nodes, MPP system, SMP nodes, Tier 2 HPC system, Tier 3 HPC system

11,484 TFlop/s (Peak Performance), 226.00 TiB (Main Memory)

684 Nodes, 65,664 CPU Cores (Intel)

208 GPGPUs (Nvidia)


Jülich Supercomputing Centre (JSC)

Jülich
Supercomputer JUWELS at the Jülich Supercomputing Centre
Atos BullSequana X1000
GPU nodes, MPP system, SMP nodes, Tier 1 HPC system

85,000 TFlop/s (Peak Performance), 749.00 TiB (Main Memory)

3,515 Nodes (Mellanox EDR-Infiniband, HDR200-Infiniband), 168,208 CPU Cores (Intel, AMD)

3,956 GPGPUs (Nvidia)


HPC-System JURECA
Atos BullSequana XH2000
GPU nodes, MPP system, SMP nodes, Tier 2 HPC system

18,520 TFlop/s (Peak Performance), 444.00 TiB (Main Memory)

780 Nodes (Mellanox HDR100-Infiniband), 99,840 CPU Cores (AMD)

768 GPGPUs (Nvidia)


HPC-System JUPITER Exascale Development Instrument
Eviden BullSequana XH3000
GPU nodes, GPU system, MPP system, Tier 1 HPC system

94,000 TFlop/s (Peak Performance), 23.00 TiB (Main Memory)

48 Nodes, 13,824 CPU Cores (Nvidia)

192 GPGPUs (Nvidia)


Zuse-Institut Berlin

Berlin-Dahlem
HPC-System Lise
Atos Bull Intel cluster, Eviden
MPP system, Tier 2 HPC system

10,707 TFlop/s (Peak Performance), 677.00 TiB (Main Memory)

1,192 Nodes (Intel Omnipath, NDR-Infiniband), 113,424 CPU Cores (Intel, AMD)

42 GPGPUs (Nvidia)


Filesystems
HLRN-IV WORK (DDN EXAScaler): 8192 TB
HLRN-IV HOME (DDN GRIDScaler): 340 TB
Archives
Archivspeicher (StorageTEK SL8500): 25000 TB

Leibniz-Rechenzentrum der Bayerischen Akademie der Wissenschaften

Garching bei München
Lenovo ThinkSystem SD650 DWC, ThinkSystem SD650-I V3
GPU nodes, MPP system, SMP nodes, Tier 1 HPC system

54,860 TFlop/s (Peak Performance), 822.00 TiB (Main Memory)

6,720 Nodes (Intel Omnipath, Mellanox HDR200-Infiniband), 337,920 CPU Cores (Intel)

960 GPGPUs (Intel)


Max Planck Computing & Data Facility

Garching
Lenovo ThinkSystem SD650 V2
GPU nodes, MPP system, SMP nodes, Tier 2 HPC system

24,800 TFlop/s (Peak Performance), 517.00 TiB (Main Memory)

1,784 Nodes, 128,448 CPU Cores (Intel)

768 GPGPUs (Nvidia)


HPC-System Viper
Eviden XH3000
MPP system, SMP nodes, Tier 2 HPC system

4,900 TFlop/s (Peak Performance), 445.00 TiB (Main Memory)

768 Nodes, 98,304 CPU Cores (AMD)


HPC-System Viper-GPU
Eviden BullSequana XH3000
GPU system, MPP system, Tier 2 HPC system

36,000 TFlop/s (Peak Performance), 75.00 TiB (Main Memory)

300 Nodes, 14,400 CPU Cores (AMD)

600 GPGPUs (AMD)


Paderborn Center for Parallel Computing

Paderborn
Arbor Workstation 2HE C612 INTEL
FPGA nodes, Tier 3 HPC system

512.00 GiB (Main Memory)

8 Nodes, 32 CPU Cores (Intel)

16 FPGAs (Alpha Data)


Cray CS500
GPU nodes, MPP system, Tier 3 HPC system

835 TFlop/s (Peak Performance), 51.00 TiB (Main Memory)

274 Nodes (Intel Omnipath), 10,960 CPU Cores (Intel)

18 GPGPUs (Nvidia)


Filesystems
Scratch (Cray ClusterStor): 720 TB
Atos BullSequana XH2000, Nvidia DGX A100
FPGA nodes, GPU nodes, MPP system, Tier 2 HPC system

7,100 TFlop/s (Peak Performance), 347.00 TiB (Main Memory)

1,121 Nodes (Mellanox HDR100-Infiniband), 143,488 CPU Cores (AMD)

136 GPGPUs (Nvidia), 80 FPGAs (Bittware, AMD XILINX)


Filesystems
Scratch (DDN Exascaler 7990X with NVMe accelerator): 6000 TB

Regionales Hochschulrechenzentrum Kaiserslautern-Landau (RHRZ)

Kaiserslautern
HPC-Cluster Elwetritsch II at TUK
Dell PowerEdge R, NEC HPC1812Rh-1, Nvidia DGX-2, DGX-1
GPU nodes, GPU system, MPP system, SMP nodes, SSD storage, Tier 3 HPC system, Xeon-Phi nodes

3,072 TFlop/s (Peak Performance), 52.00 TiB (Main Memory)

489 Nodes (Unknown QDR-Infiniband, Intel Omnipath), 10,520 CPU Cores (Intel, AMD)

56 GPGPUs (Nvidia), 56 Applications (228 Versions)


NEC HPC2824Ri-2, DELTA Computer Products GmbH D22z-M2-ZG, Nvidia DGX B200
GPU nodes, GPU system, SMP nodes, Tier 3 HPC system

489 TFlop/s (Peak Performance), 68.00 TiB (Main Memory)

244 Nodes (Omnipath), 4,240 CPU Cores (AMD, Intel)

56 GPGPUs (Nvidia)


Hochschulrechenzentrum

Bonn
Unknown
GPU nodes, SMP nodes, Tier 3 HPC system

304.00 TiB (Main Memory)

277 Nodes, 28,384 CPU Cores (AMD, Intel)

320 GPGPUs (Nvidia)


IT.SERVICES

Bochum
Unknown
GPU nodes, Tier 3 HPC system

151.00 TiB (Main Memory)

324 Nodes, 16,512 CPU Cores (AMD)

116 GPGPUs (Nvidia)


URZ

Heidelberg
GPU nodes, SMP nodes, Tier 3 HPC system

147.00 TiB (Main Memory)

432 Nodes, 27,648 CPU Cores (AMD)

276 GPGPUs (Nvidia)


Scientific Computing Center

Karlsruhe
bwUniCluster 2.0 Stufe 1
Hewlett Packard Enterprise (HPE) ProLiant, Lenovo ThinkSystem
GPU nodes, MPP system, SMP nodes, Tier 3 HPC system

155.00 TiB (Main Memory)

837 Nodes (HDR200-Infiniband), 40,608 CPU Cores (Intel)

196 GPGPUs (Nvidia)


HoreKa
Lenovo ThinkSystem SD650 V2
GPU nodes, MPP system, Tier 2 HPC system

18,520 TFlop/s (Peak Performance), 242.00 TiB (Main Memory)

769 Nodes (HDR200-Infiniband), 58,444 CPU Cores (Intel)

668 GPGPUs (Nvidia)


Scientific Computing Center

Eggenstein-Leopoldshafen

IT und Medien Centrum

Dortmund
Megware
GPU nodes, Tier 3 HPC system

30.00 TiB (Main Memory)

366 Nodes, 8,160 CPU Cores (Intel)

40 GPGPUs (Nvidia)


Universität Bielefeld – Fakultät Physik

Bielefeld
GPU system, Tier 3 HPC system

11.00 TiB (Main Memory)

28 Nodes, 560 CPU Cores (Intel)

224 GPGPUs (Nvidia)


Zentrum für Informations- und Mediendienste

Essen
NEC
Tier 3 HPC system

55.00 TiB (Main Memory)

624 Nodes, 14,976 CPU Cores (Intel)


HPC-System amplitUDE
Megware
GPU nodes, SMP nodes, Tier 3 HPC system

5,500 TFlop/s (Peak Performance), 177.00 TiB (Main Memory)

259 Nodes, 29,008 CPU Cores (Intel)

68 GPGPUs (Nvidia)


Regionales Rechenzentrum (RRZ)

Hamburg
HPC-System Hummel
NEC
GPU nodes, SMP nodes, Tier 3 HPC system

2,640 TFlop/s (Peak Performance), 147.00 TiB (Main Memory)

182 Nodes, 34,432 CPU Cores (AMD)

32 GPGPUs (Nvidia)


Zentrum für Informations- und Medientechnologie

Siegen
Unknown
GPU nodes, Tier 3 HPC system

119.00 TiB (Main Memory)

459 Nodes, 29,376 CPU Cores (Intel, AMD)

20 GPGPUs (Nvidia)


Kommunikations- und Informationszentrum (kiz)

Ulm
FPGA nodes, GPU nodes, Tier 3 HPC system

188.00 TiB (Main Memory)

692 Nodes, 33,216 CPU Cores (Intel)

28 GPGPUs (Nvidia), 2 FPGAs (Bittware)


Zentrum für Informationsverarbeitung

Münster
Megware

67.00 TiB (Main Memory)

412 Nodes, 15,120 CPU Cores (Intel)


Zentrum für Datenverarbeitung

Mainz
HPC-System Mogon 2
Megware MiriQuid, NEC NEC Cluster, Intel NEC Cluster
MPP system, SSD storage, Tier 2 HPC system

3,125 TFlop/s (Peak Performance), 190.00 TiB (Main Memory)

1,948 Nodes (Intel Omnipath), 52,248 CPU Cores (Intel)

188 GPGPUs (Nvidia, NEC)


Filesystems
Mogon (GPFS): 1024 TB
Lustre (Lustre): 7394 TB
MOGON NHR Süd-West in the JGU server room
GPU nodes, Tier 2 HPC system

2,800 TFlop/s (Peak Performance), 219.00 TiB (Main Memory)

600 Nodes (HDR100-Infiniband), 76,800 CPU Cores (AMD)

40 GPGPUs (Nvidia)


Center for Information Services and High Performance Computing

Dresden
NEC HPC 22S8Ri-4

5,443 TFlop/s (Peak Performance), 34.00 TiB (Main Memory)

34 Nodes (HDR200-Infiniband), 1,632 CPU Cores (AMD)

272 GPGPUs (Nvidia)


Filesystems
walrus (ES400NVX2): 20000 TB
/home (Intelliflash): 2000 TB
HPC-System Barnard
Eviden XH2000

4,050 TFlop/s (Peak Performance), 315.00 TiB (Main Memory)

630 Nodes (HDR100-Infiniband), 65,520 CPU Cores (Intel)


Filesystems
horse (ES400NVX2): 20000 TB
walrus (ES400NVX2): 20000 TB
/home (Intelliflash): 2000 TB
NEC Gigabyte

762 TFlop/s (Peak Performance), 94.00 TiB (Main Memory)

188 Nodes (HDR200-Infiniband), 24,064 CPU Cores (AMD)


Filesystems
horse (ES400NVX2): 20000 TB
walrus (ES400NVX2): 20000 TB
/home (Intelliflash): 2000 TB
Hewlett Packard Enterprise (HPE) Superdome Flex

28 TFlop/s (Peak Performance), 48.00 TiB (Main Memory)

25,088 CPU Cores (Intel)


Filesystems
horse (ES400NVX2): 20000 TB
walrus (ES400NVX2): 20000 TB
/home (Intelliflash): 2000 TB
Megware Lenovo ThinkSystem SD665-N V3

39,664 TFlop/s (Peak Performance), 111.00 TiB (Main Memory)

148 Nodes (HDR200-Infiniband), 9,472 CPU Cores (AMD)

592 GPGPUs (Nvidia)


Filesystems
horse (ES400NVX2): 20000 TB
walrus (ES400NVX2): 20000 TB
/home (Intelliflash): 2000 TB
cat (Weka-Cluster): 2000 TB

Erlangen National Center for High Performance Computing (NHR@FAU)

Erlangen
Megware Linux-Cluster
MPP system, Tier 3 HPC system

511 TFlop/s (Peak Performance), 46.00 TiB (Main Memory)

728 Nodes (Intel Omnipath), 14,560 CPU Cores (Intel)

21 Applications (61 Versions)


Filesystems
Global (GPFS/NFS): 50 TB
Saturn (HPE Apollo 4510 Gen10): 450 TB
Titan (HPE Apollo 4510 Gen10): 450 TB
Fundus-Disk (GPFS): 6000 TB
Atuin (PowerEdge XE7100/XE7440): 1000 TB
Archives
Fundus-Tape (IBM TS4500): 15000 TB
HPC-System Alex
Megware GPGPU Cluster, GPU-Cluster, NF5488A5
GPU nodes, GPU system, Tier 2 HPC system, Tier 3 HPC system

6,080 TFlop/s (Peak Performance), 78.00 TiB (Main Memory)

82 Nodes (Mellanox HDR200-Infiniband, HDR200-Infiniband), 10,496 CPU Cores (AMD)

656 GPGPUs (Nvidia)


Filesystems
Global (GPFS/NFS): 50 TB
Saturn (HPE Apollo 4510 Gen10): 450 TB
Titan (HPE Apollo 4510 Gen10): 450 TB
Fundus-Disk (GPFS): 6000 TB
Atuin (PowerEdge XE7100/XE7440): 1000 TB
Fritz (Lustre): 3700 TB
Archives
Fundus-Tape (IBM TS4500): 15000 TB
HPC-System Fritz
Megware D50TNP
MPP system, Tier 2 HPC system, Tier 3 HPC system

5,450 TFlop/s (Peak Performance), 248.00 TiB (Main Memory)

992 Nodes (Mellanox HDR100-Infiniband), 71,424 CPU Cores (Intel)


Filesystems
Global (GPFS/NFS): 50 TB
Saturn (HPE Apollo 4510 Gen10): 450 TB
Titan (HPE Apollo 4510 Gen10): 450 TB
Fundus-Disk (GPFS): 6000 TB
Atuin (PowerEdge XE7100/XE7440): 1000 TB
Fritz (Lustre): 3700 TB
Archives
Fundus-Tape (IBM TS4500): 15000 TB
HPC-System Helma
Megware Lenovo ThinkSystem SD665-N V3
GPU system, Tier 2 HPC system

51,920 TFlop/s (Peak Performance), 144.00 TiB (Main Memory)

192 Nodes, 24,576 CPU Cores (AMD)

768 GPGPUs (Nvidia)
