Science and Innovation Alliance Kaiserslautern e.V.

Gauß-Allianz
Member

About Science and Innovation Alliance Kaiserslautern e.V.

TU Kaiserslautern and the Fraunhofer ITWM are members of the Science and Innovation Alliance Kaiserslautern e.V. and have formed within this association a strategic alliance for high-performance computing in Kaiserslautern. TU Kaiserslautern operates the high-performance computer Elwetritsch at the Regional University Computing Centre Kaiserslautern (RHRK), while the Fraunhofer ITWM runs three parallel computers at its Competence Center High Performance Computing (CC-HPC). In addition, the RHRK provides the HPC infrastructure for a wide range of research projects in Kaiserslautern, for example SFB-TR 49 "Condensed Matter Systems with Variable Many-Body Interactions", SFB-TR 88 "Cooperative Effects in Homo- and Heterometallic Complexes", and SFB 926 "Component Surfaces: Morphology on the Microscale". The HPC infrastructure at the Fraunhofer ITWM is used by the institute's 400 employees for developments in materials research, commercial vehicle technology, and optimization and process engineering, in close collaboration with scientists at TU Kaiserslautern. Fraunhofer ITWM staff use this infrastructure in numerous EU and BMBF projects, including in particular European exascale projects.

HPC Expertise

  • Algorithmic differentiation and optimization
  • Programming models & standards
  • Visualization
  • Geosciences/geophysics
  • Fluid mechanics, mechanics
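The first item above, algorithmic differentiation, computes exact derivatives of a program alongside its values rather than approximating them numerically. A minimal forward-mode sketch using dual numbers can illustrate the idea (an illustrative example only, not code from the ITWM or the RHRK):

```python
# Minimal forward-mode algorithmic differentiation via dual numbers.
# Illustrative sketch only; not tied to any specific ITWM/RHRK software.

class Dual:
    """A value together with its derivative (a dual number)."""
    def __init__(self, val, der=0.0):
        self.val = val   # function value
        self.der = der   # derivative w.r.t. the chosen input

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u*v)' = u'*v + u*v'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f and df/dx at x in a single forward pass."""
    result = f(Dual(x, 1.0))
    return result.val, result.der

# Example: f(x) = 3*x*x + 2*x  ->  f'(x) = 6*x + 2
val, der = derivative(lambda x: 3 * x * x + 2 * x, 4.0)
print(val, der)  # prints 56.0 26.0
```

Production AD tools extend this idea with reverse mode (adjoints) for functions of many inputs, which is what makes large-scale optimization workloads tractable on HPC systems.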

Regionales Hochschulrechenzentrum Kaiserslautern

The RHRK is a central scientific institution of the University of Kaiserslautern. The RHRK provides students, faculties, central facilities and the central administration with IT infrastructure and IT services according to their needs. In order to support its service tasks, RHRK performs its own research and development.

HPC Expertise

In its dedicated support of high-performance computing, the RHRK is responsible in particular for the following tasks:

  • to monitor and evaluate the development, use and application of methods, tools and machines in high-performance computing and to advise users at the university on the use of their local computing capacities and central high-performance computing technology,
  • to adapt and provide the latest methods and their software implementation for the solution of the typical university application spectrum on the existing architectures,
  • to provide training and individual advice on algorithms and their efficient implementation on high-performance computers,
  • to support current and potential users in the analysis and optimization of their programs,
  • to broaden the circle of users by demonstrating new application possibilities and increasing the user-friendliness of the installed high-performance computing technology,
  • to supervise bachelor, master and doctoral theses together with the departments, to participate in courses and to conduct continuous lecture and publication activities.


The RHRK coordinates its work with other centres of excellence for high-performance computing in the state of Rhineland-Palatinate and in Germany.

HPC Systems

HPC Cluster Elwetritsch II of the TUK

Elwetritsch II - Dell PowerEdge R, NEC HPC1812Rh-1, Nvidia DGX-2, DGX-1

since January 2016 - Last Update: January 2020
GPU nodes, GPU system, MPP system, SMP nodes, SSD storage, Tier 3 HPC system, Xeon-Phi nodes
  • 489 nodes
  • Interconnect: QDR-Infiniband, Intel Omnipath
  • 53 TB main memory
  • 10,520 CPU cores (Intel, AMD)
  • 56 GPGPUs (Nvidia)
  • 56 applications, 228 versions

Beehive - Dell PowerEdge

since January 2014
GPU nodes, MPP system, SMP nodes, Tier 3 HPC system
  • 67 TFlop/s peak performance
  • 198 nodes
  • Interconnect: FDR-Infiniband
  • 14 TB main memory
  • 3,224 CPU cores (Intel)
  • 2 GPGPUs (Nvidia)

Elwetritsch - Fujitsu PRIMERGY XC250/400

since August 2012
GPU nodes, MPP system, SMP nodes, Tier 3 HPC system, Xeon-Phi nodes
  • 134 TFlop/s peak performance
  • 319 nodes
  • Interconnect: QDR-Infiniband
  • 17 TB main memory
  • 5,624 CPU cores (Intel)
  • 29 GPGPUs (Nvidia)
  • 6 many-core processors (Intel)
  • 56 applications, 228 versions

Seislab

GPU nodes, MPP system, Tier 3 HPC system, Xeon-Phi nodes
  • 35 TFlop/s peak performance
  • 90 nodes
  • Interconnect: FDR-Infiniband, QDR-Infiniband
  • 6 TB main memory
  • 1,584 CPU cores (Intel)
  • 3 GPGPUs (Nvidia)
  • 2 many-core processors (Intel)

Ocean 1

Tier 3 HPC system
  • 1,450 nodes
  • 23 TB main memory
  • 11,600 CPU cores (Intel)

File Systems

WORK file system

  • Seislab (BeeGFS): 600 TB disk storage
  • Beehive (BeeGFS): 240 TB disk storage
  • Elwetritsch (BeeGFS), since April 2018: 1,285 TB disk storage

HOME file system

  • Elwetritsch (NFS): 10 TB disk storage

Contact

Science and Innovation Alliance Kaiserslautern
Paul-Ehrlich-Straße 32
67663 Kaiserslautern
Germany
