- Created by IT Admin, last modified by Potthoff, Sebastian on Dec 03, 2019
The ZIV provides HPC resources to help researchers conduct computationally demanding tasks. For this purpose, we host the PALMA-II HPC cluster.
Requirements
In order to gain access to PALMA-II, you have to:
- Register for the group u0clstr at MeinZIV
- Change your main group according to your work group at MeinZIV
- Register at our HPC mailing list
- Read the how-to guides provided on this website
After registering, it can take up to 24 hours before you can log in to PALMA-II!
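Once your account is active, access to HPC clusters of this kind is typically via SSH. The following is a minimal sketch of an SSH client configuration; the host name and user name below are placeholders, not the confirmed PALMA-II login address — check the Getting started pages for the actual connection details.

```
# ~/.ssh/config — connection sketch; host name is a placeholder,
# not the confirmed PALMA-II login address.
Host palma
    HostName palma.uni-muenster.de   # assumed login node, verify in the docs
    User your_ziv_username           # replace with your university account
```

With such an entry in place, `ssh palma` opens a session without retyping the full address each time.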
Publications
We kindly ask you to record any publication made with the help of our HPC systems in the research database with an appropriate note. Instructions can be found here.
Lectures
Once a semester, there is a lecture about parallel programming and the usage of our HPC system. This lecture is intended for people with prior knowledge of Linux and C/C++ or Fortran programming. It is not an introductory course; however, if you just want to learn about the basic usage of PALMA-II, you are welcome to join the first day of the lecture.
News on HPC
Get important updates and news on HPC topics through our mailing list and have a look at the News page!
Getting started
A first overview can be found in the Getting started section. More specific information can be found on the individual pages in the sidebar.
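Clusters of this class are usually operated through a batch scheduler such as SLURM, where work is described in a job script and submitted to a queue rather than run interactively. Assuming PALMA-II follows this model (the Getting started section has the authoritative details), a minimal job script might look like the sketch below; the partition name and resource values are illustrative placeholders.

```shell
#!/bin/bash
# Minimal SLURM batch-script sketch. Partition and resource values
# are illustrative placeholders; consult the cluster documentation.
#SBATCH --job-name=hello          # job name shown in the queue
#SBATCH --nodes=1                 # request a single node
#SBATCH --ntasks=1                # one task (process)
#SBATCH --time=00:05:00           # wall-clock limit (hh:mm:ss)
#SBATCH --partition=normal        # placeholder partition name

# The actual workload: print the compute node's host name
echo "Hello from $(hostname)"
```

Such a script would be submitted with `sbatch job.sh`, and `squeue -u $USER` shows its status in the queue.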
Contact
If you have questions regarding High Performance Computing at the ZIV, you can write to hpc@uni-muenster.de.
General HPC WIKI
In a joint effort of several German universities, many useful tips and general information about HPC can be found at https://hpc-wiki.info.
Manufacturer | MEGWARE
Cores | 15,120
Memory | 72,384 GB
Nodes | 412
Processor | Intel Xeon Gold 6140 18C @ 2.30 GHz (Skylake)
Interconnect | 100 Gbit/s Intel Omni-Path
GPFS Storage | 1 PB
Linpack Performance | Rmax: 800 TFlop/s, Rpeak: 1,277 TFlop/s