Introduction

The CIT (Center for Information Technology, formerly WWU IT) provides HPC resources to help researchers conduct computationally demanding tasks. For this purpose, we host the PALMA-II HPC cluster. It is meant for applications that are highly parallelized and can take advantage of hundreds or thousands of cores simultaneously and/or need very large amounts of memory.

The HPC team maintains the cluster and tries to keep it operational 24/7. We also provide a default set of software to use on the cluster and do our best to help with installing or optimizing programs you want to run. However, our time is limited, and sometimes an application is simply not suited for this use case. Please inform yourself about what resources you require and whether you can actually make use of an HPC cluster.

What HPC is not

The cluster is not simply a very fast computer replacing your laptop or workstation. Programs on HPC clusters normally do not use a GUI, and calculations are typically submitted to a batch system via so-called job scripts (see the sketch below). Users should be familiar with the Linux command line and should thoroughly read the information provided in this wiki.
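As a hedged illustration only (we assume a Slurm batch system here; the partition name and module name are placeholders, so check the job submission pages of this wiki for the actual values on PALMA-II), a minimal job script could look like this:

    #!/bin/bash
    #SBATCH --job-name=my_first_job   # name shown in the queue
    #SBATCH --nodes=1                 # number of nodes to allocate
    #SBATCH --ntasks=4                # number of tasks (cores) to use
    #SBATCH --time=01:00:00           # wall-clock time limit (hh:mm:ss)
    #SBATCH --mem=4G                  # memory per node
    #SBATCH --partition=normal        # ASSUMPTION: partition names differ per cluster

    # Load the software environment your program needs (example module name)
    module load GCC

    # Run the actual calculation
    srun ./my_program

You would submit such a script with "sbatch jobscript.sh" and monitor it with "squeue -u $USER"; both are standard Slurm commands.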

Latest News and Announcements

(The four most recent pages from the News section are listed here.)

Warning: CALCULATIONS ON THE LOGIN NODE ARE NOT ALLOWED!

The login node of PALMA is not a place to start any serious calculations, nor is it a playground for testing or compiling programs! Any user process violating this rule will be terminated immediately and without warning!
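If you need to test or compile something interactively, request a compute node from the batch system instead of working on the login node. A minimal sketch, assuming a Slurm scheduler (the partition name "express" is an assumption; consult the job submission pages for the real names):

    # Request an interactive shell on a compute node for 30 minutes
    # (the partition name is an assumption and may differ on PALMA-II)
    srun --partition=express --nodes=1 --ntasks=1 --time=00:30:00 --pty bash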


Users who are completely new to the Linux command line and HPC are strongly advised to take the Introduction to Linux in HPC tutorial.


Requirements

Access TL;DR

  1. Register your research group by filling out the form and sending it via e-mail; wait until you hear back from us (only once per group, done by the head of the group).
  2. Register for u0clstr in the IT Portal, then WAIT 24 HOURS!
  3. Upload your SSH key in the IT Portal ("HPC-Systems" only becomes visible once you have finished step 2).
  4. Register for the mailing list and join the Mattermost HPC-Users channel.
  5. Log in via SSH (a sketch follows below).
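A minimal sketch of the key setup and login, assuming the login host is palma.uni-muenster.de (an assumption; replace <username> with your university account and verify the hostname on the login pages of this wiki):

    # 1. Generate an SSH key pair on your local machine (only once)
    ssh-keygen -t ed25519 -f ~/.ssh/id_palma

    # 2. Upload the public key (~/.ssh/id_palma.pub) in the IT Portal (step 3 above)

    # 3. Log in (the hostname is an assumption; check the wiki for the actual address)
    ssh -i ~/.ssh/id_palma <username>@palma.uni-muenster.de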
In order to gain access to PALMA-II, you have to:
  • Register your research group for the usage of the HPC resources.
    This has to be done only once per research group. The application form can be found below, along with a quick-reference card guiding you through the steps of your application.

Application

File: hpc_application_wwuit.pdf


Quick-Reference-Card

File: Quick_Ref_Card_Muenster.pdf

Note

1. After registering, it can take up to 24h until you can log in to PALMA-II!

2. It is not sufficient to just be a member of u0clstr. You also have to be a member of a registered research group!

3. "Registered research group" means one of the research groups of the University of Münster that either already preexist in the IT Portal or which can be created following the rules outlined here. Group names consist of up to eight characters, always starting with a letter, followed by a digit, followed by a character string, as in "u0clstr". Do not just invent arbitrary names on your application form. Projects can only be created by project leaders, i.e. regular staff members. Join the group of your professor if you are a student. You cannot register groups with a broad scope like "u0dawin", "p0stud", "u0mitarb" and the like. Groups at institute level are OK only if the institute is small; otherwise a group at chair level is preferable. If you still do not know what a research group is, please ask before sending in an application.

Detailed information about the registration process can be found here.

Publications

We kindly ask you to record any publication you made with the help of our HPC systems in CRIS.WWU, the research portal of the University of Münster, with an appropriate note. Instructions can be found here.

Lectures

Once a semester, there is a lecture about parallel programming and the usage of our HPC system. This lecture is intended for people with prior knowledge of Linux and C/C++ or Fortran programming. It is not an introductory course; however, if you just want to inform yourself about the basic usage of PALMA-II, you are welcome to join the first day of this lecture.

Furthermore, please consider acknowledging the use of the PALMA II cluster in your publications:

Acknowledgement

Calculations (or parts of them) for this publication were performed on the HPC cluster PALMA II of the University of Münster, subsidised by the DFG (INST 211/667-1).

News on HPC

Get important updates and news on HPC topics through our mailing list and have a look at the News-Archive page!

Join the HPC-Users channel on Mattermost.

Getting started

A first overview can be found in the Getting started section. More specifics can be found on the individual pages of this wiki.

Lectures

We offer a one-week Introduction to Linux for HPC seminar. Users with and without prior knowledge of Linux and HPC can choose to participate for the whole week or just on individual days covering specific topics. For more information, follow the provided link.

Contact & Support

If you have questions regarding High Performance Computing, you can write to hpc@uni-muenster.de (in English or German).

A few dos and don'ts when writing:

  • Please use your uni-muenster.de e-mail address, or at least specify your user account when contacting us.
  • Don't send log files, submission scripts, or the like; if possible, just send the path to those files on the cluster.

HPC consulting hour

We also offer an HPC consulting hour ("Sprechstunde") via Zoom, where you can talk directly to one of our support staff:

  • Every Tuesday from 11 am to 12 pm
  • Zoom Link will be in the Mattermost HPC-Users channel (see above).

HPC.NRW

The WWU is part of the North Rhine-Westphalia Competence Network for High Performance Computing (HPC.NRW).

It offers a competent first point of contact and a central advisory hub with a broad knowledge base for HPC users in NRW.

Within the framework of the HPC.NRW competence network, a network of thematic clusters for low-threshold training, consulting, and coaching services has been created. The aim is to make effective and efficient use of high-performance computing and storage facilities and to support scientific researchers at all levels. The existing resources and services that the state has to offer are also presented in a transparent way.


General HPC WIKI

In a joint effort of several German universities, many useful tips and general information about HPC can be found at https://hpc-wiki.info.



PALMA-II at a Glance

Manufacturer: MEGWARE
Cores: 16,272
Memory: 77,568 GB
Nodes: 444
Processor: Intel Xeon Gold 6140 18C @ 2.30GHz (Skylake)
Interconnect: 100 Gbit/s Intel Omni-Path
GPFS Storage: 2.4 PB
Linpack Performance: Rmax 800 TFlop/s, Rpeak 1,277 TFlop/s
OS: CentOS 7