Information about the HPC-Cluster

To get access to the HPC resources of the physics department, send an email to hpc@physik.fu-berlin.de. If possible, please include some information on the kind of jobs you are planning to run and the software you plan to use.

2015-06-16: There are currently two HPC clusters in production at the physics department. One is located in the HLRN datacenter, the other at ZEDAT. See the table below for further information on the available resources. The two clusters share only a common /home; everything else, such as the queuing system and /scratch, is separate. Both run Debian/Wheezy and use the Slurm scheduling system.
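Jobs on both clusters are submitted through Slurm. As a rough illustration only (job name, resource values and time limit below are placeholders, not cluster-specific defaults; pick a partition with --partition as appropriate for the respective cluster), a minimal batch script could look like this:

  #!/bin/bash
  #SBATCH --job-name=example        # name shown in the queue
  #SBATCH --ntasks=1                # number of tasks (e.g. MPI ranks)
  #SBATCH --cpus-per-task=1         # cores per task
  #SBATCH --mem-per-cpu=2G          # memory per core
  #SBATCH --time=01:00:00           # wall-clock limit hh:mm:ss
  #SBATCH --output=example-%j.out   # %j is replaced by the job ID

  # Heavy I/O should go to the cluster-local /scratch rather than the
  # shared /home; the /scratch layout differs between the two clusters.
  hostname

The script is submitted with sbatch example.sh and its state can be checked with squeue -u $USER. See the Slurm documentation linked below for the full set of options.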

Slurm documentation

General documentation

Overview of available resources

The following table lists the HPC resources available at the physics department. At the end of the table we also list the resources of the ZEDAT soroban cluster. The tron cluster at Takustrasse 9 is currently being restructured. We also have some special-purpose nodes that are currently managed through Slurm.

^ Hosts ^ Manager ^ Nodes ^ Form ^ Hardware ^ CPU ^ Speed ^ Cores/Node ^ RAM/Core ^ RAM/Node ^ Total RAM ^ Total cores ^
^ tron cluster - FB Physik - Location: Takustrasse 9 ||||||||||||
| n010-n041 | offline | 32 | 2U Twin2 | Dell C6100 | 2x Xeon X5650 | 2.66GHz | 12 | 8G | 96G | 3072G | 384 |
| n110-n111 | offline | 2 | 2U | Dell C6145 | 4x Opteron 6128HE | 2.0GHz | 32 | 4G | 128G | 256G | 64 |
| n112-n127 | offline | 16 | Blade | Dell M600 | 2x Xeon E5450 | 3.00GHz | 8 | 2G | 16G | 256G | 128 |
| n128-n143 | offline | 16 | Blade | Dell M600 | 2x Xeon E5450 | 3.00GHz | 8 | 2G | 16G | 256G | 128 |
| n144-n175 | offline | 32 | Blade | Dell M610 | 2x Xeon X5570 | 2.93GHz | 8 | 6G | 48G | 1536G | 256 |
| n176-n183 | offline | 8 | 4U | HP DL580 | 4x Xeon X7560 | 2.26GHz | 32 | 8G | 256G | 2048G | 256 |
| #Taku9 | | 106 | | | | | | | | 7424G | 1216 |
^ sheldon/leonard cluster - FB Physik - Location: HLRN - OS: Debian/Jessie ||||||||||||
| x001-x192 | SLURM 1) | 192 | Blade | SGI Altix ICE 8200 | 2x Xeon X5570 | 2.93GHz | 8 | 6G | 48G | 9216G | 1536 |
| uv1000 | none | 1 | 42U | SGI UV 1000 | 64x Xeon X7560 | 2.26GHz | 512 | 4G | 2T | 2048G | 512 |
| #HLRN | | 193 | | | | | | | | 11264G | 2048 |
^ yoshi cluster - FB Physik - Location: ZEDAT - OS: Debian/Wheezy ||||||||||||
| y001-y128 | SLURM 1) | 128 | Blade | HP BL460c G6 | 2x Xeon X5570 | 2.93GHz | 8 | 6G | 48G | 6144G | 1024 |
| ygpu01-ygpu31 | SLURM 2) | 31 | 2U GPU nodes (2x Nvidia Tesla M2070) | IBM iDataPlex dx360 M3 | 2x Xeon X5570 | 2.93GHz | 8 | 3G | 24G | 744G | 248 |
| #Ph-ZEDAT | | 159 | | | | | | | | 6888G | 1272 |
^ soroban cluster - ZEDAT-HPC - Location: ZEDAT ||||||||||||
| node001-002 | SLURM | 2 | 1U Twin | Asus Z8NH-D12 | 2x Xeon X5650 | 2.66GHz | 12 | 8G | 48G | 96G | 24 |
| node003-030 | SLURM | 28 | 1U Twin | Asus Z8NH-D12 | 2x Xeon X5650 | 2.66GHz | 12 | 4G | 24G | 672G | 336 |
| node031-100 | SLURM | 70 | 1U Twin | Asus Z8NH-D12 | 2x Xeon X5650 | 2.66GHz | 12 | 8G | 48G | 3360G | 840 |
| node101-112 | SLURM | 12 | 1U Twin | Asus Z8NH-D12 | 2x Xeon X5650 | 2.66GHz | 12 | 16G | 96G | 1152G | 144 |
| #ZEDAT | | 112 | | | | | | | | 5280G | 1344 |
^ Decommissioned systems ||||||||||||
| Abacus4 | | 8 | | IBM p575 | 16x POWER 5+ | 1.9GHz | 32 | 4G | 128G | 1024G | 256 |

Operating System: Debian Linux Squeeze (x64)
1) in production but still experimental
2) work in progress
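Which partitions and nodes are actually available at any given time can be queried from Slurm directly; the following is just one possible selection of output columns (the node name y001 is taken from the table above as an example):

  # partitions with node count, cores per node and memory per node (in MB)
  sinfo --format="%P %D %c %m"
  # detailed state of a single node, e.g. one of the yoshi nodes
  scontrol show node y001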

(15.03.2014)
