Using the P4-Xeon Cluster
Kickstart Tutorial/Seminar on using the 64-node P4-Xeon Cluster in Science Faculty
HPCCC, Science Faculty, HKBU
June 11, 2003
Aims and target audience
• Aims:
  – Provide a kickstart tutorial to potential cluster users in Science Faculty, HKBU
  – Promote the usage of the PC cluster in Science Faculty
• Target audience:
  – Science Faculty students referred by their project/thesis supervisors
  – Staff who are interested in High Performance Computing
Outline
• Brief introduction
  – Hardware, software, login and policy
• How to write and run programs on multiple CPUs
  – Simple MPI programming
  – Resources on MPI documentation
• Demonstration of software installed
  – SPRNG, BLAS, NAMD2, GAMESS, PGI
Bought by Teaching Development Grant
Hardware Configuration
• 1 master node + 64 compute nodes + Gigabit interconnection
• Master node
  – Dell PE2650, P4-Xeon 2.8GHz x 2
  – 4GB RAM, 36GB x 2 U160 SCSI (mirror)
  – Gigabit ethernet ports x 2
• SCSI attached storage
  – Dell PV220S
  – 73GB x 10 (RAID5)
Hardware Configuration (cont.)
• Compute nodes
  – Dell PE2650, P4-Xeon 2.8GHz x 2
  – 2GB RAM, 36GB U160 SCSI HD
  – Gigabit ethernet ports x 2
• Gigabit interconnect
  – Extreme BlackDiamond 6816 Gigabit ethernet
  – 256Gb backplane
  – 72 Gigabit ports (8-port cards x 9)
Software installed
• Cluster operating system
  – ROCKS 2.3.2 from www.rocksclusters.org
• MPI and PVM libraries (see the compile-and-run sketch after this list)
  – LAM/MPI 6.5.9, MPICH 1.2.5, PVM 3.4.3-6beolin
• Compilers
  – GCC 2.96, GCC 3.2.3
  – PGI C/C++/F77/F90/HPF version 4.0
• Math libraries
  – ATLAS 3.4.1, ScaLAPACK, SPRNG 2.0a
• Application software
  – MATLAB 6.1 with MPITB
  – Gromacs 3.1.4, NAMD 2.5b1, GAMESS
• Editors
  – vi, pico, emacs, joe
• Queuing system
  – OpenPBS 2.3.16, Maui scheduler
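As a quick illustration of how the MPI libraries and compilers above fit together, here is a minimal compile-and-run sketch assuming MPICH's wrapper scripts are on the PATH; the source file hello.c, the process count, and the machine file are hypothetical and may differ on this cluster.

    # Compile an MPI source file with MPICH's compiler wrapper
    # (hello.c is a hypothetical example program)
    mpicc -o hello hello.c

    # Launch 4 MPI processes; "machines" is a hypothetical file
    # listing the compute nodes to run on
    mpirun -np 4 -machinefile machines ./hello

With LAM/MPI the same program would instead be started with lamboot followed by mpirun; consult the MPI documentation resources mentioned in the outline for the exact invocation.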
Cluster O.S. – ROCKS 2.3.2
• Developed by NPACI and SDSC
• Based on RedHat 7.3
• Allows setup of all 64 nodes in 1 hour
• Useful commands for users to monitor jobs on all nodes, e.g. (annotated in the sketch after this list):
  – cluster-fork date
  – cluster-ps morris
  – cluster-kill morris
• Web-based management and monitoring
  – http://tdgrocks.sci.hkbu.edu.hk
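A short annotated sketch of the three commands above, assuming the usual ROCKS semantics ("morris" is the example username from the slide):

    # Run a command on every compute node (here: print the date)
    cluster-fork date

    # List processes owned by user morris across all nodes
    cluster-ps morris

    # Kill user morris's processes on all nodes
    cluster-kill morris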
Ganglia
[Screenshot: Ganglia web frontend cluster report for the Teaching Development Grant Cluster, Sat 7 Jun 2003: 49 nodes (98 CPUs) up and running, no nodes down; CPU, memory and load graphs for the last hour.]
PBS Job queue
[Screenshot: PBS parallel job queue web page for the Teaching Development Grant Cluster, Sat 7 Jun 2003, listing running jobs by ID, user, processors, state, name and runtime.]
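Jobs reach this queue through the OpenPBS queuing system listed earlier. A minimal submission sketch follows; the script name, job name, resource request and program are all hypothetical, and the cluster's actual queue policy may require different values.

    #!/bin/sh
    # myjob.pbs: hypothetical OpenPBS submission script
    #PBS -N myjob
    #PBS -l nodes=4:ppn=2

    # Start in the directory the job was submitted from
    cd $PBS_O_WORKDIR

    # Run on the CPUs PBS allocated, listed in $PBS_NODEFILE
    mpirun -np 8 -machinefile $PBS_NODEFILE ./hello

The script would be submitted with "qsub myjob.pbs" and monitored with "qstat".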