New image cluster setup

From ImageWiki


File servers

Storage is split between two storage servers (nfs1 and nfs2), each with two RAID 5 arrays of 12 x 1 TB disks. RAID 5 dedicates one disk's worth of capacity per array to parity, so this gives four arrays of roughly 11 TB usable capacity each.
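The array sizes follow directly from the RAID 5 parity overhead; a small sketch of the arithmetic (the function name is made up for illustration):

```python
def raid5_usable_tb(disks: int, disk_tb: float = 1.0) -> float:
    # RAID 5 stores one disk's worth of parity spread across the array,
    # so usable capacity is (n - 1) disks.
    return (disks - 1) * disk_tb

# One array: 12 x 1 TB disks -> 11 TB usable, matching the mounts below.
print(raid5_usable_tb(12))
# Four arrays across the two servers:
print(4 * raid5_usable_tb(12))
```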

They are currently mounted on each server as

/home (11 TB on nfs2 /dev/mapper/vg0-user_1)
/image/data1 (11 TB on nfs2 /dev/mapper/vg1-user_2)
/image/data2 (11 TB on nfs1)
/image/data3 (11 TB on nfs1)
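Assuming the data directories are exported over NFS under the same paths (the mount options below are illustrative defaults, not taken from the actual server configuration), a compute server could mount them with /etc/fstab entries along these lines:

```
# /etc/fstab sketch on a compute server (hypothetical export paths and options)
nfs2-diku-image:/home         /home         nfs  rw,hard  0 0
nfs2-diku-image:/image/data1  /image/data1  nfs  rw,hard  0 0
nfs1-diku-image:/image/data2  /image/data2  nfs  rw,hard  0 0
nfs1-diku-image:/image/data3  /image/data3  nfs  rw,hard  0 0
```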

Compute servers

Network

The DNS configuration currently looks like this:

a00283.science.ku.dk. IN A 192.38.118.63
a00284.science.ku.dk. IN A 130.226.12.98
a00285.science.ku.dk. IN A 130.226.12.99
a00286.science.ku.dk. IN A 130.226.12.100
a00287.science.ku.dk. IN A 130.226.12.101
a00288.science.ku.dk. IN A 130.226.12.102
a00289.science.ku.dk. IN A 130.226.12.103
ssh-diku-image.science.ku.dk.       IN CNAME a00283.science.ku.dk.
nfs1-diku-image.science.ku.dk.      IN CNAME a00284.science.ku.dk.
nfs2-diku-image.science.ku.dk.      IN CNAME a00285.science.ku.dk.
compute01-diku-image.science.ku.dk. IN CNAME a00286.science.ku.dk.
compute02-diku-image.science.ku.dk. IN CNAME a00287.science.ku.dk.
compute03-diku-image.science.ku.dk. IN CNAME a00288.science.ku.dk.
compute04-diku-image.science.ku.dk. IN CNAME a00289.science.ku.dk.
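The CNAMEs give each machine a role-based name on top of its inventory name. As a quick sanity check of the alias-to-IP mapping, the records above can be replayed offline (the data is copied verbatim from the zone snippet; no live DNS lookup is involved):

```python
# A records and CNAMEs copied from the zone snippet above.
a_records = {
    "a00283": "192.38.118.63",
    "a00284": "130.226.12.98",
    "a00285": "130.226.12.99",
    "a00286": "130.226.12.100",
    "a00287": "130.226.12.101",
    "a00288": "130.226.12.102",
    "a00289": "130.226.12.103",
}
cnames = {
    "ssh-diku-image": "a00283",
    "nfs1-diku-image": "a00284",
    "nfs2-diku-image": "a00285",
    "compute01-diku-image": "a00286",
    "compute02-diku-image": "a00287",
    "compute03-diku-image": "a00288",
    "compute04-diku-image": "a00289",
}

# Print the effective alias -> IP mapping.
for alias, host in cnames.items():
    print(f"{alias}.science.ku.dk -> {a_records[host]}")
```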

Access to the enclosure

To get access to the enclosure's OA/iLO management software, first create an SSH tunnel:

ssh -L15443:a00279.science.domain:443 SCIENCE\\<ku-login>@ssh-diku-image.science.ku.dk

Then connect via a browser to

https://localhost:15443/

If you have access rights, log in with your KU login.
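If you do this often, an entry in ~/.ssh/config saves retyping the forward; the host alias `image-oa` below is made up for illustration:

```
# ~/.ssh/config (the alias name "image-oa" is hypothetical)
Host image-oa
    HostName ssh-diku-image.science.ku.dk
    User SCIENCE\<ku-login>
    LocalForward 15443 a00279.science.domain:443
```

After `ssh image-oa`, the management interface is reachable at https://localhost:15443/ as above.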

Installing software using the package manager

openSUSE and SLES use the zypper package manager.

Here are a couple of useful commands:

 sudo zypper refresh                  # refresh repository metadata
 sudo zypper search <package name>    # search for a package
 sudo zypper install <package name>   # install a package

For more information, see the manual (man zypper).

Installation notes
