Summary
2 clusters, 89 nodes, 888 cores, 7.1 TFLOPS
| Cluster | Queue | Date of arrival | Nodes | CPU | Cores | Memory | Storage | Network |
|---------|-------|-----------------|-------|-----|-------|--------|---------|---------|
| suno | default | 2010-01-27 | 45 | 2 x Intel Xeon E5520 | 4 cores/CPU | 32 GB | [1-27,29-45]: 557 GB HDD; [28]: 278 GB HDD | 1 Gbps |
| uvb | default | 2011-01-04 | 44 | 2 x Intel Xeon X5670 | 6 cores/CPU | 96 GB | 232 GB HDD | 1 Gbps + 2 x 40 Gbps InfiniBand |
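The aggregate figures in the summary (89 nodes, 888 cores) can be cross-checked against the per-cluster rows; a minimal sketch in Python, with the cluster data transcribed from the table above:

```python
# Per-cluster figures transcribed from the summary table above.
clusters = {
    "suno": {"nodes": 45, "cpus_per_node": 2, "cores_per_cpu": 4},
    "uvb":  {"nodes": 44, "cpus_per_node": 2, "cores_per_cpu": 6},
}

total_nodes = sum(c["nodes"] for c in clusters.values())
total_cores = sum(
    c["nodes"] * c["cpus_per_node"] * c["cores_per_cpu"] for c in clusters.values()
)

print(total_nodes, total_cores)  # 89 888
```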
Cluster details
suno
45 nodes, 90 CPUs, 360 cores, split as follows due to differences between nodes (json)
- suno-[1-2,8-9,11,14-15,20,22-25,27,30-37,42-44] (24 nodes, 48 CPUs, 192 cores)
  Model: Dell PowerEdge R410
  Date of arrival: 2010-01-27
  CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
  Memory: 32 GB
  Storage: 557 GB HDD SAS PERC 6/i Adapter (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:03:00.0-scsi-0:2:0:0)
  Network:
    - eth0/eno1, Ethernet, configured rate: 1 Gbps, model: Broadcom NetXtreme II BCM5716 Gigabit Ethernet, driver: bnx2
    - eth1/eno2, Ethernet, model: Broadcom NetXtreme II BCM5716 Gigabit Ethernet, driver: bnx2 - unavailable for experiment
- suno-[3,5-6,10,16-17,21,38-41,45] (12 nodes, 24 CPUs, 96 cores)
  Model: Dell PowerEdge R410
  Date of arrival: 2010-01-27
  CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
  Memory: 32 GB
  Storage: 557 GB HDD SAS PERC 6/i Adapter (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:03:00.0-scsi-0:2:0:0)
  Network:
    - eth0, Ethernet, configured rate: 1 Gbps, model: N/A, driver: bnx2
    - eth1, Ethernet, driver: bnx2 - unavailable for experiment
- suno-[4,7,12-13,18-19,26,29] (8 nodes, 16 CPUs, 64 cores)
  Model: Dell PowerEdge R410
  Date of arrival: 2010-01-27
  CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
  Memory: 32 GB
  Storage: 557 GB HDD SAS PERC 6/i Adapter (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:03:00.0-scsi-0:2:0:0)
  Network:
    - eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Broadcom NetXtreme II BCM5716 Gigabit Ethernet, driver: bnx2
    - eth1/enp1s0f1, Ethernet, model: Broadcom NetXtreme II BCM5716 Gigabit Ethernet, driver: bnx2 - unavailable for experiment
- suno-[28] (1 node, 2 CPUs, 8 cores)
  Model: Dell PowerEdge R410
  Date of arrival: 2010-01-27
  CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
  Memory: 32 GB
  Storage: 278 GB HDD SAS PERC 6/i Adapter (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:03:00.0-scsi-0:2:0:0)
  Network:
    - eth0/eno1, Ethernet, configured rate: 1 Gbps, model: Broadcom NetXtreme II BCM5716 Gigabit Ethernet, driver: bnx2
    - eth1/eno2, Ethernet, model: Broadcom NetXtreme II BCM5716 Gigabit Ethernet, driver: bnx2 - unavailable for experiment
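The bracketed host lists above (e.g. suno-[1-2,8-9,...]) are a compact range notation for sets of hostnames. A minimal sketch of expanding such a list, assuming only the simple prefix-[ranges] form used on this page (tools like ClusterShell's nodeset handle the general case):

```python
import re

def expand_nodeset(nodeset: str) -> list:
    """Expand a 'prefix-[a-b,c,...]' host list into individual hostnames."""
    m = re.fullmatch(r"(.+)-\[([\d,-]+)\]", nodeset)
    if m is None:  # a plain hostname with no bracketed ranges
        return [nodeset]
    prefix, ranges = m.groups()
    hosts = []
    for part in ranges.split(","):
        if "-" in part:  # a numeric range like "30-37"
            lo, hi = map(int, part.split("-"))
            hosts.extend(f"{prefix}-{i}" for i in range(lo, hi + 1))
        else:  # a single index like "27"
            hosts.append(f"{prefix}-{int(part)}")
    return hosts

# The first suno group expands to the 24 nodes its header claims.
print(len(expand_nodeset("suno-[1-2,8-9,11,14-15,20,22-25,27,30-37,42-44]")))  # 24
```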
uvb
44 nodes, 88 CPUs, 528 cores, split as follows due to differences between nodes (json)
- uvb-[1-2,4-6,8-29,31,33-44] (40 nodes, 80 CPUs, 480 cores)
  Model: Dell PowerEdge C6100
  Date of arrival: 2011-01-04
  CPU: Intel Xeon X5670 (Westmere, 2.93GHz, 2 CPUs/node, 6 cores/CPU)
  Memory: 96 GB
  Storage: 232 GB HDD SATA WDC WD2502ABYS-1 (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
  Network:
    - eth0/eno1, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
    - eth1/eno2, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
    - ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
    - ib0.8100, InfiniBand, configured rate: 40 Gbps, model: N/A, driver: mlx4_core
    - ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment
- uvb-[3,7] (2 nodes, 4 CPUs, 24 cores)
  Model: Dell PowerEdge C6100
  Date of arrival: 2011-01-04
  CPU: Intel Xeon X5670 (Westmere, 2.93GHz, 2 CPUs/node, 6 cores/CPU)
  Memory: 96 GB
  Storage: 232 GB HDD SATA WDC WD2502ABYS-1 (driver: ahci, path: node-was-not-available-to-retrieve-this-value)
  Network:
    - eth0/eno1, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
    - eth1/eno2, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
    - ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
    - ib0.8100, InfiniBand, configured rate: 40 Gbps, model: N/A, driver: mlx4_core
    - ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment
- uvb-[30] (1 node, 2 CPUs, 12 cores)
  Model: Dell PowerEdge C6100
  Date of arrival: 2011-01-04
  CPU: Intel Xeon X5670 (Westmere, 2.93GHz, 2 CPUs/node, 6 cores/CPU)
  Memory: 96 GB
  Storage: 232 GB HDD SATA WDC WD2502ABYS-1 (driver: ahci, path: /dev/disk/by-id/wwn-0x50014ee158d2867d)
  Network:
    - eth0/eno1, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
    - eth1/eno2, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
    - ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
    - ib0.8100, InfiniBand, configured rate: 40 Gbps, model: N/A, driver: mlx4_core
    - ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment
- uvb-[32] (1 node, 2 CPUs, 12 cores)
  Model: Dell PowerEdge C6100
  Date of arrival: 2011-01-04
  CPU: Intel Xeon X5670 (Westmere, 2.93GHz, 2 CPUs/node, 6 cores/CPU)
  Memory: 96 GB
  Storage: 232 GB HDD SATA WDC WD2502ABYS-1 (driver: ahci, path: /dev/disk/by-id/wwn-0x50014ee1037db989)
  Network:
    - eth0/eno1, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
    - eth1/eno2, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
    - ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
    - ib0.8100, InfiniBand, configured rate: 40 Gbps, model: N/A, driver: mlx4_core
    - ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment
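The "split as follows due to differences between nodes" groupings come from comparing node descriptions field by field: nodes whose descriptions are identical collapse into a single entry, and any differing field (disk size, device path, interface naming) forces a separate group. A minimal sketch of that grouping logic, using an illustrative, simplified description format (the field names and sample values here are assumptions, not the actual Reference API schema):

```python
from collections import defaultdict

# Illustrative node descriptions; field names and values are examples only.
nodes = {
    "n1": {"model": "Dell PowerEdge C6100", "memory_gb": 96, "disk": "232 GB HDD"},
    "n2": {"model": "Dell PowerEdge C6100", "memory_gb": 96, "disk": "250 GB HDD"},
    "n3": {"model": "Dell PowerEdge C6100", "memory_gb": 96, "disk": "232 GB HDD"},
}

# Nodes with identical descriptions share a signature and land in one group.
groups = defaultdict(list)
for name, desc in nodes.items():
    signature = tuple(sorted(desc.items()))
    groups[signature].append(name)

for members in groups.values():
    print(sorted(members))  # n1 and n3 group together; n2 stands alone
```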
Last generated from the Grid'5000 Reference API on 2018-06-20 (commit 0c616d25b)