Grenoble:Hardware

Revision as of 14:51, 19 June 2018


Summary

4 clusters, 175 nodes, 3352 cores, 5.3 TFLOPS

Cluster | Queue   | Date of arrival | Nodes | CPU                      | Cores        | Memory | Storage | Network
dahu    | testing | 2018-03-22      | 72    | 2 x Intel Xeon Gold 6130 | 16 cores/CPU | 192 GB | [1-68]: 223 GB HDD + 447 GB HDD + 3.639 TB HDD; [69-72]: 2 x 447 GB HDD + 3.639 TB HDD | 10 Gbps
edel    | default | 2008-10-03      | 68    | 2 x Intel Xeon E5520     | 4 cores/CPU  | 24 GB  | [1,5-6,8-9,12,15-16,19,21,23-25,28-29,32,35,37,39-43,46,48-50,52,55-57,59,62-63,65-72]: 119 GB SSD; [2-4,7,10-11,13-14,17-18,20,22,26,30-31,33-34,36,38,44,47,51,53,58,60,64]: 59 GB SSD | 1 Gbps + 40 Gbps InfiniBand
genepi  | default | 2008-10-01      | 31    | 2 x Intel Xeon E5420     | 4 cores/CPU  | 8 GB   | 153 GB HDD | 1 Gbps + 20 Gbps InfiniBand
yeti    | testing | 2018-01-16      | 4     | 4 x Intel Xeon Gold 6130 | 16 cores/CPU | 768 GB | 446 GB SSD + 3 x 1.819 TB HDD | 10 Gbps

Cluster details

dahu (testing queue)

72 nodes, 144 cpus, 2304 cores, split as follows due to differences between nodes (json: https://public-api.grid5000.fr/stable/sites/grenoble/clusters/dahu/nodes.json?pretty=1)
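
These counts can be re-derived from the Reference API itself. Below is a minimal sketch in Python, assuming the nodes endpoint linked above returns an object with an "items" array and that each node's "architecture" section carries "nb_procs" and "nb_cores" fields (the field names are an assumption about the API's layout, not something stated on this page):

    # Sketch: tally nodes, cpus and cores for the dahu cluster from the
    # Grid'5000 Reference API (URL taken from this page). The "items",
    # "architecture", "nb_procs" and "nb_cores" names are assumptions.
    import json
    import urllib.request

    URL = ("https://public-api.grid5000.fr/stable/sites/grenoble/"
           "clusters/dahu/nodes.json?pretty=1")

    with urllib.request.urlopen(URL) as resp:
        nodes = json.load(resp)["items"]

    cpus = sum(node["architecture"]["nb_procs"] for node in nodes)
    cores = sum(node["architecture"]["nb_cores"] for node in nodes)
    print(f"{len(nodes)} nodes, {cpus} cpus, {cores} cores")
    # Expected to match the heading above: 72 nodes, 144 cpus, 2304 cores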

dahu-[1-9,11-16,18-27,29,31-55,61-65,67-68] (58 nodes, 116 cpus, 1856 cores)
Model: Dell PowerEdge C6420
Date of arrival: 2018-03-22
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 192 GB
Storage:
  • 223 GB HDD SATA MZ7KM240HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-3)
  • 447 GB HDD SATA MZ7KM480HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-4)
  • 3.639 TB HDD SATA ST4000NM0265-2DC (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-5)
Network:
  • eth0/enp24s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/enp24s0f1, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
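
On a reserved node, the stable by-path names listed above can be cross-checked against the kernel's block devices. A minimal sketch; it only resolves the symlinks that /dev/disk/by-path already provides, so it must run on the node itself:

    # Sketch: resolve /dev/disk/by-path symlinks to kernel device names,
    # to cross-check the storage entries above. Run on a reserved node.
    from pathlib import Path

    for link in sorted(Path("/dev/disk/by-path").iterdir()):
        # e.g. pci-0000:00:11.5-ata-3 -> /dev/sda
        print(f"{link.name} -> {link.resolve()}")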

dahu-[10,30,56,66] (4 nodes, 8 cpus, 128 cores)
Model: Dell PowerEdge C6420
Date of arrival: 2018-03-22
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 192 GB
Storage:
  • 223 GB HDD SATA MZ7KM240HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-3)
  • 447 GB HDD SATA MZ7KM480HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-4)
  • 3.639 TB HDD SATA ST4000NM0265-2DC (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-5)
Network:
  • enp24s0f1/enp24s0f1, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e (configured rate not reported)
  • eth0/enp24s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
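
The Reference API reports no configured rate for the second 10GbE port on these nodes; on the node itself, the kernel's view of each link can be read from sysfs. A minimal sketch, assuming the predictable interface names shown above; /sys/class/net/<iface>/speed may be unreadable or -1 while a link is down:

    # Sketch: print the kernel-reported link speed (Mb/s) for the
    # interfaces named in this section. Reading "speed" can raise
    # OSError when the link is down or the driver does not report it.
    from pathlib import Path

    for iface in ("enp24s0f0", "enp24s0f1"):
        attr = Path(f"/sys/class/net/{iface}/speed")
        try:
            mbps = int(attr.read_text())
            print(f"{iface}: {mbps / 1000:g} Gbps")
        except (OSError, ValueError):
            print(f"{iface}: speed not reported")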

dahu-[17] (1 node, 2 cpus, 32 cores)
Model: Dell PowerEdge C6420
Date of arrival: 2018-03-22
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 192 GB
Storage:
  • 223 GB HDD SATA MZ7KM240HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-3)
  • 447 GB HDD SATA MZ7KM480HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-4)
  • 3.639 TB HDD SATA ST4000NM0265-2DC (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-5)
Network:
  • eno16/eno16, Ethernet, configured rate: 1 Gbps, model: Intel I350 Gigabit Network Connection, driver: igb
  • enp24s0f1/enp24s0f1, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e (configured rate not reported)
  • eth0/enp24s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e

dahu-[28] (1 node, 2 cpus, 32 cores)
Model: Dell PowerEdge C6420
Date of arrival: 2018-03-22
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 192 GB
Storage:
  • 223 GB HDD SATA MZ7KM240HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-3)
  • 447 GB HDD SATA MZ7KM480HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-4)
  • 3.639 TB HDD SATA TOSHIBA MG04ACA4 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-5)
Network:
  • eth0/enp24s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/enp24s0f1, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment

dahu-[57-60] (4 nodes, 8 cpus, 128 cores)
Model: Dell PowerEdge C6420
Date of arrival: 2018-03-22
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 192 GB
Storage:
  • 223 GB HDD SATA MZ7KM240HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-3)
  • 447 GB HDD SATA SSDSC2KG480G7R (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-4)
  • 3.639 TB HDD SATA TOSHIBA MG04ACA4 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-5)
Network:
  • eth0/enp24s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/enp24s0f1, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment

dahu-[69-72] (4 nodes, 8 cpus, 128 cores)
Model: Dell PowerEdge C6420
Date of arrival: 2018-03-22
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 192 GB
Storage:
  • 447 GB HDD SATA MZ7KM480HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-3)
  • 447 GB HDD SATA MZ7KM480HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-4)
  • 3.639 TB HDD SATA ST4000NM0265-2DC (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-5)
Network:
  • eth0/enp24s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/enp24s0f1, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment

edel

68 nodes, 136 cpus, 544 cores, split as follows due to differences between nodes (json)

edel-[1,5-6,16,37,41-42] (7 nodes, 14 cpus, 56 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 119 GB SSD SATA C400-MTFDDAA128M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand - unavailable for experiment

edel-[2] (1 node, 2 cpus, 8 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C400-MTFDDAA064M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand - unavailable for experiment

edel-[3,7,10-11,13-14,22,26,30,36,38,44,47,51,53,58,60,64] (18 nodes, 36 cpus, 144 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C400-MTFDDAA064M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-[8-9,12,15,19,21,23-24,28-29,32,39-40,43,48-49,52,55-57,59,62,65-71] (29 nodes, 58 cpus, 232 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 119 GB SSD SATA C400-MTFDDAA128M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-[17,34] (2 nodes, 4 cpus, 16 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C300-MTFDDAA064M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-[4,18,31,33] (4 nodes, 8 cpus, 32 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C400-MTFDDAA064M (driver: ahci, path: node-was-not-available-to-retrieve-this-value)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-[20] (1 node, 2 cpus, 8 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C400-MTFDDAA064M (driver: ahci, path: /dev/disk/by-id/wwn-0x500a07510202c04b)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-[25,35,46,50,63,72] (6 nodes, 12 cpus, 48 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 119 GB SSD SATA C400-MTFDDAA128M (driver: ahci, path: node-was-not-available-to-retrieve-this-value)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

genepi

31 nodes, 62 cpus, 248 cores (json)

Model: Bull R422-E1
Date of arrival: 2008-10-01
CPU: Intel Xeon E5420 (Harpertown, 2.50GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 8 GB
Storage: 153 GB HDD SATA WDC WD1600YS-01S (driver: ata_piix, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp5s0f0, Ethernet, model: Intel 80003ES2LAN Gigabit Ethernet Controller (Copper), driver: e1000e - unavailable for experiment
  • eth1/enp5s0f1, Ethernet, configured rate: 1 Gbps, model: Intel 80003ES2LAN Gigabit Ethernet Controller (Copper), driver: e1000e
  • ib0, InfiniBand, configured rate: 20 Gbps, model: Mellanox Technologies MT26418 [ConnectX VPI PCIe 2.0 5GT/s - IB DDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26418 [ConnectX VPI PCIe 2.0 5GT/s - IB DDR / 10GigE], driver: mlx4_core - unavailable for experiment

yeti (testing queue)

4 nodes, 16 cpus, 256 cores (json)

Model: Dell PowerEdge R940
Date of arrival: 2018-01-16
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 4 CPUs/node, 16 cores/CPU)
Memory: 768 GB
Storage:
  • 446 GB SSD SAS PERC H740P Adp (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:2:0:0)
  • 1.819 TB HDD SAS PERC H740P Adp (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:2:1:0)
  • 1.819 TB HDD SAS PERC H740P Adp (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:2:2:0)
  • 1.819 TB HDD SAS PERC H740P Adp (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:2:3:0)
Network:
  • eth0/eno113, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/eno114, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth2/eno115, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth3/eno116, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment

Last generated from the Grid'5000 Reference API on 2018-06-19 (commit f36d4670a)