Difference between revisions of "Grenoble:Hardware"


Revision as of 09:49, 28 May 2018


Summary

3 clusters, 103 nodes, 1048 cores, 5.3 TFLOPS

Cluster | Queue   | Date of arrival | Nodes | CPU                       | Cores        | Memory | Storage                        | Network
edel    | default | 2008-10-03      | 68    | 2 x Intel Xeon E5520      | 4 cores/CPU  | 24 GB  | [1,5-6,8-9,12,15-16,19,21,23-25,28-29,32,35,37,39-43,46,48-50,52,55-57,59,62-63,65-72]: 119 GB SSD; [2-4,7,10-11,13-14,17-18,20,22,26,30-31,33-34,36,38,44,47,51,53,58,60,64]: 59 GB SSD | 1 Gbps + 40 Gbps InfiniBand
genepi  | default | 2008-10-01      | 31    | 2 x Intel Xeon E5420      | 4 cores/CPU  | 8 GB   | 153 GB HDD                     | 1 Gbps + 20 Gbps InfiniBand
yeti    | testing | 2018-01-16      | 4     | 4 x Intel Xeon Gold 6130  | 16 cores/CPU | 768 GB | 446 GB SSD + 3 x 1.819 TB HDD  | 10 Gbps
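The site-wide totals above follow directly from the per-cluster rows; a quick cross-check (cluster figures copied from the table, nothing else assumed):

```python
# Per-cluster node and core counts, as listed in the summary table.
clusters = {
    "edel":   {"nodes": 68, "cores": 544},
    "genepi": {"nodes": 31, "cores": 248},
    "yeti":   {"nodes": 4,  "cores": 256},
}

total_nodes = sum(c["nodes"] for c in clusters.values())
total_cores = sum(c["cores"] for c in clusters.values())
print(total_nodes, total_cores)  # 103 nodes, 1048 cores
```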

Cluster details

edel

68 nodes, 136 cpus, 544 cores, split as follows due to differences between nodes (json)
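The sub-groups below can be reproduced from the per-node JSON linked above by bucketing nodes on the attributes that differ between them. A minimal sketch of that grouping; the inline sample entries are invented stand-ins for the `nodes.json` `items` array, and the `storage_devices`/`size` field names are assumptions about the API schema:

```python
from collections import defaultdict

# Illustrative stand-in for the nodes.json "items" array; real entries
# carry many more fields, and these sample values are invented.
nodes = [
    {"uid": "edel-1", "storage_devices": [{"size": 128035676160}]},
    {"uid": "edel-2", "storage_devices": [{"size": 64023257088}]},
    {"uid": "edel-3", "storage_devices": [{"size": 64023257088}]},
]

# Group node uids by their storage profile, mirroring how this page
# splits the edel cluster into homogeneous sub-groups.
groups = defaultdict(list)
for node in nodes:
    profile = tuple(d["size"] for d in node["storage_devices"])
    groups[profile].append(node["uid"])

for profile, uids in sorted(groups.items()):
    print(profile, uids)
```

In practice the same bucketing would also key on NIC models and disk paths, since those are the fields that differ between the groups listed below.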

edel-[1,5-6,16,37,41-42] (7 nodes, 14 cpus, 56 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 119 GB SSD SATA C400-MTFDDAA128M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-[2] (1 node, 2 cpus, 8 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C400-MTFDDAA064M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-[3,7,10-11,13-14,22,26,30,36,38,44,47,51,53,58,60,64] (18 nodes, 36 cpus, 144 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C400-MTFDDAA064M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-[8-9,12,15,19,21,23-24,28-29,32,39-40,43,48-49,52,55-57,59,62,65-71] (29 nodes, 58 cpus, 232 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 119 GB SSD SATA C400-MTFDDAA128M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-[17,34] (2 nodes, 4 cpus, 16 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C300-MTFDDAA064M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-[4,18,31,33] (4 nodes, 8 cpus, 32 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C400-MTFDDAA064M (driver: ahci, path: node-was-not-available-to-retrieve-this-value)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-[20] (1 node, 2 cpus, 8 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C400-MTFDDAA064M (driver: ahci, path: /dev/disk/by-id/wwn-0x500a07510202c04b)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-[25,35,46,50,63,72] (6 nodes, 12 cpus, 48 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 119 GB SSD SATA C400-MTFDDAA128M (driver: ahci, path: node-was-not-available-to-retrieve-this-value)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

genepi

31 nodes, 62 cpus, 248 cores (json)

Model: Bull R422-E1
Date of arrival: 2008-10-01
CPU: Intel Xeon E5420 Harpertown 2.50GHz (2 CPUs/node, 4 cores/CPU)
Memory: 8 GB
Storage: 153 GB HDD SATA WDC WD1600YS-01S (driver: ata_piix, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp5s0f0, Ethernet, model: Intel 80003ES2LAN Gigabit Ethernet Controller (Copper), driver: e1000e - unavailable for experiment
  • eth1/enp5s0f1, Ethernet, configured rate: 1 Gbps, model: Intel 80003ES2LAN Gigabit Ethernet Controller (Copper), driver: e1000e
  • ib0, InfiniBand, configured rate: 20 Gbps, model: Mellanox Technologies MT26418 [ConnectX VPI PCIe 2.0 5GT/s - IB DDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26418 [ConnectX VPI PCIe 2.0 5GT/s - IB DDR / 10GigE], driver: mlx4_core - unavailable for experiment

yeti (testing queue)

4 nodes, 16 cpus, 256 cores (json)

Model: Dell PowerEdge R940
Date of arrival: 2018-01-16
CPU: Intel Xeon Gold 6130 Skylake 2.10GHz (4 CPUs/node, 16 cores/CPU)
Memory: 768 GB
Storage:
  • 446 GB SSD SAS PERC H740P Adp (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:2:0:0)
  • 1.819 TB HDD SAS PERC H740P Adp (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:2:1:0)
  • 1.819 TB HDD SAS PERC H740P Adp (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:2:2:0)
  • 1.819 TB HDD SAS PERC H740P Adp (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:2:3:0)
Network:
  • eth0/eno113, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/eno114, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth2/eno115, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth3/eno116, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
Last generated from the Grid'5000 Reference API on 2018-05-28 (commit 7464e98fc)