Difference between revisions of "Grenoble:Hardware"


Revision as of 16:12, 6 July 2018


Summary

4 clusters, 175 nodes, 3352 cores, 5.3 TFLOPS

dahu (testing queue)
  Date of arrival: 2018-03-22; 72 nodes; 2 x Intel Xeon Gold 6130, 16 cores/CPU; 192 GB memory
  Storage: [1-68]: 223 GB HDD + 447 GB HDD + 3.639 TB HDD; [69-72]: 2 x 447 GB HDD + 3.639 TB HDD
  Network: [1-10,12-72]: 10 Gbps + 100 Gbps Omni-Path; 11: 10 Gbps

edel (default queue)
  Date of arrival: 2008-10-03; 68 nodes; 2 x Intel Xeon E5520, 4 cores/CPU; 24 GB memory
  Storage: [1,5-6,8-9,12,15-16,19,21,23-25,28-29,32,35,37,39-43,46,48-50,52,55-57,59,62-63,65-72]: 119 GB SSD; [2-4,7,10-11,13-14,17-18,20,22,26,30-31,33-34,36,38,44,47,51,53,58,60,64]: 59 GB SSD
  Network: 1 Gbps + 40 Gbps InfiniBand

genepi (default queue)
  Date of arrival: 2008-10-01; 31 nodes; 2 x Intel Xeon E5420, 4 cores/CPU; 8 GB memory
  Storage: 153 GB HDD
  Network: 1 Gbps + 20 Gbps InfiniBand

yeti (testing queue)
  Date of arrival: 2018-01-16; 4 nodes; 4 x Intel Xeon Gold 6130, 16 cores/CPU; 768 GB memory
  Storage: 446 GB SSD + 3 x 1.819 TB HDD
  Network: 10 Gbps + 100 Gbps Omni-Path
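
The headline totals can be cross-checked against the per-cluster figures; a minimal Python sketch, with the cluster data transcribed from the summary above:

```python
# Per-cluster (nodes, CPUs per node, cores per CPU), transcribed from the summary.
clusters = {
    "dahu":   (72, 2, 16),
    "edel":   (68, 2, 4),
    "genepi": (31, 2, 4),
    "yeti":   (4,  4, 16),
}

nodes = sum(n for n, _, _ in clusters.values())
cores = sum(n * cpus * cores_per_cpu for n, cpus, cores_per_cpu in clusters.values())

print(len(clusters), "clusters,", nodes, "nodes,", cores, "cores")
# → 4 clusters, 175 nodes, 3352 cores (matching the summary line)
```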

Cluster details

dahu (testing queue)

72 nodes, 144 cpus, 2304 cores, split as follows due to differences between nodes (json: https://public-api.grid5000.fr/stable/sites/grenoble/clusters/dahu/nodes.json)
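
Node sets below are written with bracketed range expressions such as dahu-[1-10,12-27,29-56,61-68]. A small illustrative parser (this is a sketch, not a Grid'5000 tool) shows how such an expression expands to individual hostnames:

```python
def expand(prefix, ranges):
    """Expand a bracketed range list like "1-10,12-27" into hostnames."""
    names = []
    for part in ranges.split(","):
        if "-" in part:
            lo, hi = part.split("-")
            names += [f"{prefix}-{i}" for i in range(int(lo), int(hi) + 1)]
        else:
            names.append(f"{prefix}-{int(part)}")
    return names

nodes = expand("dahu", "1-10,12-27,29-56,61-68")
print(len(nodes))  # → 62, the node count given for this dahu group
```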

dahu-[1-10,12-27,29-56,61-68] (62 nodes, 124 cpus, 1984 cores)
Model: Dell PowerEdge C6420
Date of arrival: 2018-03-22
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 192 GB
Storage:
  • 223 GB HDD SATA MZ7KM240HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-3)
  • 447 GB HDD SATA MZ7KM480HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-4)
  • 3.639 TB HDD SATA ST4000NM0265-2DC (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-5)
Network:
  • eth0/enp24s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/enp24s0f1, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1
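
The odd-looking sizes (223 GB, 447 GB, 3.639 TB) are the drives' decimal vendor capacities re-expressed in binary units (GiB/TiB, labeled GB/TB on this page). A minimal sketch, assuming typical IDEMA byte counts for these drive classes (the exact byte counts are an assumption, not taken from the page):

```python
# Assumed vendor capacities in bytes (standard IDEMA sizes for 240 GB, 480 GB and 4 TB drives).
CAPACITIES = {
    "240 GB": 240_057_409_536,
    "480 GB": 480_103_981_056,
    "4 TB":   4_000_787_030_016,
}

GIB = 2**30  # binary gigabyte
TIB = 2**40  # binary terabyte

print(int(CAPACITIES["240 GB"] / GIB))   # → 223, as reported for the first disk
print(int(CAPACITIES["480 GB"] / GIB))   # → 447
print(round(CAPACITIES["4 TB"] / TIB, 3))  # → 3.639
```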

dahu-11 (1 node, 2 cpus, 32 cores)
Model: Dell PowerEdge C6420
Date of arrival: 2018-03-22
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 192 GB
Storage:
  • 223 GB HDD SATA MZ7KM240HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-3)
  • 447 GB HDD SATA MZ7KM480HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-4)
  • 3.639 TB HDD SATA ST4000NM0265-2DC (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-5)
Network:
  • eth0/enp24s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/enp24s0f1, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment

dahu-28 (1 node, 2 cpus, 32 cores)
Model: Dell PowerEdge C6420
Date of arrival: 2018-03-22
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 192 GB
Storage:
  • 223 GB HDD SATA MZ7KM240HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-3)
  • 447 GB HDD SATA MZ7KM480HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-4)
  • 3.639 TB HDD SATA TOSHIBA MG04ACA4 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-5)
Network:
  • eth0/enp24s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/enp24s0f1, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1

dahu-[57-60] (4 nodes, 8 cpus, 128 cores)
Model: Dell PowerEdge C6420
Date of arrival: 2018-03-22
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 192 GB
Storage:
  • 223 GB HDD SATA MZ7KM240HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-3)
  • 447 GB HDD SATA SSDSC2KG480G7R (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-4)
  • 3.639 TB HDD SATA TOSHIBA MG04ACA4 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-5)
Network:
  • eth0/enp24s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/enp24s0f1, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1

dahu-[69-72] (4 nodes, 8 cpus, 128 cores)
Model: Dell PowerEdge C6420
Date of arrival: 2018-03-22
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 192 GB
Storage:
  • 447 GB HDD SATA MZ7KM480HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-3)
  • 447 GB HDD SATA MZ7KM480HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-4)
  • 3.639 TB HDD SATA ST4000NM0265-2DC (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-5)
Network:
  • eth0/enp24s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/enp24s0f1, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1

edel

68 nodes, 136 cpus, 544 cores, split as follows due to differences between nodes (json)

edel-[1,5-6,16,37,41-42] (7 nodes, 14 cpus, 56 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 119 GB SSD SATA C400-MTFDDAA128M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand - unavailable for experiment

edel-2 (1 node, 2 cpus, 8 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C400-MTFDDAA064M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand - unavailable for experiment

edel-[3,7,10-11,13-14,22,26,30,36,38,44,47,51,53,58,60,64] (18 nodes, 36 cpus, 144 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C400-MTFDDAA064M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-[8-9,12,15,19,21,23-24,28-29,32,39-40,43,48-49,52,55-57,59,62,65-71] (29 nodes, 58 cpus, 232 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 119 GB SSD SATA C400-MTFDDAA128M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-[17,34] (2 nodes, 4 cpus, 16 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C300-MTFDDAA064M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-[4,18,31,33] (4 nodes, 8 cpus, 32 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C400-MTFDDAA064M (driver: ahci, path: node-was-not-available-to-retrieve-this-value)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-20 (1 node, 2 cpus, 8 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C400-MTFDDAA064M (driver: ahci, path: /dev/disk/by-id/wwn-0x500a07510202c04b)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

edel-[25,35,46,50,63,72] (6 nodes, 12 cpus, 48 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 119 GB SSD SATA C400-MTFDDAA128M (driver: ahci, path: node-was-not-available-to-retrieve-this-value)
Network:
  • eth0/enp1s0f0, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/enp1s0f1, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment

genepi

31 nodes, 62 cpus, 248 cores (json)

Model: Bull R422-E1
Date of arrival: 2008-10-01
CPU: Intel Xeon E5420 (Harpertown, 2.50GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 8 GB
Storage: 153 GB HDD SATA WDC WD1600YS-01S (driver: ata_piix, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp5s0f0, Ethernet, model: Intel 80003ES2LAN Gigabit Ethernet Controller (Copper), driver: e1000e - unavailable for experiment
  • eth1/enp5s0f1, Ethernet, configured rate: 1 Gbps, model: Intel 80003ES2LAN Gigabit Ethernet Controller (Copper), driver: e1000e
  • ib0, InfiniBand, configured rate: 20 Gbps, model: Mellanox Technologies MT26418 [ConnectX VPI PCIe 2.0 5GT/s - IB DDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26418 [ConnectX VPI PCIe 2.0 5GT/s - IB DDR / 10GigE], driver: mlx4_core - unavailable for experiment

yeti (testing queue)

4 nodes, 16 cpus, 256 cores (json)

Model: Dell PowerEdge R940
Date of arrival: 2018-01-16
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 4 CPUs/node, 16 cores/CPU)
Memory: 768 GB
Storage:
  • 446 GB SSD SAS PERC H740P Adp (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:2:0:0)
  • 1.819 TB HDD SAS PERC H740P Adp (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:2:1:0)
  • 1.819 TB HDD SAS PERC H740P Adp (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:2:2:0)
  • 1.819 TB HDD SAS PERC H740P Adp (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:2:3:0)
Network:
  • eth0/eno113, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/eno114, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth2/eno115, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth3/eno116, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1
Last generated from the Grid'5000 Reference API on 2018-07-06 (commit 47769a550, https://github.com/grid5000/reference-repository/commit/47769a550)