Grenoble:Hardware

Revision as of 16:46, 8 October 2020

Summary

3 clusters, 40 nodes, 1408 cores, 90.7 TFLOPS

Cluster | Access Condition | Date of arrival | Nodes | CPU | Cores | Memory | Storage | Network
dahu    | | 2018-03-22 | 32 | 2 x Intel Xeon Gold 6130 | 16 cores/CPU | 192 GiB | 240 GB SSD + 480 GB SSD + 4.0 TB HDD | 10 Gbps + 100 Gbps Omni-Path
troll   | | 2019-12-23 | 4  | 2 x Intel Xeon Gold 5218 | 16 cores/CPU | 384 GiB + 1.5 TiB PMEM | 480 GB SSD + 1.6 TB SSD | 10 Gbps + 100 Gbps Omni-Path
yeti    | | 2018-01-16 | 4  | 4 x Intel Xeon Gold 6130 | 16 cores/CPU | 768 GiB | 480 GB SSD + 3 x 2.0 TB HDD* + 1.6 TB SSD | 10 Gbps + 100 Gbps Omni-Path

*: disk is reservable
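The summary figures can be cross-checked from the per-cluster numbers. A minimal sketch (cluster figures transcribed from the table above, not queried from the Reference API):

```python
# Per-cluster shape, as listed in the summary table.
clusters = {
    "dahu":  {"nodes": 32, "cpus_per_node": 2, "cores_per_cpu": 16},
    "troll": {"nodes": 4,  "cpus_per_node": 2, "cores_per_cpu": 16},
    "yeti":  {"nodes": 4,  "cpus_per_node": 4, "cores_per_cpu": 16},
}

total_nodes = sum(c["nodes"] for c in clusters.values())
total_cores = sum(c["nodes"] * c["cpus_per_node"] * c["cores_per_cpu"]
                  for c in clusters.values())
print(total_nodes, total_cores)  # → 40 1408
```

This matches the "3 clusters, 40 nodes, 1408 cores" headline (the 90.7 TFLOPS figure comes from the Reference API and is not recomputed here).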

Cluster details

dahu

32 nodes, 64 cpus, 1024 cores (json)

Model: Dell PowerEdge C6420
Date of arrival: 2018-03-22
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 192 GiB
Storage:
  • 240 GB SSD SATA Samsung MZ7KM240HMHQ0D3 (path: /dev/disk/by-path/pci-0000:00:11.5-ata-3) (primary disk)
  • 480 GB SSD SATA Samsung MZ7KM480HMHQ0D3 (path: /dev/disk/by-path/pci-0000:00:11.5-ata-4)
  • 4.0 TB HDD SATA Seagate ST4000NM0265-2DC (path: /dev/disk/by-path/pci-0000:00:11.5-ata-5)
Network:
  • eth0/enp24s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/enp24s0f1, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1
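The storage paths listed above are udev "by-path" symlinks, which stay stable across reboots while /dev/sdX names may not. A small sketch (the helper name is ours) that maps by-path entries to the current kernel device names on a node:

```python
import os

def resolve_by_path(by_path_dir="/dev/disk/by-path"):
    """Map stable by-path names to the kernel device they point at.

    Each entry is a symlink maintained by udev; resolving it yields the
    current /dev/sd* or /dev/nvme* name for that physical slot.
    """
    mapping = {}
    if os.path.isdir(by_path_dir):
        for name in sorted(os.listdir(by_path_dir)):
            mapping[name] = os.path.realpath(os.path.join(by_path_dir, name))
    return mapping

if __name__ == "__main__":
    for name, dev in resolve_by_path().items():
        print(f"{name} -> {dev}")
```

On a dahu node this would show, for example, which /dev device currently backs pci-0000:00:11.5-ata-5 (the 4.0 TB HDD).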

troll

4 nodes, 8 cpus, 128 cores (json)

Model: Dell PowerEdge R640
Date of arrival: 2019-12-23
CPU: Intel Xeon Gold 5218 (Cascade Lake-SP, 2.30GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 384 GiB + 1.5 TiB PMEM
Storage:
  • 480 GB SSD SATA Micron MTFDDAK480TDN (path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:0:0) (primary disk)
  • 1.6 TB SSD NVME Dell Express Flash NVMe PM1725 1.6TB AIC (path: /dev/disk/by-path/pci-0000:d8:00.0-nvme-1)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Mellanox Technologies MT27710 Family [ConnectX-4 Lx], driver: mlx5_core
  • eth1/eno2, Ethernet, model: Mellanox Technologies MT27710 Family [ConnectX-4 Lx], driver: mlx5_core - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1
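Note the unit mix on this page: memory is quoted in binary units (GiB/TiB) while disk capacities use decimal vendor units (GB/TB). A quick conversion of troll's figures into a common binary unit:

```python
GB, TB = 10**9, 10**12      # decimal units, used for disks
GiB, TiB = 2**30, 2**40     # binary units, used for memory

# troll: 384 GiB DRAM + 1.5 TiB PMEM; 480 GB SSD + 1.6 TB NVMe SSD
memory_gib = 384 + 1.5 * TiB / GiB
storage_gib = (480 * GB + 1.6 * TB) / GiB
print(round(memory_gib), round(storage_gib))  # → 1920 1937
```

So a troll node exposes roughly 1920 GiB of byte-addressable memory (DRAM plus PMEM) and about 1937 GiB of local disk.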

yeti

4 nodes, 16 cpus, 256 cores, split as follows due to differences between nodes (json)

yeti-[1-2,4] (3 nodes, 12 cpus, 192 cores)
Model: Dell PowerEdge R940
Date of arrival: 2018-01-16
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 4 CPUs/node, 16 cores/CPU)
Memory: 768 GiB
Storage:
  • 480 GB SSD SAS Intel SSDSC2KG480G7R (path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:0:0) (primary disk)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:1:0) (reservable)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:2:0) (reservable)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:3:0) (reservable)
  • 1.6 TB SSD NVME Dell Express Flash NVMe PM1725 1.6TB AIC (path: /dev/disk/by-path/pci-0000:6d:00.0-nvme-1)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/eno2, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth2/eno3, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth3/eno4, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1

yeti-3 (1 node, 4 cpus, 64 cores)
Model: Dell PowerEdge R940
Date of arrival: 2018-01-16
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 4 CPUs/node, 16 cores/CPU)
Memory: 768 GiB
Storage:
  • 480 GB SSD SAS Intel SSDSC2KG480G8R (path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:0:0) (primary disk)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:1:0) (reservable)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:2:0) (reservable)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:3:0) (reservable)
  • 1.6 TB SSD NVME Dell Express Flash NVMe PM1725 1.6TB AIC (path: /dev/disk/by-path/pci-0000:6d:00.0-nvme-1)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/eno2, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth2/eno3, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth3/eno4, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1
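On a reserved node, the configured Ethernet rates listed above can be checked from sysfs, which reports the negotiated link speed in Mb/s (the helper name is ours; Omni-Path link rates are reported by the fabric's own tools, not by this file):

```python
def link_speed_mbps(iface, sysfs="/sys/class/net"):
    """Return the kernel-reported link speed in Mb/s for an interface,
    e.g. 10000 for the 10 Gbps eth0/eno1 interfaces listed above."""
    with open(f"{sysfs}/{iface}/speed") as f:
        return int(f.read().strip())

# Example on a node: link_speed_mbps("eno1")
```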

Last generated from the Grid'5000 Reference API on 2020-10-08 (commit eb4af9a00)