Grenoble:Hardware

From Grid5000


Summary

Cluster: edel (queue: default)
  Date of arrival: 2008-10-03
  Nodes: 68
  CPU: 2 x Intel Xeon E5520 (4 cores/CPU)
  Memory: 24 GB
  Storage: 119 GB SSD on nodes [1,5-6,8-9,12,15-16,19,21,23-25,28-29,32,35,37,39-43,46,48-50,52,55-57,59,62-63,65-72];
           59 GB SSD on nodes [2-4,7,10-11,13-14,17-18,20,22,26,30-31,33-34,36,38,44,47,51,53,58,60,64]
  Network: 1 Gbps Ethernet + 40 Gbps InfiniBand

Cluster: genepi (queue: default)
  Date of arrival: 2008-10-01
  Nodes: 31
  CPU: 2 x Intel Xeon E5420 (4 cores/CPU)
  Memory: 8 GB
  Storage: 153 GB HDD
  Network: 1 Gbps Ethernet + 20 Gbps InfiniBand
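
This page is generated from the Grid5000 Reference API (see the date stamp at the bottom). As a sketch of how the same summary can be rebuilt programmatically, the snippet below walks the API's node descriptions for each Grenoble cluster. The endpoint layout (https://api.grid5000.fr/stable/sites/grenoble/clusters/<cluster>/nodes) and the JSON field names are assumptions based on the public API and should be verified; authentication (required when querying from outside Grid5000) is omitted.

  # Sketch: rebuild the per-cluster summary from the Grid5000 Reference API.
  # Endpoint layout and field names are assumptions; authentication omitted.
  import requests

  API = "https://api.grid5000.fr/stable/sites/grenoble"

  def cluster_summary(cluster):
      nodes = requests.get(f"{API}/clusters/{cluster}/nodes").json()["items"]
      first = nodes[0]  # assume CPU/memory are homogeneous within a cluster
      return {
          "nodes": len(nodes),
          "cpus/node": first["architecture"]["nb_procs"],
          "cores/cpu": first["architecture"]["nb_cores"] // first["architecture"]["nb_procs"],
          "memory_bytes": first["main_memory"]["ram_size"],
      }

  for cluster in ("edel", "genepi"):
      print(cluster, cluster_summary(cluster))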

Cluster details

edel

68 nodes, 136 cpus, 544 cores, split into the four groups below due to hardware differences between nodes (json)

edel-[1,5-6,8-9,12,15-16,19,21,23,28-29,32,37,39-40,42-43,48-49,52,55-57,59,62,65-69,71] (33 nodes, 66 cpus, 264 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27 GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 119 GB SSD SATA C400-MTFDDAA128M (driver: ahci)
Network:
  • eth0/enp1s0f0, Ethernet (driver: igb), configured rate: 1 Gbps
  • eth1/enp1s0f1, Ethernet (driver: igb), configured rate: n/c - unavailable for experiment
  • ib0, InfiniBand (driver: mlx4_core), configured rate: 40 Gbps
  • ib1, InfiniBand (driver: mlx4_core), configured rate: n/c - unavailable for experiment

edel-[2-4,7,10-11,13-14,18,20,22,26,30-31,33,36,38,44,47,51,53,58,60,64] (24 nodes, 48 cpus, 192 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27 GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C400-MTFDDAA064M (driver: ahci)
Network:
  • eth0/enp1s0f0, Ethernet (driver: igb), configured rate: 1 Gbps
  • eth1/enp1s0f1, Ethernet (driver: igb), configured rate: n/c - unavailable for experiment
  • ib0, InfiniBand (driver: mlx4_core), configured rate: 40 Gbps
  • ib1, InfiniBand (driver: mlx4_core), configured rate: n/c - unavailable for experiment

edel-[17,34] (2 nodes, 4 cpus, 16 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27 GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C300-MTFDDAA064M (driver: ahci)
Network:
  • eth0/enp1s0f0, Ethernet (driver: igb), configured rate: 1 Gbps
  • eth1/enp1s0f1, Ethernet (driver: igb), configured rate: n/c - unavailable for experiment
  • ib0, InfiniBand (driver: mlx4_core), configured rate: 40 Gbps
  • ib1, InfiniBand (driver: mlx4_core), configured rate: n/c - unavailable for experiment

edel-[24-25,35,41,46,50,63,70,72] (9 nodes, 18 cpus, 72 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27 GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 119 GB SSD SATA C400-MTFDDAA128M (driver: ahci)
Network:
  • eth0, Ethernet (driver: igb), configured rate: 1 Gbps
  • eth1, Ethernet (driver: igb), configured rate: n/c - unavailable for experiment
  • ib0, InfiniBand (driver: mlx4_core), configured rate: 40 Gbps
  • ib1, InfiniBand (driver: mlx4_core), configured rate: n/c - unavailable for experiment
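
The node sets above use a condensed bracket notation (for example edel-[17,34] or edel-[24-25,35,41,46,50,63,70,72]). The helper below is a minimal, self-contained sketch that expands this notation into individual hostnames; on the command line, the nodeset tool from ClusterShell does the same job.

  import re

  def expand_nodeset(nodeset):
      """Expand e.g. 'edel-[17,34]' or 'edel-[1,5-6]' into hostnames."""
      m = re.fullmatch(r"(\w+)-\[([\d,-]+)\]", nodeset)
      if not m:
          return [nodeset]  # plain hostname, nothing to expand
      prefix, ranges = m.groups()
      hosts = []
      for part in ranges.split(","):
          lo, _, hi = part.partition("-")
          hosts.extend(f"{prefix}-{i}" for i in range(int(lo), int(hi or lo) + 1))
      return hosts

  print(expand_nodeset("edel-[17,34]"))                            # ['edel-17', 'edel-34']
  print(len(expand_nodeset("edel-[24-25,35,41,46,50,63,70,72]")))  # 9, matching the count above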

genepi

31 nodes, 62 cpus, 248 cores (json)

Model: Bull R422-E1
Date of arrival: 2008-10-01
CPU: Intel Xeon E5420 Harpertown 2.50 GHz (2 CPUs/node, 4 cores/CPU)
Memory: 8 GB
Storage: 153 GB HDD SATA WDC WD1600YS-01S (driver: ata_piix)
Network:
  • eth0/enp5s0f0, Ethernet (driver: e1000e), configured rate: n/c - unavailable for experiment
  • eth1/enp5s0f1, Ethernet (driver: e1000e), configured rate: 1 Gbps
  • ib0, InfiniBand (driver: mlx4_core), configured rate: 20 Gbps
  • ib1, InfiniBand (driver: mlx4_core), configured rate: n/c - unavailable for experiment
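
Nodes on both clusters are obtained through a reservation; besides the oarsub command on the site frontend, the Grid5000 API exposes a jobs endpoint. The snippet below is a hedged sketch of such a submission: the endpoint (POST /stable/sites/grenoble/jobs), the payload keys, and the cluster OAR property are assumptions to check against the current API documentation, and the login/password pair is a placeholder for real credentials.

  # Sketch: reserve 2 genepi nodes for one hour through the Grid5000 jobs API.
  # Endpoint, payload keys and the 'cluster' OAR property are assumptions;
  # replace the placeholder credentials with a real Grid5000 account.
  import requests

  job = {
      "resources": "nodes=2,walltime=01:00:00",
      "properties": "cluster='genepi'",   # pin the reservation to genepi
      "command": "uname -a",              # what to run once nodes are granted
  }
  r = requests.post("https://api.grid5000.fr/stable/sites/grenoble/jobs",
                    json=job, auth=("login", "password"))
  r.raise_for_status()
  print("submitted job", r.json()["uid"])  # poll .../jobs/<uid> for its state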

Generated from the Grid5000 APIs on 2018-04-18