Template:Site description
Specific information about the {{{1}}} site:
- [[{{{1}}}:Hardware|Hardware]]
- [[{{{1}}}:Network|Network]]
- [[{{{1}}}:Storage|Storage]]
- [[{{{1}}}:External access|External access]]
- [[{{{1}}}:People|People]]
- Administration
Direct access to resources for the {{{1}}} site:
- reservation state (monika) and reservation history (drawgantt) via OAR
- available resources via Ganglia
- critical services via Nagios
- open bugs via BugZilla
- support staff via email
Shortcuts to global tools and information:
- support procedures to report bugs or ask for enhancements
- global reservation state and global reservation history via OARgrid
- registered users via UMS
Latest updated publications
Five random publications that benefited from Grid'5000 (at least 2929 overall):
- Vania Marangozova, Angelo Gennuso. K8S Auto-Scaler Coordinators. Université Grenoble - Alpes. 2024. hal-04963348
- Guillaume Rosinosky, Donatien Schmitz, Etienne Rivière. StreamBed: Capacity Planning for Stream Processing. DEBS 2024 - 18th ACM International Conference on Distributed and Event-based Systems, Jun 2024, Lyon, France. pp.90-102. 10.1145/3629104.3666034. hal-04708354
- Rémi Meunier, Thomas Carle, Thierry Monteil. Multi-core interference over-estimation reduction by static scheduling of multi-phase tasks. Real-Time Systems, 2024, pp.1-39. 10.1007/s11241-024-09427-3. hal-04689317
- Louis Roussel, François Lemaire. Deep Learning for Integro-Differential Modelling. 2025. hal-05230281
- Hee-Soo Choi, Priyansh Trivedi, Mathieu Constant, Karën Fort, Bruno Guillaume. Au-delà de la performance des modèles : la prédiction de liens peut-elle enrichir des graphes lexico-sémantiques du français ?. Actes de JEP-TALN-RECITAL 2024. 31ème Conférence sur le Traitement Automatique des Langues Naturelles, volume 1 : articles longs et prises de position, Jul 2024, Toulouse, France. pp.36-49. hal-04623008
Latest updated experiments
{{#experiments: 5|{{{1}}}}}
{{{misc}}}