Template:Site description

From Grid5000
Revision as of 10:52, 14 November 2016 by Rpottier (talk | contribs)

Specific information about the {{{1}}} site:

  • [[{{{1}}}:Hardware|Hardware]]
  • [[{{{1}}}:Network|Network]]
  • [[{{{1}}}:Storage|Storage]]
  • [[{{{1}}}:External access|External access]]
  • [[{{{1}}}:People|People]]
  • Administration

Direct access to resources on the {{{1}}} site:

Shortcuts to global tools and information:

Latest updated publications
Five random publications that benefited from Grid'5000 (at least 2940 overall):


Latest updated experiments
{{#experiments: 5|{{{1}}}}}

{{{misc}}}
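
A minimal transclusion sketch, assuming this template takes the site name as its first unnamed parameter (bound to {{{1}}} above) and an optional |misc= parameter (bound to {{{misc}}}); the site name "Rennes" and the misc text are purely illustrative:

```wikitext
{{Site description
| Rennes                             <!-- fills {{{1}}}: used in the section links, e.g. [[Rennes:Hardware|Hardware]] -->
| misc = See also the local usage policy.  <!-- fills {{{misc}}} at the bottom of the page -->
}}
```

With this invocation, each [[{{{1}}}:...]] link in the bullet list resolves to the corresponding per-site page, such as [[Rennes:Hardware|Hardware]].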