Modules

Latest revision as of 16:17, 28 November 2024

Note

This page is actively maintained by the Grid'5000 team. If you encounter problems, please report them (see the Support page). Additionally, as it is a wiki page, you are free to make minor corrections yourself if needed. If you would like to suggest a more fundamental change, please contact the Grid'5000 team.

Grid'5000 provides a set of software (mainly scientific) using Lmod, through the module command-line tool.

They are available from Grid'5000 frontends and cluster nodes (only on the debianXX-big and debianXX-nfs environments if deployment is used).

The modules command

General usage

The module system is designed to load software and make it available by modifying your environment (such as your PATH variable).
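Under the hood, loading a module essentially prepends the software's installation prefix to search-path variables such as PATH and LD_LIBRARY_PATH. A minimal sketch of the equivalent manual steps (the prefix below is hypothetical; real prefixes live under /grid5000/spack/):

```shell
# Hypothetical installation prefix, for illustration only
SOFT_PREFIX="/opt/example/gcc/12.2.0"

# What "module load" effectively does: prepend the tool's directories
# to the relevant search paths for the current shell session
export PATH="$SOFT_PREFIX/bin:$PATH"
export LD_LIBRARY_PATH="$SOFT_PREFIX/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"

# The first PATH entry is now the module's bin directory
echo "${PATH%%:*}"   # prints /opt/example/gcc/12.2.0/bin
```

The advantage of module over manual exports is that it also records what was loaded, so the changes can be cleanly reverted with module unload.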

To get started, list available software using:

 $ module av
 
 ----------------------------------------- /grid5000/spack/v1/share/spack/modules/linux-debian11-x86_64_v2 -----------------------------------------------------------
  apptainer/1.0.2_gcc-10.4.0                           intel-oneapi-ipp/2021.6.0_gcc-10.4.0                 py-numpy/1.23.3_gcc-10.4.0-intelmpi-python-3.9.13
  cmake/3.23.3_gcc-10.4.0                              intel-oneapi-ipp/2021.7.0_gcc-10.4.0          (D)    py-pyopencl/2020.2.2_gcc-10.4.0-intelmpi-python-3.9.13
  comgr/5.2.0_gcc-10.4.0                               intel-oneapi-mkl/2022.1.0_gcc-10.4.0-intelmpi        python/3.9.13_gcc-10.4.0
  cube/4.6_gcc-10.4.0                                  intel-oneapi-mkl/2022.1.0_gcc-10.4.0-openmpi         rdma-core/41.0_gcc-10.4.0
  cuda/11.4.0_gcc-10.4.0                               intel-oneapi-mkl/2023.0.0_gcc-10.4.0-intelmpi (D)    rocblas/5.2.0_gcc-10.4.0
  cuda/11.6.2_gcc-10.4.0                               intel-oneapi-mpi/2021.6.0_gcc-10.4.0                 rocm-cmake/5.2.0_gcc-10.4.0
  cuda/11.7.1_gcc-10.4.0                        (D)    intel-oneapi-mpi/2021.8.0_gcc-10.4.0          (D)    rocm-opencl/5.2.0_gcc-10.4.0
  cudnn/8.2.4.15-11.4_gcc-10.4.0                       intel-oneapi-tbb/2021.8.0_gcc-10.4.0                 rocm-openmp-extras/5.2.0_gcc-10.4.0
  cudnn/8.4.0.27-11.6_gcc-10.4.0                (D)    intel-oneapi-vtune/2022.3.0_gcc-10.4.0               rocm-smi-lib/5.2.3_gcc-10.4.0
  dbus/1.13.6_gcc-10.4.0                               intel-oneapi-vtune/2023.0.0_gcc-10.4.0        (D)    rocminfo/5.2.0_gcc-10.4.0
  gcc/10.4.0_gcc-10.4.0                                julia/1.8.2_gcc-10.4.0                               rocprofiler-dev/5.2.0_gcc-10.4.0
  gcc/12.2.0_gcc-10.4.0                         (D)    libfabric/1.15.1_gcc-10.4.0                          rocsolver/5.2.0_gcc-10.4.0
  glpk/4.65_gcc-10.4.0                                 llvm/13.0.1_gcc-10.4.0                               roctracer-dev-api/5.2.0_gcc-10.4.0
  go/1.18_gcc-10.4.0                                   llvm-amdgpu/5.2.0_gcc-10.4.0                         rust/1.65.0_gcc-10.4.0
  hip/5.2.0_gcc-10.4.0                                 metis/5.1.0_gcc-10.4.0                               scalasca/2.6_gcc-10.4.0-openmpi
  hipblas/5.2.0_gcc-10.4.0                             mpich/4.0.2_gcc-10.4.0-cuda                          scorep/7.1_gcc-10.4.0-openmpi
  hsa-rocr-dev/5.2.0_gcc-10.4.0                        mpich/4.0.2_gcc-10.4.0                        (D)    scotch/7.0.1_gcc-10.4.0-intelmpi
  hsakmt-roct/5.2.0_gcc-10.4.0                         mumps/5.4.1_gcc-10.4.0-intelmpi                      singularity/3.8.5_gcc-10.4.0
  intel-oneapi-advisor/2022.1.0_gcc-10.4.0             mvapich2/2.3.7_gcc-10.4.0                            singularity/3.8.7_gcc-10.4.0                           (D)
  intel-oneapi-advisor/2023.0.0_gcc-10.4.0      (D)    netlib-lapack/3.10.1_gcc-10.4.0                      starpu/1.3.9_gcc-10.4.0
  intel-oneapi-ccl/2021.6.0_gcc-10.4.0-intelmpi        netlib-scalapack/2.2.0_gcc-10.4.0-openmpi            ucx/1.13.1_gcc-10.4.0-compat
  intel-oneapi-ccl/2021.8.0_gcc-10.4.0-intelmpi (D)    opa-psm2/11.2.230_gcc-10.4.0                         ucx/1.13.1_gcc-10.4.0                                  (D)
  intel-oneapi-compilers/2022.1.0_gcc-10.4.0           openblas/0.3.20_gcc-10.4.0                           valgrind/3.19.0_gcc-10.4.0-intelmpi
  intel-oneapi-compilers/2023.0.0_gcc-10.4.0    (D)    openjdk/1.8.0_265-b01_gcc-10.4.0                     valgrind/3.19.0_gcc-10.4.0-openmpi
  intel-oneapi-dpl/2021.7.0_gcc-10.4.0                 openmpi/4.1.4_gcc-10.4.0                             valgrind/3.20.0_gcc-10.4.0-intelmpi                    (D)
  intel-oneapi-dpl/2022.0.0_gcc-10.4.0          (D)    openmpi/4.1.5_gcc-10.4.0                      (D)    vtk/9.0.3_gcc-10.4.0-intelmpi
  intel-oneapi-inspector/2022.1.0_gcc-10.4.0           petsc/3.17.4_gcc-10.4.0-intelmpi                     vtk/9.0.3_gcc-10.4.0-openmpi                           (D)
  intel-oneapi-inspector/2023.0.0_gcc-10.4.0    (D)    pmix/4.1.2_gcc-10.4.0
 
 ----------------------------------------- /grid5000/spack/modules-others/modules --------------------------------------------------------------------------------------
  conda/23.3.1    conda/23.5.0 (D)    ddt/20.1.2    matlab/R2022a    matlab/R2022b (D)    matlab-runtime/R2022a    matlab-runtime/R2022b (D)
 
   Where:
    D:  Default Module
 
 Use "module spider" to find all possible modules.
 Use "module keyword key1 key2 ..." to search for all possible modules matching any of the "keys".


Note

The complete list of modules with quick description and installed versions is available on the page Modules List.
If you need additional software to be installed, feel free to contact Grid5000 support team and we can look into it.


To load something into your environment, use the load command:

$ module load gcc
$ gcc --version
gcc (Spack GCC) 12.2.0

By default, module loads the latest available version of a software package (sorted in lexicographical order). You can also specify the version you need:

$ module load gcc/10.4.0_gcc-10.4.0
$ gcc --version
gcc (Spack GCC) 10.4.0
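Keep in mind that lexicographical order is not version order: a character-by-character comparison sorts 10 before 9, because '1' < '9'. A quick illustration with plain sort (GNU sort -V shows the version-aware order for comparison):

```shell
# Lexicographic order: gcc/10.4.0 comes first, since '1' < '9'
printf 'gcc/9.5.0\ngcc/10.4.0\n' | LC_ALL=C sort

# Version-aware order (GNU sort -V): gcc/9.5.0 comes first
printf 'gcc/9.5.0\ngcc/10.4.0\n' | sort -V
```

So when several major versions of a package are installed, it is safest to request the exact version you need rather than rely on the default.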

You can also find out more information about a package using the whatis or show commands:

$ module whatis gcc
$ module show gcc

If you want to unload one module, or all currently loaded modules, you can use:

$ module unload gcc
$ module purge

The full documentation of the module command is available in the Lmod documentation: https://lmod.readthedocs.io/en/latest/

Modules that require a connection to a license server

Some modules are licensed and require a connection to a license server. We currently offer two license-protected modules, MatLab and DDT/Map, with easier access to some institutional license servers. To use them, you just need to join the correct group in the User Management Service.

Group                 | Software | Server                   | Usage                                                 | Allowed users
matlab-inria-sophia   | matlab   | matlab-sam.inria.fr      | export LM_LICENSE_FILE=29030@jetons.inria.fr          | MatLab access for members of teams and services of Centre Inria d'Université Côte d'Azur
matlab-inria-rennes   | matlab   | licence.irisa.fr         | export LM_LICENSE_FILE=1731@licence.irisa.fr          | MatLab access for members of teams and services of either Centre Inria de l'Université de Rennes or UMR IRISA
matlab-inria-nancy    | matlab   | flexlm1.univ-lorraine.fr | export LM_LICENSE_FILE=27000@flexlm1.univ-lorraine.fr | MatLab access for members of teams and services of Centre Inria de l'Université de Lorraine
matlab-inria-lyon     | matlab   | jetons.inria.fr          | export LM_LICENSE_FILE=29030@jetons.inria.fr          | MatLab access for members of teams and services of Centre Inria de Lyon
matlab-inria-grenoble | matlab   | jetons.inria.fr          | export LM_LICENSE_FILE=29030@jetons.inria.fr          | MatLab access for members of teams and services of Centre Inria de l'Université Grenoble Alpes
ddt-inria-all         | ddt      | jetons.inria.fr          | -                                                     | Access to DDT for members of Inria teams and services

Last generated from the Grid'5000 API on 2023-11-13

We plan to add other institutional license servers to this list.


MatLab example

  • Make sure you are a member of a MatLab group in the User Management Service. If not, ask to join the group.
  • Set up the link to the MatLab license server (example for matlab-inria-grenoble; see the table above for your group):
 (node):~$ export LM_LICENSE_FILE=29030@jetons.inria.fr
  • Load the MatLab module named matlab (with the version matching your license), for example:
 (node):~$ module load matlab/R2022b
  • Start MatLab without the graphical environment:
 (node):~$ matlab -nodisplay -nodesktop -nosplash
Warning

As of today, this only works when a node is entirely reserved by your job (all CPU cores). Please see the note below for an alternative.

Note

If you have access to other licenses from your workstation, you can also set up an SSH port redirection. For example, if you have access to the Université de Lorraine license, you can:

  • Connect from your workstation to your previously reserved node with an SSH port redirection.
 (workstation):~$ ssh -R 27000:flexlm1.univ-lorraine.fr:27000 -R 27001:flexlm1.univ-lorraine.fr:27001 <node>.<site>.g5k
  • Set up the link to your MatLab license server through your SSH connection:
 (node):~$ export LM_LICENSE_FILE=27000@127.0.0.1
  • Load the MatLab module and start it (see above).
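The redirection recipe above can be parameterized in a small snippet; the host and ports are the Université de Lorraine values from the example, and the variable names are our own illustration:

```shell
# Adapt these to your institution's license server (values from the example above)
LICENSE_HOST="flexlm1.univ-lorraine.fr"
LICENSE_PORT=27000
VENDOR_PORT=27001

# Command to run from your workstation (printed here, not executed)
echo "ssh -R ${LICENSE_PORT}:${LICENSE_HOST}:${LICENSE_PORT} -R ${VENDOR_PORT}:${LICENSE_HOST}:${VENDOR_PORT} <node>.<site>.g5k"

# On the node: point the license client at the local end of the tunnel
export LM_LICENSE_FILE="${LICENSE_PORT}@127.0.0.1"
```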

Note: No MatLab License server is needed to use MatLab Runtime on nodes.

DDT/Map example

In the following example, we use the Inria license server to run DDT because the DDT/Map module is currently supported only for Inria members.

  • Make sure you are a member of the ddt-inria-all group in the User Management Service. If not, ask to join the group.
  • Connect from your workstation to your previously reserved node.
 (workstation):~$ ssh -YC <node>.<site>.g5k
  • Load the DDT module named ddt (with the version matching your license), for example:
 (node):~$ module load ddt
  • Start DDT using the GUI:
 (node):~$ ddt
Note

If you want to use other licenses, you can place them in ~/ddt/Licence.client.ddt and ~/ddt/Licence.client.map.

Sharing modules between users

Modules can be shared between users by using a group storage.

Note

This documentation assumes that you already know what a group storage is, and that you already have one for storing your modules. If that is not the case, check the Group Storage Documentation.

Put the modules in a dedicated folder of the group storage, so that they are reachable by any user belonging to the group.

access:
rsync modules-dir/ /srv/my-storage@storage_fqdn/modules-dir
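For the shared directory to be usable as a module tree, each piece of software also needs a modulefile describing how to set up the environment. A minimal hypothetical example using Lmod's Lua modulefile format (the mysoft name and paths are illustrative):

```shell
# Create a modulefile for a hypothetical "mysoft" version 1.0
mkdir -p modules-dir/mysoft
cat > modules-dir/mysoft/1.0.lua <<'EOF'
-- Minimal Lmod modulefile (Lua syntax)
whatis("mysoft 1.0: example software shared via group storage")
local prefix = "/srv/my-storage@storage_fqdn/mysoft/1.0"
prepend_path("PATH", pathJoin(prefix, "bin"))
prepend_path("LD_LIBRARY_PATH", pathJoin(prefix, "lib"))
EOF
```

With this layout, "module avail" will show mysoft/1.0 once the directory is added to MODULEPATH.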

After adding the directory to the MODULEPATH environment variable, the modules inside it become loadable by other users of the group. This can be done with one of the two following commands.

node:
module use /srv/my-storage@storage_fqdn/modules-dir

or

node:
export MODULEPATH=/srv/my-storage@storage_fqdn/modules-dir:$MODULEPATH
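When exporting MODULEPATH by hand, a small guard avoids leaving a trailing colon when the variable was previously unset; a sketch using the placeholder path from above:

```shell
MODULES_DIR="/srv/my-storage@storage_fqdn/modules-dir"   # placeholder path

# ${MODULEPATH:+:$MODULEPATH} expands to ":$MODULEPATH" only if it is already set
export MODULEPATH="$MODULES_DIR${MODULEPATH:+:$MODULEPATH}"

# The first entry is now the shared modules directory
echo "${MODULEPATH%%:*}"
```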


Finally, the modules can be loaded.

node:
module load my-personal-module


If you are not familiar with modules, you will find a sample setup to build and share a set of modules of your own at https://gitlab.inria.fr/grid5000/team-modules.