===== Apptainer =====
  
  * [[https://apptainer.org|Apptainer]]
  * [[https://github.com/apptainer/apptainer|Apptainer on GitHub]]
  
Apptainer is already available on the HPC cluster.
  
Users are encouraged to use it on their system.
  
==== Install a binary package ====
  
These instructions are intended for users who wish to install Apptainer on their Linux system.
  
On a **Red Hat Enterprise Linux** system, Apptainer can be installed by the administrator with the following command:
  
<code>
yum install https://github.com/apptainer/apptainer/releases/download/v1.1.8/apptainer-1.1.8-1.x86_64.rpm
</code>
  
On a **Debian/Ubuntu** system, Apptainer can be installed by the administrator with the following commands:
  
<code>
wget -qc https://github.com/apptainer/apptainer/releases/download/v1.1.8/apptainer_1.1.8_amd64.deb
sudo dpkg -i apptainer_1.1.8_amd64.deb
</code>
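
In either case, a quick check that the installation succeeded is to print the installed version:

<code>
apptainer --version
</code>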
  
==== Extract NVIDIA driver ====
  
The following commands were executed on worker nodes with GPU by the administrator in a directory that contains the [[https://www.nvidia.it/Download/index.aspx|NVIDIA driver]] setup executable (for example ''NVIDIA-Linux-x86_64-510.47.03.run'').
  
Users who want to install Apptainer on their Linux system will be able to change the NVIDIA driver version and the installation path:
  
<code>
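# Assumed sketch: extract the driver files with the installer's
# --extract-only option; the destination path below is hypothetical
version='510.47.03'
sh "NVIDIA-Linux-x86_64-${version}.run" --extract-only
mv "NVIDIA-Linux-x86_64-${version}" "/opt/nvidia/${version}"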
</code>
  
==== Apptainer on a worker node with CPU ====
  
On the login node of the HPC cluster run the following command:
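
A plausible form of that command, assuming a standard Slurm interactive session on the ''cpu'' partition (the resources shown are examples):

<code bash>
srun --nodes=1 --ntasks-per-node=2 --partition=cpu --mem=8G --time=02:00:00 --pty bash
</code>

Then, on the worker node, run the following commands: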
  
<code bash>
module load apptainer/1.0
  
container='/hpc/share/applications/amber/20/amber-20-cpu'
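
# Assumed continuation of the script: set the 'bind' variable mentioned at
# the end of this section (a host path to mount in the container; the value
# below is hypothetical) and open a shell in the image
bind="$HOME"
apptainer shell --bind "$bind" "$container"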
</code>
  
Inside the Apptainer container:
  
<code>
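# Assumed example: check that the Amber binaries shipped in the image are on the PATH
which sander pmemd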
</code>
  
Users who want to try Apptainer on their system will have to copy the Apptainer image
  
<code>
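# Assumed example; user name and login host are placeholders
scp <user>@<login-node>:/hpc/share/applications/amber/20/amber-20-cpu .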
</code>

from the HPC cluster to their system and appropriately modify the value of the ''container'' and ''bind'' variables.

Users who have installed a binary package do not have to load the ''apptainer'' module.
  
==== Apptainer on a worker node with GPU ====
  
On the login node of the HPC cluster run the following command:
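
A plausible form of that command, assuming an interactive session on a GPU partition (partition name, GPU count and limits are assumptions):

<code bash>
srun --nodes=1 --ntasks-per-node=2 --partition=gpu --gres=gpu:1 --mem=8G --time=02:00:00 --pty bash
</code>

Then, on the worker node, run the following commands: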
  
<code bash>
module load apptainer/1.0
  
container='/hpc/share/applications/amber/20/amber-20-gpu'
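
# Assumed continuation of the script: bind host paths (for example the
# extracted NVIDIA driver) and open a shell with GPU support enabled;
# the 'bind' value below is hypothetical
bind="$HOME"
apptainer shell --nv --bind "$bind" "$container"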
</code>
  
Inside the Apptainer container:
  
<code>
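# Assumed example: verify that the GPU is visible inside the container
nvidia-smi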
</code>
  
Users who want to try Apptainer on their system will have to copy the Apptainer image
  
<code>
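# Assumed example; user name and login host are placeholders
scp <user>@<login-node>:/hpc/share/applications/amber/20/amber-20-gpu .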
</code>
  
from the HPC cluster to their system and appropriately modify the value of the ''container'' and ''bind'' variables.

Users who have installed a binary package do not have to load the ''apptainer'' module.
  
===== Apptainer containers =====
  
==== Bacterial Genomics software collection ====
  
''slurm-bactgen.sh'' script to get the list of packages present in the ''bactgen'' environment and the help text of ''tormes'', on 1 node (1 task, 4 CPUs per task):
  
<code bash slurm-bactgen.sh>
#!/bin/bash
#SBATCH --job-name=bactgen
#SBATCH --output=%x.o%j
#SBATCH --error=%x.e%j
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=1
#SBATCH --cpus-per-task=4
#SBATCH --time=0-01:00:00
#SBATCH --mem=8G
#SBATCH --partition=cpu
#SBATCH --qos=cpu
#SBATCH --account=<account>
  
module load apptainer
module load bactgen
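# CONTAINER is assumed to be set by the 'bactgen' module (path to the Apptainer image)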
  
apptainer run "$CONTAINER" micromamba list
echo '─────────────────────────────────────────────────────────────────────────────────────────'
apptainer run "$CONTAINER" tormes --help
echo '─────────────────────────────────────────────────────────────────────────────────────────'
</code>
  
Edit the ''slurm-bactgen.sh'' script and submit it with the following command:
  
<code bash>
sbatch slurm-bactgen.sh
</code>
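
With the ''#SBATCH'' directives above, standard output and standard error are written to ''bactgen.o<jobid>'' and ''bactgen.e<jobid>'' in the submission directory (''%x'' expands to the job name, ''%j'' to the job id).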
  