
Offline installation

Some compute nodes on HPC clusters do not have internet access at runtime. Since the pipeline interacts with the container registry and pulls images during execution, it will fail if the compute nodes cannot reach the internet. Fortunately, containers can be downloaded prior to the pipeline execution and fetched locally at runtime. Here are a few steps to get all of that working quickly:

  1. (Recommended) Point the NXF_HOME environment variable to a location in your work/scratch directory. When pulling or running a pipeline, the source files required to run it will be automatically cloned and fetched into the location pointed to by the NXF_HOME variable.

    Terminal window
    export NXF_HOME=/scratch/${USER}/.nextflow
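    A minimal sketch of this step, run on a login node (which has internet access): set the variable, optionally persist it, and pre-fetch the pipeline source so compute jobs reuse it through the shared filesystem. The pipeline name and `-r` revision shown mirror the download step on this page; adjust them to your case.

    ```shell
    # Run on a login node (with internet access); compute nodes reuse
    # the cached result through the shared filesystem.
    export NXF_HOME=/scratch/${USER}/.nextflow

    # To persist the setting across sessions, add the export line to
    # your shell startup file, e.g.:
    #   echo 'export NXF_HOME=/scratch/${USER}/.nextflow' >> ~/.bashrc

    # Pre-fetch the pipeline source into NXF_HOME. The pipeline name
    # and revision (-r) match the download step on this page.
    if command -v nextflow >/dev/null 2>&1; then
      nextflow pull scilus/sf-tractomics -r dev
    fi
    ```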
  2. Create a directory to host your Apptainer images, along with a cache directory. For example:

    Terminal window
    mkdir /scratch/${USER}/sf-tractomics-containers
    mkdir /scratch/${USER}/sf-tractomics-containers/cache

    Feel free to change the directory locations to fit your needs; the following steps use these paths as a reference.
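    The two `mkdir` calls above can also be collapsed into one: `-p` creates the parent and the cache directory together and is safe to re-run on an existing tree. `CONTAINERS_DIR` is just a convenience variable for the path used on this page.

    ```shell
    # -p creates missing parents and succeeds even if the directories
    # already exist, so this command is safe to re-run.
    CONTAINERS_DIR=/scratch/${USER}/sf-tractomics-containers
    mkdir -p "${CONTAINERS_DIR}/cache" 2>/dev/null \
      || echo "Could not create ${CONTAINERS_DIR}; check your scratch space." >&2
    ```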

  3. Download the Apptainer images

    Download with script
    Terminal window
    curl -fsSL https://raw.githubusercontent.com/nf-neuro/modules/refs/heads/main/assets/download_pipeline.sh | bash -s -- \
    -p scilus/sf-tractomics \
    -r dev \
    -o /scratch/${USER}/sf-tractomics-containers \
    -c /scratch/${USER}/sf-tractomics-containers/cache \
    -d 4
    Download with nf-core (advanced)
    1. Install nf-core tools. nf-core tools is a Python package that provides utilities for interacting with pipelines, such as downloading the containers required for execution in offline settings. We suggest installing nf-core within a Python virtual environment:

      Terminal window
      python -m virtualenv .venv
      source .venv/bin/activate
      pip install "nf-core>=3.5.2"

      For more details on how to install the package, please refer to the official nf-core documentation.

    2. Validate your installation

      Terminal window
      nf-core --version
    3. Set environment variables to indicate to nf-core where to download the images:

      Terminal window
      export APPTAINER_CACHEDIR=/scratch/${USER}/sf-tractomics-containers
      export NXF_APPTAINER_CACHEDIR=/scratch/${USER}/sf-tractomics-containers/cache
      export SINGULARITY_CACHEDIR=${APPTAINER_CACHEDIR}
      export NXF_SINGULARITY_CACHEDIR=${NXF_APPTAINER_CACHEDIR}
    4. Download the images

      To see all available options, run the nf-core pipelines download -h command. However, we suggest using the following parameters (-r is the pipeline version you want to download, and -o is the output directory):

      Terminal window
      nf-core pipelines download scilus/sf-tractomics -r 0.1.0 -o ./containers/ -s singularity -l docker.io -u copy --force

    Once the images are downloaded, Nextflow needs to be told where to find them in order to properly spawn the containers used in the pipeline. If not already set, set the following environment variables (with the appropriate paths). These variables must be available when running the pipeline so that Nextflow finds the downloaded images instead of trying to pull them from the web.

    Terminal window
    export APPTAINER_CACHEDIR=/scratch/${USER}/sf-tractomics-containers
    export NXF_APPTAINER_CACHEDIR=/scratch/${USER}/sf-tractomics-containers/cache
    export SINGULARITY_CACHEDIR=${APPTAINER_CACHEDIR}
    export NXF_SINGULARITY_CACHEDIR=${NXF_APPTAINER_CACHEDIR}
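    A pre-flight check can catch a missing or empty image directory before a job is submitted, so the run fails fast instead of attempting a pull on a compute node. This is a sketch, not part of the pipeline: the function name is made up here, and the `.img`/`.sif` extensions are an assumption about how the download tool names the images.

    ```shell
    # Verify that the image and cache directories exist and that at
    # least one container image is present before launching the run.
    check_offline_containers() {
      images_dir="$1"
      cache_dir="$2"
      [ -d "${images_dir}" ] || { echo "Missing image dir: ${images_dir}" >&2; return 1; }
      [ -d "${cache_dir}" ]  || { echo "Missing cache dir: ${cache_dir}" >&2; return 1; }
      # Look for at least one image file (extensions are an assumption).
      if [ -z "$(find "${images_dir}" -maxdepth 1 \( -name '*.img' -o -name '*.sif' \) 2>/dev/null | head -n 1)" ]; then
        echo "No container images found in ${images_dir}" >&2
        return 1
      fi
    }

    # Example with the environment variables set above:
    check_offline_containers "${APPTAINER_CACHEDIR}" "${NXF_APPTAINER_CACHEDIR}" \
      || echo "Containers not ready; re-run the download step." >&2
    ```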

Troubleshooting

If you encounter any issues while following the steps detailed on this page, refer to the troubleshooting page, or open an issue if your error is not covered (yet).