# Running nf-pediatric without internet access
Some compute nodes do not have access to the internet at runtime. Since the pipeline pulls containers from a remote registry during execution, it will not work if the compute nodes are offline. Fortunately, containers can be downloaded prior to pipeline execution and fetched locally at runtime. Here are a few steps to get all of that working quickly:
1. **Install `nf-core` tools**

   `nf-core` tools is a Python package that provides utilities to interact with pipelines, such as downloading the containers required for execution in offline settings. To install the package, please refer to the official `nf-core` documentation.
2. **Validate your installation**

   ```shell
   nf-core --version
   ```

   ```
   nf-core/tools version 3.3.2 - https://nf-co.re
   nf-core, version 3.3.2
   ```
3. **Set environment variables for container download**

   Prior to downloading the containers, we need to set some environment variables that control where singularity/apptainer will cache images. Additionally, we need to set a `nextflow` variable telling it in which directory to place our final containers. This can be done using the following four commands (we set both the singularity and apptainer variables to cover all possible use cases):

   ```shell
   export SINGULARITY_CACHEDIR=<path/to/cache/location>
   export APPTAINER_CACHEDIR=<path/to/cache/location>
   export NXF_SINGULARITY_CACHEDIR=<path/to/download/location>
   export NXF_APPTAINER_CACHEDIR=<path/to/download/location>
   ```
4. **Run `nf-core pipelines download`**

   To list all available options, use `nf-core pipelines download -h`. However, we suggest the following parameters (`-r` is the pipeline version you want to download, and `-o` is the output directory):

   ```shell
   nf-core pipelines download scilus/nf-pediatric -r 0.1.0 -o ./containers/ -s singularity -l docker.io -u copy --force
   ```

   ```
   nf-core/tools version 3.4.0.dev0 - https://nf-co.re
   WARNING  Could not find GitHub authentication token. Some API requests may fail.
   INFO     Detected Nextflow version 25.04.6
   If transferring the downloaded files to another system, it can be convenient to have everything compressed in a single file.
   This is not recommended when downloading Singularity images, as it can take a long time and saves very little space.
   ? Choose compression type: none
   WARNING  Deleting existing output directory: 'containers'
   INFO     Saving 'scilus/nf-pediatric'
            Pipeline revision: 'dev'
            Use containers: 'singularity'
            Container library: 'docker.io'
            Using NXF_SINGULARITY_CACHEDIR': /scratch/agagnon/download/containers/'
            Output directory: 'containers'
            Include default institutional configuration: 'False'
   INFO     Downloading workflow
   INFO     Downloading workflow files from GitHub
   ```
## Reusing the downloaded containers during the pipeline execution

Once the containers are downloaded, `nextflow` needs to be told where to look for them. If not already set, re-export the environment variables described above:
```shell
export NXF_SINGULARITY_CACHEDIR=<path/to/download/location>
export NXF_APPTAINER_CACHEDIR=<path/to/download/location>
```
Now you are good to go! Please refer to the usage documentation on how to launch the pipeline.
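Putting the pieces together, an offline launch could be sketched as below. This is only an illustration under assumptions: the sample sheet, output directory, and profile name are placeholders, not the pipeline's documented interface — consult the usage documentation for the real parameters.

```shell
# Sketch of an offline launch, assuming the containers were downloaded as above.
# <path/to/download/location>, samplesheet.csv, and results/ are placeholders.
export NXF_SINGULARITY_CACHEDIR=<path/to/download/location>
export NXF_APPTAINER_CACHEDIR=<path/to/download/location>

# Launch the pipeline; nextflow will resolve container images from the
# local cache directories set above instead of pulling from a registry.
nextflow run scilus/nf-pediatric -r 0.1.0 \
    -profile singularity \
    --input samplesheet.csv \
    --outdir results/
```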
## Downloading TemplateFlow templates

`nf-pediatric` allows users to select a specific output space in which all subjects will be registered. Those templates are fetched from the TemplateFlow Archive (Ciric et al., 2022). While downloading is handled directly by `nf-pediatric` when internet is available, for offline use the selected template (chosen with the `--template` option) needs to be available locally. You can refer to the official TemplateFlow documentation for information on how to download the archive using either `datalad` or the Python client.
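As a hedged sketch, pre-fetching a template with the TemplateFlow Python client while still online could look like the following. The template name and cache path are examples only (use the space you intend to pass to `--template`), and the client must be installed first (e.g. `pip install templateflow`).

```python
import os

# Point TemplateFlow at the cache you will carry to the offline system.
# The path is an example; the default location is ~/.cache/templateflow.
os.environ["TEMPLATEFLOW_HOME"] = "/path/to/templateflow"

from templateflow import api as tflow

# Download (and cache) files for an example template space; substitute the
# template you actually selected with --template.
files = tflow.get("MNI152NLin2009cAsym", resolution=1, suffix="T1w")
print(files)
```

Once the cache directory is transferred to the offline system, setting `TEMPLATEFLOW_HOME` to its location lets the template be resolved without network access.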