
Part 6: Create your own local module

In this part, we will create a local module called METRICSINROI under the STATS category, designed for segmenting a T1-weighted (T1w) image and extracting metrics from white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF) masks.

1. Create a local module structure

Create the following directory structure, including a main.nf file inside the module folder, where stats is the category and metricsinroi is the module name.

mkdir -p /workspaces/nf-neuro-tutorial/modules/local/stats/metricsinroi
touch /workspaces/nf-neuro-tutorial/modules/local/stats/metricsinroi/main.nf

You should get this structure.

  • nf-neuro-tutorial/
    • modules/
      • local/
        • stats/
          • metricsinroi/
            • main.nf
      • nf-neuro/
        • modules_categories/
          • module_name/
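If you want to sanity-check the result, the same skeleton can be recreated and listed from any scratch directory (a quick sketch using relative paths instead of the absolute workspace path):

```shell
# Recreate the module skeleton and list the files it contains
mkdir -p modules/local/stats/metricsinroi
touch modules/local/stats/metricsinroi/main.nf
find modules/local -type f
# prints: modules/local/stats/metricsinroi/main.nf
```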

2. Explore the structure of a module

A Nextflow module is a file containing a single process that executes one or more commands.

The structure of a module should look similar to this:

process CATEGORY_MODULE {
    tag "$meta.id"
    label 'process_single'

    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
        'https://scil.usherbrooke.ca/containers/scilus_latest.sif':
        'scilus/scilus:latest' }"

    input:
    tuple val(meta), path(input_file)

    output:
    tuple val(meta), path("output.txt"), emit: outputname

    script:
    def prefix = task.ext.prefix ?: "${meta.id}"
    def optionName = task.ext.option ? "${task.ext.option}" : "default_value"
    """
    cat ${input_file} > output.txt
    """
}

Here, the script simply copies the contents of the input file to output.txt. In summary, this process takes an input file, copies its contents to an output file, and provides some flexibility in naming and options through metadata (meta) and task extensions (task.ext). Depending on the workflow configuration, the process runs in either a Singularity or a Docker container.
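The task.ext.* values referenced in the script block are not hard-coded in the module: they are supplied from configuration. As a sketch (the process name follows the template above and the values are illustrative; step 5 shows this for the real module):

```groovy
// nextflow.config, inside the process scope
process {
    withName: "CATEGORY_MODULE" {
        ext.prefix = "mysubject"     // otherwise prefix falls back to "${meta.id}"
        ext.option = "custom_value"  // otherwise optionName is "default_value"
    }
}
```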

3. Write the local module

The goal of this local module is to segment a T1-weighted (T1w) image using fast and extract metrics from the white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF) masks using the scil_volume_stats_in_ROI.py script from the scilpy toolbox.

You can copy and paste the example module above into your main.nf file. First, rename the process to STATS_METRICSINROI, then modify it using the steps below:

  1. Modify input to include T1 and metrics

    tuple val(meta), path(t1), path(metrics)
  2. Update output to include segmentation maps and masks, as well as the JSON output from the scilpy script.

    output:
    tuple val(meta), path("*.json") , emit: stats
    tuple val(meta), path("*mask_wm.nii.gz") , emit: wm_mask
    tuple val(meta), path("*map_wm.nii.gz") , emit: wm_map
  3. Include task extensions for additional options (those from the scilpy script)

    script:
    def prefix = task.ext.prefix ?: "${meta.id}"
    def bin = task.ext.bin ? "--bin " : ""
    def normalize_weights = task.ext.normalize_weights ? "--normalize_weights " : ""
  4. Modify the script

    1. Update the script section to perform segmentation using fast on the T1w image.
    2. Generate binary masks for each PVE tissue segmentation using scil_volume_math.py or another tool.
    3. Rename the PVE outputs from fast to map_*.nii.gz and mask_*.nii.gz.
    4. Extract metrics from the WM, GM, and CSF binary masks using scil_volume_stats_in_ROI.py or another tool.


Expected ./modules/local/stats/metricsinroi/main.nf:

process STATS_METRICSINROI {
    tag "$meta.id"
    label 'process_single'

    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
        'https://scil.usherbrooke.ca/containers/scilus_latest.sif':
        'scilus/scilus:latest' }"

    input:
    tuple val(meta), path(t1), path(metrics)

    output:
    tuple val(meta), path("*.json"), emit: stats
    tuple val(meta), path("*mask_wm.nii.gz"), emit: wm_mask
    tuple val(meta), path("*mask_gm.nii.gz"), emit: gm_mask
    tuple val(meta), path("*mask_csf.nii.gz"), emit: csf_mask
    tuple val(meta), path("*map_wm.nii.gz"), emit: wm_map
    tuple val(meta), path("*map_gm.nii.gz"), emit: gm_map
    tuple val(meta), path("*map_csf.nii.gz"), emit: csf_map

    script:
    def prefix = task.ext.prefix ?: "${meta.id}"
    def bin = task.ext.bin ? "--bin " : ""
    def normalize_weights = task.ext.normalize_weights ? "--normalize_weights " : ""
    """
    export ITK_GLOBAL_DEFAULT_NUMBER_OF_THREADS=1
    export OMP_NUM_THREADS=1
    export OPENBLAS_NUM_THREADS=1

    fast -t 1 -n 3 \
        -H 0.1 -I 4 -l 20.0 -g -o t1.nii.gz $t1

    scil_volume_math.py convert t1_seg_2.nii.gz ${prefix}__mask_wm.nii.gz --data_type uint8
    scil_volume_math.py convert t1_seg_1.nii.gz ${prefix}__mask_gm.nii.gz --data_type uint8
    scil_volume_math.py convert t1_seg_0.nii.gz ${prefix}__mask_csf.nii.gz --data_type uint8

    mv t1_pve_2.nii.gz ${prefix}__map_wm.nii.gz
    mv t1_pve_1.nii.gz ${prefix}__map_gm.nii.gz
    mv t1_pve_0.nii.gz ${prefix}__map_csf.nii.gz

    scil_volume_stats_in_ROI.py ${prefix}__mask*.nii.gz \
        --metrics $metrics \
        --sort_keys \
        $bin $normalize_weights > ${prefix}__stats.json
    """
}
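The numeric suffixes in the renames come from FSL fast, which orders its three default tissue classes by mean intensity on a T1w image (CSF darkest, WM brightest), giving the mapping assumed above:

```shell
# fast output            tissue   renamed to
# t1_pve_0 / t1_seg_0 -> CSF   -> ${prefix}__map_csf.nii.gz / ${prefix}__mask_csf.nii.gz
# t1_pve_1 / t1_seg_1 -> GM    -> ${prefix}__map_gm.nii.gz  / ${prefix}__mask_gm.nii.gz
# t1_pve_2 / t1_seg_2 -> WM    -> ${prefix}__map_wm.nii.gz  / ${prefix}__mask_wm.nii.gz
```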

4. Prepare the input structure for the module and include it in your main.nf

  1. Include your local module

    You can now include your local module and use it in the workflow, as shown in the previous steps.

    Add this line at the top of the main.nf.

    include { STATS_METRICSINROI } from './modules/local/stats/metricsinroi/main'

    Add the local module in the workflow.

    STATS_METRICSINROI( input_channel )
  2. Prepare the input channel for your local module

    The STATS_METRICSINROI module requires two inputs: a T1 image and a list of metrics.

    You can use the join() and map() operators to create an input channel that combines the T1 BET image output from the PREPROC_T1 subworkflow with the FA map from the DTI module.

    Add the input channel to the module.

    Expected input channel:

    input_extract_metric = PREPROC_T1.out.image_bet
        .join(RECONST_DTIMETRICS.out.fa)
        .map{ it }

    STATS_METRICSINROI( input_extract_metric )
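As a sketch of what join() produces here: both output channels are keyed by the same meta map, so join() merges the items that share that key (the file names below are illustrative):

```groovy
// PREPROC_T1.out.image_bet  : [ [id:'S1_ses1'], S1_ses1__t1_bet.nii.gz ]
// RECONST_DTIMETRICS.out.fa : [ [id:'S1_ses1'], S1_ses1__fa.nii.gz ]
//
// .join() matches items on the first element (the meta map) and
// concatenates the remaining elements:
// input_extract_metric      : [ [id:'S1_ses1'], S1_ses1__t1_bet.nii.gz, S1_ses1__fa.nii.gz ]
```

This shape matches the module's input declaration, tuple val(meta), path(t1), path(metrics). Note that the trailing .map{ it } is an identity mapping; it is kept as a placeholder where you could reshape the tuple if needed.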

5. Configure your local module

To configure your local module, you need to set each task.ext.* option it uses. As we’ve seen previously, use the withName process selector to link the ext parameters defined in your local module to the nextflow.config file.

Add these lines to your ./nextflow.config, inside the process scope:

process {
    withName: "STATS_METRICSINROI" {
        ext.bin = true
        ext.normalize_weights = false
    }
}
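With these values, the optional flags in the module's script block expand as follows (a sketch of the resulting command; subject and file names are illustrative):

```shell
# ext.bin = true                -> $bin               expands to "--bin "
# ext.normalize_weights = false -> $normalize_weights expands to "" (flag omitted)
#
# Resulting call, roughly:
# scil_volume_stats_in_ROI.py sub-01__mask*.nii.gz --metrics sub-01__fa.nii.gz \
#     --sort_keys --bin > sub-01__stats.json
```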

6. Verify your files

You now have a working local module in your Nextflow pipeline! Your complete main.nf should look like this:

#!/usr/bin/env nextflow

include { RECONST_DTIMETRICS } from './modules/nf-neuro/reconst/dtimetrics/main'
include { DENOISING_MPPCA } from './modules/nf-neuro/denoising/mppca/main'
include { PREPROC_T1 } from './subworkflows/nf-neuro/preproc_t1/main'
include { STATS_METRICSINROI } from './modules/local/stats/metricsinroi/main'

workflow get_data {
    main:
    if ( !params.input ) {
        log.info "You must provide an input directory containing all images using:"
        log.info ""
        log.info "    --input=/path/to/[input]    Input directory containing your subjects"
        log.info "                     |"
        log.info "                     ├-- S1"
        log.info "                     |    ├-- *dwi.nii.gz"
        log.info "                     |    ├-- *dwi.bval"
        log.info "                     |    ├-- *dwi.bvec"
        log.info "                     |    └-- *t1.nii.gz"
        log.info "                     └-- S2"
        log.info "                          ├-- *dwi.nii.gz"
        log.info "                          ├-- *bval"
        log.info "                          ├-- *bvec"
        log.info "                          └-- *t1.nii.gz"
        log.info ""
        error "Please resubmit your command with the previous file structure."
    }

    input = file(params.input)

    // ** Loading DWI files. ** //
    dwi_channel = Channel.fromFilePairs("$input/**/**/dwi/*dwi.{nii.gz,bval,bvec}", size: 3, flat: true)
        { it.parent.parent.parent.name + "_" + it.parent.parent.name } // Set the subject filename as subjectID + '_' + session.
        .map{ sid, bvals, bvecs, dwi -> [ [id: sid], dwi, bvals, bvecs ] } // Reordering the inputs.

    // ** Loading T1 file. ** //
    t1_channel = Channel.fromFilePairs("$input/**/**/anat/*T1w.nii.gz", size: 1, flat: true)
        { it.parent.parent.parent.name + "_" + it.parent.parent.name } // Set the subject filename as subjectID + '_' + session.
        .map{ sid, t1 -> [ [id: sid], t1 ] }

    emit:
    dwi = dwi_channel
    anat = t1_channel
}

workflow {
    inputs = get_data()

    // Use multiMap to split the tuple into a multi-input structure
    ch_dwi_bvalbvec = inputs.dwi
        .multiMap { meta, dwi, bval, bvec ->
            dwi: [ meta, dwi ]
            bvs_files: [ meta, bval, bvec ]
            dwi_bval_bvec: [ meta, dwi, bval, bvec ]
        }

    // Denoising DWI
    input_dwi_denoise = ch_dwi_bvalbvec.dwi
        .map{ it + [[]] }
    DENOISING_MPPCA( input_dwi_denoise )

    // Fetch specific output
    ch_dwi_denoised = DENOISING_MPPCA.out.image

    // Input DTI update with DWI denoised output
    input_dti_denoised = ch_dwi_denoised
        .join(ch_dwi_bvalbvec.bvs_files)
        .map{ it + [[]] }

    // DTI-derived metrics
    RECONST_DTIMETRICS( input_dti_denoised )

    // Preprocessing T1 images
    //inputs.anat.view()
    PREPROC_T1(
        inputs.anat,
        Channel.empty(),
        Channel.empty(),
        Channel.empty(),
        Channel.empty(),
        Channel.empty(),
        Channel.empty()
    )

    // Extract FA value
    input_extract_metric = PREPROC_T1.out.image_bet
        .join(RECONST_DTIMETRICS.out.fa)
        .map{ it }
    //input_extract_metric.view()
    STATS_METRICSINROI( input_extract_metric )
}

7. Run nextflow

Now you can run Nextflow:
nextflow run main.nf --input data -profile docker -resume