Part 3: Use and configure a nf-neuro module
In this part, we'll add an nf-neuro module for processing DWI images, and use a pre-installed module to compute diffusion tensor imaging (DTI) metrics. To do this, we will go through 8 successive sub-steps.
1. Include the module in your main.nf
Modify your main.nf file and insert the include {} statement at the top. This line adds the RECONST_DTIMETRICS module to your project.
#!/usr/bin/env nextflow

include { RECONST_DTIMETRICS } from './modules/nf-neuro/reconst/dtimetrics/main'
2. Use the module in the workflow
After importing the module, you can then use it in your workflow as follows:
Before:

workflow {
    // ** Now call your input workflow to fetch your files ** //
    data = get_data()
    data.dwi.view() // Contains your DWI data: [meta, dwi, bval, bvec]
}

After:

workflow {
    // ** Now call your input workflow to fetch your files ** //
    data = get_data()
    RECONST_DTIMETRICS( data.dwi )
}
3. Understand input data specific to the module
Before using the RECONST_DTIMETRICS module, it's essential to understand the file types it expects as input. For this, please refer to the API Documentation. The Inputs section of the module shows that it takes 4 input files (excluding meta):

Mandatory: dwi, bval, bvec
Optional: b0mask
We're now going to prepare the input data using Nextflow's map() channel operator. In our case, the input channel already contains dwi, bval and bvec. Since the mask is optional, we can handle it by appending an empty list.
Before:

workflow {
    // ** Now call your input workflow to fetch your files ** //
    data = get_data()
    RECONST_DTIMETRICS( data.dwi )
}

After:

workflow {
    // ** Now call your input workflow to fetch your files ** //
    data = get_data()
    input_dti_metric = data.dwi.map{ it + [[]] }
    RECONST_DTIMETRICS( input_dti_metric )
}
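To see what the map{ it + [[]] } closure does, here is the same transformation sketched in plain Python (not Nextflow; the file names are illustrative): each channel element is a list, and appending an empty list gives the optional-mask placeholder the module expects.

```python
# One channel element, structured as [meta, dwi, bval, bvec]
element = [{"id": "sub-003_ses-01"}, "dwi.nii.gz", "dwi.bval", "dwi.bvec"]

# Appending an empty list yields the 5-element input RECONST_DTIMETRICS expects,
# with [] standing in for the optional b0 mask
with_mask_placeholder = element + [[]]
print(with_mask_placeholder)
# -> [{'id': 'sub-003_ses-01'}, 'dwi.nii.gz', 'dwi.bval', 'dwi.bvec', []]
```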
4. Validate Input Data
To ensure that the new input_dti_metric channel is correctly structured, comment out the module call (using //) and use the .view() operator, which displays the channel contents directly in the terminal. This is very useful for debugging.
Before:

workflow {
    // ** Now call your input workflow to fetch your files ** //
    data = get_data()
    input_dti_metric = data.dwi.map{ it + [[]] }
    RECONST_DTIMETRICS( input_dti_metric )
}

After:

workflow {
    // ** Now call your input workflow to fetch your files ** //
    data = get_data()
    input_dti_metric = data.dwi.map{ it + [[]] }
    input_dti_metric.view()
    //RECONST_DTIMETRICS( input_dti_metric )
}
Now, you can run Nextflow:
nextflow run main.nf --input data -profile docker
[[id:sub-003_ses-01], /workspaces/nf-neuro-tutorial_test/data/sub-003/ses-01/dwi/sub-003_ses-01_dir-AP_dwi.nii.gz, /workspaces/nf-neuro-tutorial_test/data/sub-003/ses-01/dwi/sub-003_ses-01_dir-AP_dwi.bval, /workspaces/nf-neuro-tutorial_test/data/sub-003/ses-01/dwi/sub-003_ses-01_dir-AP_dwi.bvec, []]
You have now configured the inputs and checked that they respect the RECONST_DTIMETRICS module's expectations, including the handling of an optional file. The next step is to configure the module parameters.
5. Configure the module
Define module parameters
Each module may require specific parameters. To find the available parameters and their default values, check the Arguments section of the module in the API Documentation.

In the nextflow.config file, you set these parameters using the process selector (withName), which links each ext. parameter to the corresponding params. value.
process {
    withName: 'YOUR_MODULE' {
        ext.option1 = params.option1
        ext.args1 = boolean/value/str
    }
}
The RECONST_DTIMETRICS module requires a set of parameters to be added to nextflow.config. Please copy and paste the following into your nextflow.config after the manifest part.
process {
    withName: "RECONST_DTIMETRICS" {
        ext.ad = false
        ext.evecs = false
        ext.evals = false
        ext.fa = true
        ext.ga = false
        ext.rgb = false
        ext.md = true
        ext.mode = false
        ext.norm = false
        ext.rd = false
        ext.tensor = false
        ext.nonphysical = false
        ext.pulsation = false
        ext.residual = false
        ext.b0_thr_extract_b0 = 10
        ext.dwi_shell_tolerance = 50
        ext.max_dti_shell_value = 1200
        ext.run_qc = false
    }
}
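The ext.* boolean flags control which metric maps the module computes: with only ext.fa and ext.md set to true, only FA and MD volumes are produced. As a quick sanity check (a hypothetical helper sketched in Python, not part of the tutorial), you can predict the expected file names from the enabled flags:

```python
# Metric flags from the config above; True means the metric map is computed
flags = {"ad": False, "fa": True, "ga": False, "md": True, "mode": False,
         "norm": False, "rd": False, "rgb": False, "tensor": False}

subject = "sub-003_ses-01"
# Output files follow the <subject>__<metric>.nii.gz naming seen in the results
expected = [f"{subject}__{name}.nii.gz" for name, on in sorted(flags.items()) if on]
print(expected)
# -> ['sub-003_ses-01__fa.nii.gz', 'sub-003_ses-01__md.nii.gz']
```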
Fetching outputs from the module
Last but not least, you now have a working main.nf file. You could run the pipeline, but the output would be hard to access. To ensure easy access to results, define an output directory in nextflow.config using the publishDir directive:
Before:

process {
    withName: "RECONST_DTIMETRICS" {

After:

process {
    publishDir = { "${params.output}/$meta.id/${task.process.replaceAll(':', '-')}" }

    withName: "RECONST_DTIMETRICS" {
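The publishDir closure builds one result folder per subject and per process (replacing the : separator of nested workflow names with -, since : is not valid in a path). Assuming meta.id is sub-003_ses-01, the resulting path can be sketched in Python as:

```python
# Values assumed for illustration, matching this tutorial's data
params_output = "result"
meta_id = "sub-003_ses-01"
task_process = "RECONST_DTIMETRICS"  # a nested call would look like 'SUBWF:RECONST_DTIMETRICS'

# Same string interpolation as the publishDir closure
publish_dir = f"{params_output}/{meta_id}/{task_process.replace(':', '-')}"
print(publish_dir)
# -> result/sub-003_ses-01/RECONST_DTIMETRICS
```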
6. Verify your files
That's it! Your nextflow.config should look something like this:
profiles {
    docker {
        docker.enabled = true
        conda.enabled = false
        singularity.enabled = false
        podman.enabled = false
        shifter.enabled = false
        charliecloud.enabled = false
        apptainer.enabled = false
        docker.runOptions = '-u $(id -u):$(id -g)'
    }
}
manifest {
    name = 'scilus/nf-neuro-tutorial'
    description = """nf-neuro-tutorial is a Nextflow pipeline for processing neuroimaging data."""
    version = '0.1dev'
}
params.input = false
params.output = 'result'
process {
    publishDir = { "${params.output}/$meta.id/${task.process.replaceAll(':', '-')}" }

    withName: "RECONST_DTIMETRICS" {
        ext.ad = false
        ext.evecs = false
        ext.evals = false
        ext.fa = true
        ext.ga = false
        ext.rgb = false
        ext.md = true
        ext.mode = false
        ext.norm = false
        ext.rd = false
        ext.tensor = false
        ext.nonphysical = false
        ext.pulsation = false
        ext.residual = false
        ext.b0_thr_extract_b0 = 10
        ext.dwi_shell_tolerance = 50
        ext.max_dti_shell_value = 1200
        ext.run_qc = false
    }
}
7. Run Nextflow
Now, you can run Nextflow:
nextflow run main.nf --input data -profile docker
You should see this output:
executor > local (1)
[a8/7a8x13] process > RECONST_DTIMETRICS (sub-003_ses-01) [100%] 1 of 1 ✔
8. Visualize data in result folder
You can check the module’s output files with the following command, or use the VSCode interface to display FA and MD images via the NiiVue extension (pre-installed).
ls ./result/sub-003_ses-01/RECONST_DTIMETRICS/
You should see this output:
sub-003_ses-01__fa.nii.gz sub-003_ses-01__md.nii.gz versions.yml