Create test cases

Adding tests to your module is a critical step to ensure it works correctly and keeps doing so as the codebase evolves. The nf-core command automatically generates the test infrastructure when creating a module. Inside the tests directory of your module, you'll find main.nf.test, which contains the inputs, test cases, and assertion instructions for nf-test, the Nextflow testing framework. Open it and follow along with the rest of this section.

Setup sections are used to prepare data before the test cases run: downloading data, preparing algorithms, and generating expected return values. In nf-neuro, test datasets are provided via the LOAD_TEST_DATA subworkflow (refer to this section to learn how to find packages). Once you've selected the archives and files you need, add a setup section before the test section:

setup {
    run("LOAD_TEST_DATA", alias: "LOAD_DATA") {
        script "../../../../../subworkflows/nf-neuro/load_test_data/main.nf"
        workflow {
            """
            input[0] = Channel.from( [ "<archive>" ] )
            input[1] = "test.load-test-data"
            """
        }
    }
}

Replace <archive> with the name of the archive you need and you'll be able to access its contents within your test suite!
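
For instance, to fetch the raw_b0.zip archive (used again in the complete example at the end of this section), the setup inputs become:

input[0] = Channel.from( [ "raw_b0.zip" ] )
input[1] = "test.load-test-data"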

Each test case is defined in its own test block:

test("example - test") {
when{
process {
...
}
}
then{
...
}
}

A block minimally contains a when (what to test) and a then (what to assert). Inside them, there is no need to name or import the process being tested: nf-test takes care of that when the test is run, through the process placeholder inside the when block.

The only job of the when block is to define the inputs to supply to your module, which you do inside the process block. The example below uses flatMap to emit two input tuples, one without and one with the optional mask:

test("example - test") {
when {
process {
"""
input[0] = LOAD_DATA.out.test_data_directory.map{
test_data_directory -> [
[ id:'test1', single_end:false ], // meta map // -> meta
file("\${test_data_directory}/image.nii.gz"), // -> image file
[] // -> an optional input, not provided
],
[
[ id:'test2', single_end:false ], // meta map // -> meta
file("\${test_data_directory}/image.nii.gz"), // -> image file
file("\${test_data_directory}/mask.nii.gz") // -> an optional input, provided this time
]
}
"""
}
}
then {
...
}
}
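
For reference, the empty list works because Nextflow accepts [] as a placeholder for an absent optional path input. The matching module declaration would look something like this hypothetical fragment:

input:
tuple val(meta), path(image), path(mask)    // mask is optional: the test passes [] when it is absent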

The then block is where you define the assertions to check the outputs of your module. The process placeholder exposes the outputs of the module through its out attribute, as well as boolean indicators of success and failure. Each assertion is prefixed with the keyword assert, and all are enclosed in an assertAll block:

then {
    assertAll(
        { assert process.success },
        { ... }
    )
}
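
Any Groovy assertion can be placed inside. For example, assuming the module declares a named emit called image (a hypothetical name), you could check that both test tuples produced an output:

{ assert process.out.image.size() == 2 }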

Each file produced by the module under test must go through a reproducibility check, which nf-test provides through the snapshot function:

then {
    assertAll(
        { assert process.success },
        { assert snapshot(process.out).match() }
    )
}

Module configuration is done using a nextflow.config file. You provide it either globally or inside each test case, using the config parameter:

nextflow_process {
    name "Test Process <CATEGORY>_<TOOL>"
    script "../main.nf"
    process "<CATEGORY>_<TOOL>"
    config "./nextflow.config"
    ...

The nextflow.config file does not exist by default, so you will have to create it if needed. It is not mandatory, unless you have defined optional parameters with task.ext and want to alter their values for some test cases. Refer to this section to see how to scope those parameters to specific tests using selectors.
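
As a minimal sketch, assuming your module reads a hypothetical task.ext.sigma parameter in its script section, the nextflow.config could contain:

process {
    withName: "DENOISING_NLMEANS" {
        ext.sigma = 0.5    // hypothetical parameter, read in the module through task.ext.sigma
    }
}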

All modules in nf-neuro contain a stub section, which requires a test case in the tests/main.nf.test file. It may seem redundant, but those tests are essential to the CI/CD infrastructure: they quickly verify that downstream workflows are wired correctly, without executing the module itself. We call this functional testing; you'll learn more about it if you contribute a subworkflow.

To enable those tests for downstream users, create a new test case, as done previously, specifying inputs and potential configuration. To run the module in stub mode, add the tag "stub" and options "-stub-run" clauses before the when block.

test("stub - test") {
tag "stub"
options "-stub-run"
config "./nextflow.config"
when {
process {
"""
input[0] = LOAD_DATA.out.test_data_directory.map{
test_data_directory -> [
[ id:'test_stub', single_end:false ], // meta map // -> meta
file("\${test_data_directory}/image.nii.gz") // -> image file
]
}
"""
}
}
}

For stubs, the only assertion needed is on the versions.yml file generated by the module. Here is a complete stub test, with an assertion on the versions.yml file:

test("stub - test") {
tag "stub"
options "-stub-run"
config "./nextflow.config"
when {
process {
"""
input[0] = LOAD_DATA.out.test_data_directory.map{
test_data_directory -> [
[ id:'test_stub', single_end:false ], // meta map // -> meta
file("\${test_data_directory}/image.nii.gz") // -> image file
]
}
"""
}
}
then {
assertAll(
{ assert process.success },
{ assert snapshot(process.out.versions).match() }
)
}
}

Once you have correctly set up your test cases and made sure the data is available, the module has to be pre-tested, so the output files it generates are snapshotted correctly before being pushed to nf-neuro.

To do so, run:

nf-core modules test -u <category>/<tool>
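
For the denoising/nlmeans module used as the example below, this becomes:

nf-core modules test -u denoising/nlmeans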

All the test cases you defined will be run; watch out for errors! Once everything runs smoothly, look at the snapshot file produced at tests/main.nf.test.snap in your module's directory and validate that ALL outputs produced by the test cases are captured. Their md5sum is critical to ensure future executions of your tests produce valid outputs.

A complete example for the denoising/nlmeans module


Since we used the denoising/nlmeans module as an example from the beginning of this documentation, let's create its test cases as a final example. It contains a run test, providing an image and an optional mask as inputs, followed by the corresponding stub test:

nextflow_process {
    name "Test Process DENOISING_NLMEANS"
    script "../main.nf"
    process "DENOISING_NLMEANS"
    config "./nextflow.config"

    tag "modules"
    tag "modules_nfcore"
    tag "denoising"
    tag "denoising/nlmeans"
    tag "subworkflows"
    tag "subworkflows/load_test_data"

    setup {
        run("LOAD_TEST_DATA", alias: "LOAD_DATA") {
            script "../../../../../subworkflows/nf-neuro/load_test_data/main.nf"
            workflow {
                """
                input[0] = Channel.from( [ "raw_b0.zip", "raw_segmentation.zip" ] )
                input[1] = "test.load-test-data"
                """
            }
        }
    }

    test("denoising - nlmeans") {
        when {
            process {
                """
                ch_split_test_data = LOAD_DATA.out.test_data_directory
                    .branch{
                        b0: it.simpleName == "raw_b0"
                        segmentation: it.simpleName == "raw_segmentation"
                    }
                ch_b0 = ch_split_test_data.b0.map{
                    test_data_directory -> [
                        [ id:'test' ],
                        file("\${test_data_directory}/b0.nii.gz")
                    ]
                }
                ch_mask = ch_split_test_data.segmentation.map{
                    test_data_directory -> [
                        [ id:'test' ],
                        file("\${test_data_directory}/brainmask/slices/axial.nii.gz")
                    ]
                }
                input[0] = ch_b0
                    .join(ch_mask)
                """
            }
        }
        then {
            assertAll(
                { assert process.success },
                { assert snapshot(process.out).match() }
            )
        }
    }

    test("stub - test") {
        tag "stub"
        options "-stub-run"
        config "./nextflow.config"
        when {
            process {
                """
                input[0] = LOAD_DATA.out.test_data_directory
                    .filter{ it.simpleName == "raw_b0" }
                    .map{
                        test_data_directory -> [
                            [ id:'test_stub', single_end:false ],       // meta map
                            file("\${test_data_directory}/b0.nii.gz"),  // image file
                            []                                          // optional mask, not provided
                        ]
                    }
                """
            }
        }
        then {
            assertAll(
                { assert process.success },
                { assert snapshot(process.out.versions).match() }
            )
        }
    }
}