
Add test summary status check, handle missing tests #25

Merged: 3 commits, Mar 8, 2024
22 changes: 18 additions & 4 deletions .github/workflows/nextflow-tests.yml
@@ -6,7 +6,7 @@ on:

# Inspired by https://blog.aspect.dev/github-actions-dynamic-matrix
jobs:
-discover-tests:
+discover:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
@@ -25,18 +25,21 @@ jobs:
with open(os.environ.get("GITHUB_OUTPUT"),
mode="w", encoding="utf-8") as outfile:
outfile.write(f"testfiles={json.dumps(testfiles)}\n")
+outfile.write(f"num_tests={json.dumps(bool(testfiles))}\n")

outputs:
testfiles: ${{ steps.listfiles.outputs.testfiles }}
+num_tests: ${{ steps.listfiles.outputs.num_tests }}

-run-test:
+run:
runs-on: ubuntu-latest
-needs: discover-tests
+needs: discover
+if: ${{ fromJSON(needs.discover.outputs.num_tests) }}

strategy:
fail-fast: false
matrix:
-testfile: ${{ fromJSON(needs.discover-tests.outputs.testfiles) }}
+testfile: ${{ fromJSON(needs.discover.outputs.testfiles) }}

steps:
- uses: actions/checkout@v4
@@ -67,3 +70,14 @@ jobs:
name: ${{ steps.dockertest.outputs.archive_key }}
path: ${{ steps.dockertest.outputs.archive_path }}
if: ${{ !cancelled() }}

+summary:
+runs-on: ubuntu-latest
+needs: run
+if: ${{ !cancelled() }}
+
+steps:
+- uses: actions/github-script@v7
+if: ${{ needs.run.result != 'success' && needs.run.result != 'skipped' }}
+with:
+script: core.setFailed('Tests failed!')
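The diff above truncates the `listfiles` step that feeds the matrix. The pattern it follows can be sketched in Python — note that the `test/*.yaml` glob and the `github_output.txt` fallback path are assumptions for illustration, not values taken from this workflow:

```python
import glob
import json
import os

# Enumerate test files; this glob pattern is a stand-in for whatever
# the real discover step actually searches for.
testfiles = sorted(glob.glob("test/*.yaml"))

# GITHUB_OUTPUT names a file; each "key=value" line in it becomes a step
# output. The local fallback path lets this sketch run outside of Actions.
output_path = os.environ.get("GITHUB_OUTPUT", "github_output.txt")
with open(output_path, mode="w", encoding="utf-8") as outfile:
    # The matrix consumes this JSON array via fromJSON().
    outfile.write(f"testfiles={json.dumps(testfiles)}\n")
    # bool([]) is False, so num_tests serializes to "false" when nothing
    # was discovered, and the downstream job's `if:` skips the empty matrix.
    outfile.write(f"num_tests={json.dumps(bool(testfiles))}\n")
```

Because `json.dumps(bool(testfiles))` emits the literal `true` or `false`, the `run` job can gate itself with `if: ${{ fromJSON(needs.discover.outputs.num_tests) }}` and no string comparison is needed.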
2 changes: 2 additions & 0 deletions CHANGELOG.md
@@ -14,6 +14,7 @@ This project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.htm
- Add reusable workflow to run Nextflow regression tests
- Add new `nextflow-config-tests` Docker image linked to this repository
- Add `nfconfigtest` script to run regression tests locally
+- Add `summary` check for Nextflow tests

### Changed
- Update output file name to explicitly specify `submodules`
@@ -26,6 +27,7 @@ This project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.htm
- No longer fail on broken links
- Properly format headings with embedded markdown
- Handle constructing anchor links for repeated headings
+- No longer fail when no Nextflow tests are discovered

---

6 changes: 4 additions & 2 deletions run-nextflow-tests/README.md
@@ -87,13 +87,15 @@ Once enabled on a pipeline, this Action will perform checks like the following o

![Image of status checks](docs/status_checks.png)

-The `discover-tests` check should always succeed. It creates one `run-test` check per discovered test file, each of which can succeed or fail independently.
+The `discover` check should always succeed. It creates one `run` check per discovered test file, each of which can succeed or fail independently.

+The `summary` check will succeed if all `run`s succeed, or if no test files were discovered. This makes it suitable for use as a [required status check](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/collaborating-on-repositories-with-code-quality-features/about-status-checks).

Any differences with the expected results are displayed as annotations in the pull request's code view:

![Diff annotation](docs/annotation.png)

-Each `run-test` check saves a new and valid test file as an artifact. This makes it easy to update failing tests - you can simply overwrite the failing test with the artifact and commit the changes (after verifying that they are expected).
+Each `run` check saves a new and valid test file as an artifact. This makes it easy to update failing tests - you can simply overwrite the failing test with the artifact and commit the changes (after verifying that they are expected).

![Artifact files](docs/artifacts.png)
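The `summary` gate added in the workflow reduces to a small predicate over the upstream job's result. A sketch of that logic in Python — the function name is mine, and the real check is the one-line `actions/github-script` call shown in the workflow diff:

```python
def summary_should_fail(run_result: str) -> bool:
    """Mirror the workflow's condition: fail the summary check unless the
    matrix either fully succeeded or was skipped (no tests discovered)."""
    return run_result not in ("success", "skipped")

# A GitHub Actions job result is one of: success, failure, cancelled, skipped.
for result in ("success", "failure", "cancelled", "skipped"):
    print(result, "->", "fail" if summary_should_fail(result) else "pass")
```

Treating `skipped` as a pass is what makes the check safe to require on repositories that have no test files yet.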

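The overwrite-and-commit step described above can be scripted. A sketch with hypothetical paths — `artifact/pipeline.yaml` for the downloaded artifact and `test/pipeline.yaml` for the failing test are illustrative names, not paths the Action guarantees:

```python
import pathlib
import shutil

# Hypothetical locations; substitute the downloaded artifact and the
# failing test file from your own pipeline repository.
artifact = pathlib.Path("artifact/pipeline.yaml")
target = pathlib.Path("test/pipeline.yaml")

# Stand-in content so this sketch is runnable end to end.
artifact.parent.mkdir(parents=True, exist_ok=True)
target.parent.mkdir(parents=True, exist_ok=True)
artifact.write_text("process_output: updated expected results\n")

# Replace the failing test with the freshly generated one, then review
# the diff and commit once the changes look expected.
shutil.copyfile(artifact, target)
```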
Binary file modified run-nextflow-tests/docs/status_checks.png