Merge branch 'dev' into use-pytest-fixtures
fabianegli authored Mar 31, 2023
2 parents b436669 + 27318f1 commit a935cae
Showing 17 changed files with 224 additions and 29 deletions.
1 change: 1 addition & 0 deletions .github/workflows/pytest-frozen-ubuntu-20.04.yml
@@ -33,6 +33,7 @@ jobs:
- name: Downgrade git to the Ubuntu official repository's version
run: |
sudo apt update
sudo apt remove git git-man
sudo add-apt-repository --remove ppa:git-core/ppa
sudo apt install git
24 changes: 24 additions & 0 deletions .github/workflows/stale.yml
@@ -0,0 +1,24 @@
name: "Close stale issues and stale PRs"
on:
schedule:
- cron: "30 1 * * 7" # Once a week

jobs:
stale:
runs-on: ubuntu-latest
permissions:
issues: write
pull-requests: write
steps:
- uses: actions/stale@v7
with:
stale-issue-message: "This issue is stale because it has been inactive for more than 30 days. More information is required. Remove stale label or comment or this will be closed in 20 days."
stale-pr-message: "This PR is stale because it has been open more than 45 days with no activity. Remove stale label or comment if it is still useful. In any case a PR will be automatically closed."
close-issue-message: "This issue was closed because it has been stalled for 20 days with no activity."
days-before-stale: 30
days-before-close: 20
days-before-pr-close: -1
any-of-labels: "awaiting-changes,awaiting-feedback"
exempt-issue-labels: "WIP"
exempt-pr-labels: "WIP"
repo-token: ${{ secrets.GITHUB_TOKEN }}
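Roughly, the two thresholds above combine as follows for an issue that receives no further activity. This is a minimal Python sketch for illustration only; it is not part of the workflow, and the real `actions/stale` logic also handles labels, exemptions and the per-PR settings.

```python
from datetime import datetime, timedelta, timezone

DAYS_BEFORE_STALE = 30  # mirrors days-before-stale above
DAYS_BEFORE_CLOSE = 20  # mirrors days-before-close above (days-before-pr-close: -1 disables PR auto-closing)

def classify_issue(last_activity: datetime) -> str:
    """Rough sketch of how the thresholds combine for an issue with no further activity."""
    idle_days = (datetime.now(timezone.utc) - last_activity).days
    if idle_days < DAYS_BEFORE_STALE:
        return "active"
    if idle_days < DAYS_BEFORE_STALE + DAYS_BEFORE_CLOSE:
        return "stale"   # stale label and stale-issue-message applied
    return "closed"      # closed with close-issue-message

print(classify_issue(datetime.now(timezone.utc) - timedelta(days=40)))  # -> stale
```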
7 changes: 7 additions & 0 deletions CHANGELOG.md
@@ -7,13 +7,17 @@
- Turn on automatic clean up of intermediate files in `work/` on successful pipeline completion in full-test config ([#2163](https://github.com/nf-core/tools/pull/2163)) [Contributed by @jfy133]
- Add documentation to `usage.md` on how to use `params.yml` files, based on nf-core/ampliseq text ([#2173](https://github.com/nf-core/tools/pull/2173/)) [Contributed by @jfy133, @d4straub]
- Make jobs automatically resubmit for a much wider range of exit codes (now `104` and `130..145`) ([#2170](https://github.com/nf-core/tools/pull/2170))
- Add a stale GHA that marks and closes stale issues and PRs with specific labels ([#2183](https://github.com/nf-core/tools/pull/2183))
- Remove problematic sniffer code in samplesheet_check.py that could give false-positive 'missing header' errors ([#2194](https://github.com/nf-core/tools/pull/2194)) [Contributed by @Midnighter, @jfy133]
- Consistent syntax for branch checks in PRs ([#2202](https://github.com/nf-core/tools/issues/2202))
- Fixed minor Jinja2 templating bug that caused the PR template to miss a newline
- Updated AWS tests to use newly moved `seqeralabs/action-tower-launch` instead of `nf-core/tower-action`

### Linting

- Update modules lint test to fail if enable_conda is found ([#2213](https://github.com/nf-core/tools/pull/2213))
- Read module lint configuration from `.nf-core.yml`, not `.nf-core-lint.yml` ([#2221](https://github.com/nf-core/tools/pull/2221))
- `nf-core schema lint` now defaults to linting `nextflow_schema.json` if no filename is provided ([#2225](https://github.com/nf-core/tools/pull/2225))

### Modules

@@ -23,10 +27,13 @@

- Fixing problem when a module included in a subworkflow had a name change from TOOL to TOOL/SUBTOOL ([#2177](https://github.com/nf-core/tools/pull/2177))
- Fix `nf-core subworkflows test` not running subworkflow tests ([#2181](https://github.com/nf-core/tools/pull/2181))
- Add tests for `nf-core subworkflows create-test-yml` ([#2219](https://github.com/nf-core/tools/pull/2219))

### General

- `nf-core modules/subworkflows info` now prints the include statement for the module/subworkflow ([#2182](https://github.com/nf-core/tools/pull/2182)).
- Add a stale GHA that marks and closes stale issues and PRs with specific labels ([#2183](https://github.com/nf-core/tools/pull/2183))
- Update minimum version of rich to 13.3.1 ([#2185](https://github.com/nf-core/tools/pull/2185))
- Add the Nextflow version to Gitpod container matching the minimal Nextflow version for nf-core (according to `nextflow.config`) ([#2196](https://github.com/nf-core/tools/pull/2196))
- Use `nfcore/gitpod:dev` container in the dev branch ([#2196](https://github.com/nf-core/tools/pull/2196))
- Replace requests_mock with responses in test mocks ([#2165](https://github.com/nf-core/tools/pull/2165)).
8 changes: 4 additions & 4 deletions README.md
@@ -612,7 +612,7 @@ The graphical interface is organised in groups and within the groups the single

Now you can start to change the parameter itself. The `ID` of a new parameter should be defined in lowercase letters without whitespace. The description is a short free-text explanation of the parameter that appears when you run your pipeline with the `--help` flag. By clicking on the dictionary icon you can add a longer explanation for the parameter page of your pipeline. Usually, this contains a small paragraph about the parameter settings or the data source used, such as databases or references. If you want to specify conditions for your parameter, such as the file extension, you can use the nut icon to open the settings. This menu depends on the `type` you assigned to your parameter. For integers you can define a min and max value, and for strings the file extension can be specified.

The `type` field is one of the most important points in your pipeline schema, since it defines the datatype of your input and how it will be interpreted. This allows extensive testing prior to starting the pipeline.

The basic datatypes for a pipeline schema are:

@@ -621,7 +621,7 @@ The basic datatypes for a pipeline schema are:
- `integer`
- `boolean`

For the `string` type you have three different options in the settings (nut icon): `enumerated values`, `pattern` and `format`. The first option, `enumerated values`, allows you to specify a list of specific input values. The list has to be separated with a pipe. The `pattern` and `format` settings can depend on each other. The `format` has to be either a directory or a file path. Depending on the `format` setting selected, specifying the `pattern` setting can be the most efficient and time-saving option, especially for `file paths`. The `number` and `integer` types share the same settings. Similar to `string`, there is an `enumerated values` option with the possibility of specifying a `min` and `max` value. For the `boolean` type there are no further settings and the default value is usually `false`. The `boolean` value can be switched to `true` by adding the flag to the command. This parameter type is often used to skip specific sections of a pipeline.
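As a rough illustration of how these settings end up as constraints in `nextflow_schema.json`, the sketch below validates values against a hypothetical property block using Python's `jsonschema` package. The property names, pattern and limits are invented for the example; in practice the builder generates and checks the schema for you.

```python
import jsonschema

# Hypothetical fragment of a nextflow_schema.json, expressed as a Python dict.
params_schema = {
    "type": "object",
    "properties": {
        "input": {
            "type": "string",
            "format": "file-path",      # corresponds to the 'format' setting
            "pattern": r"^\S+\.csv$",   # corresponds to the 'pattern' setting
        },
        "max_cpus": {"type": "integer", "minimum": 1, "maximum": 128},
        "skip_qc": {"type": "boolean", "default": False},
    },
}

jsonschema.validate({"input": "samples.csv", "max_cpus": 16}, params_schema)  # passes
try:
    jsonschema.validate({"input": "samples.tsv"}, params_schema)
except jsonschema.ValidationError as err:
    print(err.message)  # the pattern constraint rejects the .tsv path
```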

After filling the schema, click on the `Finished` button in the top right corner; this will automatically update your `nextflow_schema.json`. If this does not work, the schema can be copied from the graphical interface and pasted into your `nextflow_schema.json` file.

@@ -634,13 +634,13 @@ It's important to change the default value of a parameter in the `nextflow.confi
The pipeline schema is linted as part of the main pipeline `nf-core lint` command;
however, it can sometimes be useful to quickly check the syntax of the JSON Schema without running a full lint.

Usage is `nf-core schema lint <schema>`, e.g.:
Usage is `nf-core schema lint <schema>` (defaulting to `nextflow_schema.json`), e.g.:

<!-- RICH-CODEX
working_dir: tmp/nf-core-nextbigthing
-->

![`nf-core schema lint nextflow_schema.json`](docs/images/nf-core-schema-lint.svg)
![`nf-core schema lint`](docs/images/nf-core-schema-lint.svg)

## Bumping a pipeline version number

6 changes: 5 additions & 1 deletion nf_core/__main__.py
@@ -1398,7 +1398,9 @@ def build(dir, no_prompts, web_only, url):

# nf-core schema lint
@schema.command()
@click.argument("schema_path", type=click.Path(exists=True), required=True, metavar="<pipeline schema>")
@click.argument(
"schema_path", type=click.Path(exists=True), default="nextflow_schema.json", metavar="<pipeline schema>"
)
def lint(schema_path):
"""
Check that a given pipeline schema is valid.
@@ -1408,6 +1410,8 @@ def lint(schema_path):
This function runs as part of the nf-core lint command, this is a convenience
command that does just the schema linting nice and quickly.
If no schema path is provided, "nextflow_schema.json" will be used (if it exists).
"""
schema_obj = nf_core.schema.PipelineSchema()
try:
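In practice this means `nf-core schema lint` can now be run without any argument. A stripped-down sketch of the same click pattern follows; it is a standalone illustration, not the actual nf-core CLI wiring.

```python
import click

@click.command()
@click.argument("schema_path", type=click.Path(exists=True), default="nextflow_schema.json")
def lint(schema_path):
    """Lint a schema file, falling back to nextflow_schema.json when no path is given."""
    click.echo(f"Linting {schema_path}")

if __name__ == "__main__":
    lint()  # with no argument, lints nextflow_schema.json if it exists
```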
27 changes: 22 additions & 5 deletions nf_core/modules/lint/main_nf.py
@@ -257,9 +257,26 @@ def check_process_section(self, lines, fix_version, progress_bar):
self.warned.append(("process_standard_label", "Process label unspecified", self.main_nf))
for i, l in enumerate(lines):
url = None
if _container_type(l) == "bioconda":
bioconda_packages = [b for b in l.split() if "bioconda::" in b]
l = l.strip(" '\"")
if _container_type(l) == "conda":
bioconda_packages = [b for b in l.split() if "bioconda::" in b]
match = re.search(r"params\.enable_conda", l)
if match is None:
self.passed.append(
(
"deprecated_enable_conda",
f"Deprecated parameter 'params.enable_conda' correctly not found in the conda definition",
self.main_nf,
)
)
else:
self.failed.append(
(
"deprecated_enable_conda",
f"Found deprecated parameter 'params.enable_conda' in the conda definition",
self.main_nf,
)
)
if _container_type(l) == "singularity":
# e.g. "https://containers.biocontainers.pro/s3/SingImgsRepo/biocontainers/v1.2.0_cv1/biocontainers_v1.2.0_cv1.img' :" -> v1.2.0_cv1
# e.g. "https://depot.galaxyproject.org/singularity/fastqc:0.11.9--0' :" -> 0.11.9--0
@@ -471,7 +488,7 @@ def _fix_module_version(self, current_version, latest_version, singularity_tag,
for line in lines:
l = line.strip(" '\"")
build_type = _container_type(l)
if build_type == "bioconda":
if build_type == "conda":
new_lines.append(re.sub(rf"{current_version}", f"{latest_version}", line))
elif build_type in ("singularity", "docker"):
# Check that the new url is valid
@@ -516,8 +533,8 @@ def _get_build(response):

def _container_type(line):
"""Returns the container type of a build."""
if re.search("bioconda::", line):
return "bioconda"
if line.startswith("conda"):
return "conda"
if line.startswith("https://containers") or line.startswith("https://depot"):
# Look for a http download URL.
# Thanks Stack Overflow for the regex: https://stackoverflow.com/a/3809435/713980
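For reference, the reworked detection can be sketched in isolation as below. This is an illustrative standalone function, not the full lint implementation, and the final fall-through is simplified.

```python
import re

def classify_and_check(line: str):
    """Classify a stripped process line and flag the deprecated params.enable_conda pattern."""
    line = line.strip(" '\"")
    if line.startswith("conda"):  # e.g. conda "bioconda::fastqc=0.11.9"
        deprecated = re.search(r"params\.enable_conda", line) is not None
        return "conda", deprecated
    if line.startswith("https://containers") or line.startswith("https://depot"):
        return "singularity", False
    return "docker", False  # the real code inspects the line further before deciding

print(classify_and_check('conda "bioconda::fastqc=0.11.9"'))                                 # ('conda', False)
print(classify_and_check('conda (params.enable_conda ? "bioconda::fastqc=0.11.9" : null)'))  # ('conda', True)
```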
@@ -14,7 +14,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Launch workflow via tower
uses: nf-core/tower-action@v3
uses: seqeralabs/action-tower-launch@v1
# TODO nf-core: You can customise AWS full pipeline tests as required
# Add full size test data (but still relatively small datasets for few samples)
# on the `test_full.config` test runs with only one set of parameters {%- raw %}
2 changes: 1 addition & 1 deletion nf_core/pipeline-template/.github/workflows/awstest.yml
@@ -12,7 +12,7 @@
steps:
# Launch workflow using Tower CLI tool action {%- raw %}
- name: Launch workflow via tower
uses: nf-core/tower-action@v3
uses: seqeralabs/action-tower-launch@v1
with:
workspace_id: ${{ secrets.TOWER_WORKSPACE_ID }}
access_token: ${{ secrets.TOWER_ACCESS_TOKEN }}
24 changes: 24 additions & 0 deletions nf_core/pipeline-template/.github/workflows/stale.yml
@@ -0,0 +1,24 @@
name: "Close stale issues and stale PRs"
on:
schedule:
- cron: "30 1 * * 7" # Once a week

jobs:
stale:
runs-on: ubuntu-latest
permissions:
issues: write
pull-requests: write
steps:
- uses: actions/stale@v7
with:
stale-issue-message: "This issue is stale because it has been inactive for more than 30 days. More information is required. Remove stale label or comment or this will be closed in 20 days."
stale-pr-message: "This PR is stale because it has been open more than 45 days with no activity. Remove stale label or comment if it is still useful. In any case a PR will be automatically closed."
close-issue-message: "This issue was closed because it has been stalled for 20 days with no activity."
days-before-stale: 30
days-before-close: 20
days-before-pr-close: -1
any-of-labels: "awaiting-changes,awaiting-feedback"
exempt-issue-labels: "WIP"
exempt-pr-labels: "WIP"
repo-token: ${{ secrets.GITHUB_TOKEN }}
2 changes: 1 addition & 1 deletion nf_core/subworkflow-template/tests/main.nf
@@ -4,7 +4,7 @@ nextflow.enable.dsl = 2

include { {{ subworkflow_name|upper }} } from '../../../../subworkflows/{{ org }}/{{ subworkflow_dir }}/main.nf'

workflow test_{{ subworkflow_name }} {
workflow test_{{ component_name_underscore }} {
{% if has_meta %}
input = [
[ id:'test' ], // meta map
6 changes: 3 additions & 3 deletions nf_core/subworkflows/test_yml_builder.py
@@ -139,7 +139,7 @@ def scrape_workflow_entry_points(self):
if match:
self.entry_points.append(match.group(1))
if len(self.entry_points) == 0:
raise UserWarning("No workflow entry points found in 'self.module_test_main'")
raise UserWarning(f"No workflow entry points found in '{self.subworkflow_test_main}'")

def build_all_tests(self):
"""
@@ -195,7 +195,7 @@ def build_single_test(self, entry_point):
).strip()
ep_test["tags"] = [t.strip() for t in prompt_tags.split(",")]

ep_test["files"] = self.get_md5_sums(entry_point, ep_test["command"])
ep_test["files"] = self.get_md5_sums(ep_test["command"])

return ep_test

@@ -272,7 +272,7 @@ def create_test_file_dict(self, results_dir, is_repeat=False):

return test_files

def get_md5_sums(self, entry_point, command, results_dir=None, results_dir_repeat=None):
def get_md5_sums(self, command, results_dir=None, results_dir_repeat=None):
"""
Recursively go through directories and subdirectories
and generate tuples of (<file_path>, <md5sum>)
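For context, the checksum collection that `get_md5_sums` performs (now driven only by the command and the results directories) can be sketched roughly as follows; this is a simplified stand-in, not the nf-core implementation.

```python
import hashlib
from pathlib import Path

def md5_sums(results_dir):
    """Walk a results directory and yield (relative_path, md5sum) tuples."""
    for path in sorted(Path(results_dir).rglob("*")):
        if path.is_file():
            digest = hashlib.md5(path.read_bytes()).hexdigest()
            yield str(path.relative_to(results_dir)), digest
```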
4 changes: 2 additions & 2 deletions requirements.txt
@@ -15,6 +15,6 @@ questionary>=1.8.0
refgenie
requests
requests_cache
rich-click>=1.0.0
rich>=10.7.0
rich-click>=1.6.1
rich>=13.3.1
tabulate
14 changes: 6 additions & 8 deletions tests/modules/create_test_yml.py
@@ -11,19 +11,19 @@
@with_temporary_folder
def test_modules_custom_yml_dumper(self, out_dir):
"""Try to create a yml file with the custom yml dumper"""
yml_output_path = os.path.join(out_dir, "test.yml")
yml_output_path = Path(out_dir, "test.yml")
meta_builder = nf_core.modules.ModulesTestYmlBuilder("test/tool", self.pipeline_dir, False, "./", False, True)
meta_builder.test_yml_output_path = yml_output_path
meta_builder.tests = [{"testname": "myname"}]
meta_builder.print_test_yml()
assert os.path.isfile(yml_output_path)
assert Path(yml_output_path).is_file()


@with_temporary_folder
def test_modules_test_file_dict(self, test_file_dir):
"""Create dict of test files and create md5 sums"""
meta_builder = nf_core.modules.ModulesTestYmlBuilder("test/tool", self.pipeline_dir, False, "./", False, True)
with open(os.path.join(test_file_dir, "test_file.txt"), "w") as fh:
with open(Path(test_file_dir, "test_file.txt"), "w") as fh:
fh.write("this line is just for testing")
test_files = meta_builder.create_test_file_dict(test_file_dir)
assert len(test_files) == 1
@@ -34,7 +34,7 @@ def test_modules_test_file_dict(self, test_file_dir):
def test_modules_create_test_yml_get_md5(self, test_file_dir):
"""Get md5 sums from a dummy output"""
meta_builder = nf_core.modules.ModulesTestYmlBuilder("test/tool", self.pipeline_dir, False, "./", False, True)
with open(os.path.join(test_file_dir, "test_file.txt"), "w") as fh:
with open(Path(test_file_dir, "test_file.txt"), "w") as fh:
fh.write("this line is just for testing")
test_files = meta_builder.get_md5_sums(command="dummy", results_dir=test_file_dir, results_dir_repeat=test_file_dir)
assert test_files[0]["md5sum"] == "2191e06b28b5ba82378bcc0672d01786"
@@ -43,9 +43,7 @@ def test_modules_create_test_yml_entry_points(self):
def test_modules_create_test_yml_entry_points(self):
"""Test extracting test entry points from a main.nf file"""
meta_builder = nf_core.modules.ModulesTestYmlBuilder("bpipe/test", self.pipeline_dir, False, "./", False, True)
meta_builder.module_test_main = os.path.join(
self.nfcore_modules, "tests", "modules", "nf-core", "bpipe", "test", "main.nf"
)
meta_builder.module_test_main = Path(self.nfcore_modules, "tests", "modules", "nf-core", "bpipe", "test", "main.nf")
meta_builder.scrape_workflow_entry_points()
assert meta_builder.entry_points[0] == "test_bpipe_test"

@@ -55,7 +53,7 @@ def test_modules_create_test_yml_check_inputs(self):
cwd = os.getcwd()
os.chdir(self.nfcore_modules)
meta_builder = nf_core.modules.ModulesTestYmlBuilder("bpipe/test", ".", False, "./", False, True)
meta_builder.module_test_main = os.path.join(self.nfcore_modules, "tests", "modules", "bpipe", "test", "main.nf")
meta_builder.module_test_main = Path(self.nfcore_modules, "tests", "modules", "bpipe", "test", "main.nf")
with pytest.raises(UserWarning) as excinfo:
meta_builder.check_inputs()
os.chdir(cwd)
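The switch from `os.path.join` to `pathlib.Path` in these tests is behaviour-preserving; both spellings build the same path, and `Path` objects additionally expose checks such as `.is_file()` used in the updated assertions. A quick, hypothetical comparison:

```python
import os
from pathlib import Path

out_dir = "/tmp/out"                        # hypothetical directory, just for the comparison
legacy = os.path.join(out_dir, "test.yml")  # plain string
modern = Path(out_dir, "test.yml")          # Path object
print(legacy)
print(modern)
print(modern.is_file())                     # False unless the file actually exists
```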
6 changes: 3 additions & 3 deletions tests/modules/lint.py
@@ -66,7 +66,7 @@ def test_modules_lint_gitlab_modules(self):
self.mods_install_gitlab.install("multiqc")
module_lint = nf_core.modules.ModuleLint(dir=self.pipeline_dir, remote_url=GITLAB_URL)
module_lint.lint(print_results=False, all_modules=True)
assert len(module_lint.failed) == 0
assert len(module_lint.failed) == 2
assert len(module_lint.passed) > 0
assert len(module_lint.warned) >= 0

@@ -77,7 +77,7 @@ def test_modules_lint_multiple_remotes(self):
self.mods_install_gitlab.install("multiqc")
module_lint = nf_core.modules.ModuleLint(dir=self.pipeline_dir, remote_url=GITLAB_URL)
module_lint.lint(print_results=False, all_modules=True)
assert len(module_lint.failed) == 0
assert len(module_lint.failed) == 1
assert len(module_lint.passed) > 0
assert len(module_lint.warned) >= 0

@@ -103,6 +103,6 @@ def test_modules_lint_patched_modules(self):
all_modules=True,
)

assert len(module_lint.failed) == 0
assert len(module_lint.failed) == 1
assert len(module_lint.passed) > 0
assert len(module_lint.warned) >= 0