Update pytest requirement from ~=6.2 to ~=7.0 #5778
Conversation
|
@dependabot rebase |
Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version. - [Release notes](https://github.com/pytest-dev/pytest/releases) - [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst) - [Commits](pytest-dev/pytest@6.2.0...7.0.0) --- updated-dependencies: - dependency-name: pytest dependency-type: direct:production ... Signed-off-by: dependabot[bot] <support@github.com>
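As a reminder, `~=7.0` is a PEP 440 "compatible release" specifier, so the pin accepts any 7.x release but not 8.0. An illustrative requirements fragment:

```
# requirements.txt fragment (illustrative)
pytest ~= 7.0     # equivalent to: pytest >= 7.0, < 8.0
```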
01174bb to
628d64b
|
@cdce8p @Pierre-Sassoulas @jacobtylerwalls @areveny Just tagging some recent contributors, as I want to see if anybody recognises the problem: test runs seem to randomly take longer than 30 minutes. The tests are currently on their third try and I don't see why they would take longer than 30 minutes. In the first run it was Linux 3.8 and 3.9; in the second run it was Linux 3.8, 3.9 and 3.10. Does anybody recognise such flakiness from other projects that run or need to update to pytest 7.0? |
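One way to keep a hung run from consuming the full runner allowance is an explicit job timeout. A sketch for a GitHub Actions workflow, assuming that is the CI in use (the job name is illustrative):

```yaml
# .github/workflows/ci.yaml (fragment, illustrative)
jobs:
  tests-linux:
    runs-on: ubuntu-latest
    timeout-minutes: 30   # fail fast instead of hanging for the 6 h default
```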
|
Could it be due to caching? |
What type of caching do you mean? Looking at the log of the run, we only fail on the |
Right, not caching then.
Yes (latest Ubuntu LTS). pytest 7.0 takes 02:43, up from 02:10 with pytest 6.2.5. |
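For a rough sense of the regression those wall-clock times imply (02:10 = 130 s, 02:43 = 163 s), the relative slowdown works out to about a quarter. A quick check:

```python
# Rough slowdown implied by the reported wall-clock times (illustrative only).
old_secs = 2 * 60 + 10   # 02:10 under pytest 6.2.5
new_secs = 2 * 60 + 43   # 02:43 under pytest 7.0
slowdown = (new_secs - old_secs) / old_secs
print(f"{slowdown:.1%}")  # prints "25.4%"
```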
|
Here's the result of the slowest durations in the latest run:

```
35.12s call tests/benchmark/test_baseline_benchmarks.py::TestEstablishBaselineBenchmarks::test_baseline_benchmark_j1_single_working_checker
4.97s call tests/benchmark/test_baseline_benchmarks.py::TestEstablishBaselineBenchmarks::test_baseline_benchmark_j1_all_checks_lots_of_files
4.78s call tests/benchmark/test_baseline_benchmarks.py::TestEstablishBaselineBenchmarks::test_baseline_benchmark_j10_single_working_checker
3.70s call tests/benchmark/test_baseline_benchmarks.py::TestEstablishBaselineBenchmarks::test_baseline_lots_of_files_j10_empty_checker
3.49s call tests/benchmark/test_baseline_benchmarks.py::TestEstablishBaselineBenchmarks::test_baseline_lots_of_files_j10
2.15s call tests/test_pylint_runners.py::test_runner_with_arguments[run_epylint]
2.09s call tests/test_pylint_runners.py::test_runner[run_epylint]
1.77s call tests/test_functional.py::test_functional[nan_comparison_check]
1.55s call tests/benchmark/test_baseline_benchmarks.py::TestEstablishBaselineBenchmarks::test_baseline_benchmark_j1
1.41s call tests/benchmark/test_baseline_benchmarks.py::TestEstablishBaselineBenchmarks::test_baseline_benchmark_check_parallel_j10
1.40s call tests/benchmark/test_baseline_benchmarks.py::TestEstablishBaselineBenchmarks::test_baseline_benchmark_j10
1.35s call tests/benchmark/test_baseline_benchmarks.py::TestEstablishBaselineBenchmarks::test_baseline_lots_of_files_j1_empty_checker
1.32s call tests/test_self.py::TestRunTC::test_do_not_import_files_from_local_directory
1.29s call tests/benchmark/test_baseline_benchmarks.py::TestEstablishBaselineBenchmarks::test_baseline_lots_of_files_j1
0.96s call tests/test_functional.py::test_functional[consider_using_with]
0.94s call tests/test_functional.py::test_functional[stop_iteration_inside_generator]
0.92s call tests/benchmark/test_baseline_benchmarks.py::TestEstablishBaselineBenchmarks::test_baseline_benchmark_j1_all_checks_single_file
0.89s call tests/checkers/unittest_imports.py::TestImportsChecker::test_relative_beyond_top_level_two
0.78s call tests/test_functional.py::test_functional[ungrouped_imports]
0.77s call tests/test_functional.py::test_functional[no_name_in_module]
0.71s call tests/test_regr.py::test_crash[file_names0]
0.65s call tests/test_functional.py::test_functional[ungrouped_imports_isort_compatible]
0.59s call tests/test_functional.py::test_functional[no_member_if_statements]
0.59s call tests/test_functional.py::test_functional[regression_4612_crash_pytest_fixture]
0.54s call tests/test_epylint.py::test_epylint_strange_command
0.53s call tests/test_functional.py::test_functional[wrong_import_order]
```
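Comparing two `--durations` reports by eye is tedious; a small script can align them by test id instead. This is a hypothetical helper, not part of pylint's tooling, and the sample report data below is made up for illustration:

```python
# Sketch: compare two `pytest --durations` reports by test id.
# parse_report, diff_reports and the sample data are illustrative only.

def parse_report(text):
    """Map test id -> seconds from lines like '1.23s call tests/foo.py::test_bar'."""
    durations = {}
    for line in text.strip().splitlines():
        secs, _phase, test_id = line.split(maxsplit=2)
        durations[test_id] = float(secs.rstrip("s"))
    return durations

def diff_reports(old_text, new_text):
    """Return {test_id: (old_secs, new_secs)} for tests present in both reports."""
    old, new = parse_report(old_text), parse_report(new_text)
    return {tid: (old[tid], new[tid]) for tid in old.keys() & new.keys()}

old_report = """\
35.14s call tests/benchmark/test_baseline_benchmarks.py::test_j1
2.43s call tests/test_pylint_runners.py::test_runner[run_epylint]
"""
new_report = """\
35.12s call tests/benchmark/test_baseline_benchmarks.py::test_j1
2.09s call tests/test_pylint_runners.py::test_runner[run_epylint]
"""

for tid, (before, after) in sorted(diff_reports(old_report, new_report).items()):
    print(f"{before:7.2f}s -> {after:7.2f}s  {tid}")
```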
|
See pytest-dev/pytest#9652 for discussion about this as well. We have not completely figured out what is causing this; it might not even be |
Pull Request Test Coverage Report for Build 1858073858
💛 - Coveralls |
|
@dependabot recreate |
|
@dependabot recreate Please remove my own commit 😅 |
|
This bot is too clever 😄 |
Updates the requirements on pytest to permit the latest version.
Release notes
Sourced from pytest's releases.
... (truncated)
Commits
- 3554b83 Add note to changelog
- 6ea7f99 Prepare release version 7.0.0
- 737b220 [7.0.x] releasing: Add template for major releases (#9597)
- 7fa3972 [7.0.x] releasing: Always set doc_version (#9590)
- b304499 [7.0.x] Make 'warnings' and 'deselected' in assert_outcomes optional (#9566)
- f17525d [7.0.x] doc: Add ellipsis to warning usecase list (#9562)
- 0a7be97 ci: Bump up timeout (#9565)
- c17908c [7.0.x] doc: Recategorize 7.0.0 changelog items (#9564)
- ab549bb [7.0.x] Add missing cooperative constructor changelog (#9563)
- 4b1707f [7.0.x] Autouse linearization graph (#9557)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)