WIP Change outcome to 'passed' for xfail unless it's strict #1795
Conversation
Hey @hackebrot, I think you are headed in the right direction; I took a quick look at the failing junitxml tests, but I believe they were not entirely correct in the first place, or at least the new meaning makes more sense. I will review this more carefully tomorrow, as I'm short on time right now.
Thanks @nicoddemus! 🙇 Have a nice evening. 😄
```diff
@@ -260,7 +266,7 @@ def pytest_report_teststatus(report):
     if hasattr(report, "wasxfail"):
         if report.skipped:
             return "xfailed", "x", "xfail"
-        elif report.failed:
+        elif report.passed:
```
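For context, here is a minimal sketch of how this hook reads after the change (condensed from the diff above; the `"xpassed"` return triple is an assumption about the surrounding code, which is not shown in this hunk):

```python
def pytest_report_teststatus(report):
    # Classify xfail/xpass reports for the test status line.
    if hasattr(report, "wasxfail"):
        if report.skipped:
            # expected failure that did fail
            return "xfailed", "x", "xfail"
        elif report.passed:
            # expected failure that unexpectedly passed (non-strict)
            return "xpassed", "X", "XPASS"
```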
@fabioz I think this will mean you no longer have to do special handling for this in `pydev_runfiles_pytest2.py`, correct?
@nicoddemus do you think we need to check for `strict_xfail` here too?
No need: strict xfails now have their report set to failed and will appear as a normal failure.
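To illustrate (my own minimal example, not from the PR): with `strict=True`, an unexpectedly passing test is reported as a plain failure, so the status hook above never sees it as an xfail case:

```python
import pytest

@pytest.mark.xfail(strict=True)
def test_unexpectedly_passing():
    # Under strict xfail, this XPASS fails the test run outright
    # ("[XPASS(strict)]") rather than being counted as xpassed.
    assert True
```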
okay, that's what I thought. Thanks!
We should also add somewhere in the docs that the …
```diff
@@ -245,7 +246,12 @@ def pytest_runtest_makereport(item, call):
             rep.outcome = "skipped"
             rep.wasxfail = evalxfail.getexplanation()
         elif call.when == "call":
-            rep.outcome = "failed"  # xpass outcome
+            strict_default = item.config.getini('xfail_strict')
```
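The hunk is truncated here; a standalone sketch of the decision the new branch presumably makes (names like `apply_xpass_outcome` and `is_strict_xfail` are my own; only `strict_default` appears in the diff):

```python
def apply_xpass_outcome(item, rep, evalxfail):
    # Sketch: decide the report outcome for an xfail-marked test that passed.
    strict_default = item.config.getini('xfail_strict')
    is_strict_xfail = evalxfail.get('strict', strict_default)
    explanation = evalxfail.getexplanation()
    if is_strict_xfail:
        rep.outcome = "failed"  # strict XPASS becomes a real failure
        rep.longrepr = "[XPASS(strict)] {}".format(explanation)
    else:
        rep.outcome = "passed"  # non-strict XPASS is simply a pass
        rep.wasxfail = explanation
```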
Hmm, I think this is a more correct implementation of "strict xfail" than the one I did... with this, I think we don't even need `check_strict_xfail` (which is used in `pytest_pyfunc_call`) in this module anymore at all, because failure/passing will be handled here, which seems more appropriate now. 😄
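For reference, a rough reconstruction of what `check_strict_xfail` does today (an assumption pieced together from this discussion, not copied from the module):

```python
import pytest

def check_strict_xfail(pyfuncitem, evalxfail):
    # Sketch: called from pytest_pyfunc_call after a passing test;
    # a strict xfail that passed is failed eagerly.
    if evalxfail.istrue() and evalxfail.get('strict', False):
        explanation = evalxfail.getexplanation()
        pytest.fail('[XPASS(strict)] ' + explanation, pytrace=False)
```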
Hey @nicoddemus! Thank you so much for your thorough code review. I am currently on vacation (on some Scottish island 😉). Feel free to pick this up. That'd be awesome, so we can get 3.0 released soon-ish 😁
Hey @nicoddemus, I've added some changes. Please let me know your thoughts! 😄
@hackebrot thanks! The changes look good, we just have to fix the …
Should an … (It's a failure in strict mode.) @pytest-dev/core, what do you think? 😄
It's clearly a "passed" test when being non-strict.
I don't think it should be …
That would be a bug then ^^
Okay, cool 😄 so to conclude:

Non-strict mode (default)

- test fails: reported as xfailed
- test passes: reported as passed

Strict mode

- test fails: reported as xfailed
- test passes: reported as failed

Everyone happy?
That's what I meant by: … 😁
On that table, what does …
Test code:

```python
# test_nope.py
import pytest


@pytest.mark.xfail
def test_fail_non_strict():
    assert False


@pytest.mark.xfail
def test_pass_non_strict():
    assert True


@pytest.mark.xfail(strict=True)
def test_fail_strict():
    assert False


@pytest.mark.xfail(strict=True)
def test_pass_strict():
    assert True
```

```python
# conftest.py
def pytest_runtest_logreport(report):
    print()
    print("report.nodeid '{}'".format(report.nodeid))
    print("report.when '{}'".format(report.when))
    print("report.outcome '{}'".format(report.outcome))
    print("report.wasxfail '{}'".format(getattr(report, 'wasxfail', 'NOT SET')))
    print()
```
master: …
Here's the output using @hackebrot's branch (only for …
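Since the captured output above did not survive, here is what the conftest hook should print for the call phase under this branch, inferred from the junit XML below (a reconstruction, not captured output):

- `test_fail_non_strict`: outcome `skipped`, `wasxfail` set
- `test_pass_non_strict`: outcome `passed`, `wasxfail` set
- `test_fail_strict`: outcome `skipped`, `wasxfail` set
- `test_pass_strict`: outcome `failed`, `wasxfail` 'NOT SET'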
And here's the junit XML file:

```xml
<?xml version="1.0" encoding="utf-8"?>
<testsuite errors="0" failures="1" name="pytest" skips="2" tests="4" time="0.020">
  <testcase classname="tmp.test_nope" file="tmp\test_nope.py" line="2" name="test_fail_non_strict" time="0.0">
    <skipped message="expected test failure"></skipped>
  </testcase>
  <testcase classname="tmp.test_nope" file="tmp\test_nope.py" line="6" name="test_pass_non_strict" time="0.0"></testcase>
  <testcase classname="tmp.test_nope" file="tmp\test_nope.py" line="10" name="test_fail_strict" time="0.0">
    <skipped message="expected test failure"></skipped>
  </testcase>
  <testcase classname="tmp.test_nope" file="tmp\test_nope.py" line="14" name="test_pass_strict" time="0.0">
    <failure message="[XPASS(strict)] ">[XPASS(strict)]</failure>
  </testcase>
</testsuite>
```

Parsing the XML file, we can see:

- 4 tests, 1 failure, 2 skips, 0 errors
- `test_fail_non_strict` and `test_fail_strict`: skipped with message "expected test failure" (xfail)
- `test_pass_non_strict`: a plain pass
- `test_pass_strict`: a failure with message "[XPASS(strict)]"
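A quick way to produce that summary yourself (my own snippet, not part of the PR; the `junit.xml` file name assumes a `--junitxml=junit.xml` run):

```python
import xml.etree.ElementTree as ET

# Summarize the junit XML with only the standard library.
suite = ET.parse("junit.xml").getroot()
print("tests={tests} failures={failures} skips={skips} errors={errors}"
      .format(**suite.attrib))
for case in suite.iter("testcase"):
    if case.find("failure") is not None:
        status = "failed: " + case.find("failure").get("message", "")
    elif case.find("skipped") is not None:
        status = "skipped: " + case.find("skipped").get("message", "")
    else:
        status = "passed"
    print("{:<22} {}".format(case.get("name"), status))
```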
IMHO the XML now looks better because a non-strict xpass now reports PASSED, so the tests in …

For comparison, here's the same summary for the …
That sounds right to me - so what's left to do here? I'm guessing @hackebrot needs to adjust the existing tests and then this is ready for review?
I find it slightly confusing that … I'd much rather expect the following: …
@hackebrot Hmm, you do have a point there. On second thought, I agree (except with …)
I am not entirely sure if this is the right way to do it.
This is very much WIP as the tests will be failing.
Can anybody help me out with this one, please? 😃
Resolves #1546