
Wrong test result state after retry #4

Open
marks234 opened this issue Jan 31, 2023 · 19 comments
Comments

@marks234
Contributor

If a test is repeated after a timeout, the repetition is included in the result with the status "not executed". The console log and error log are missing; however, attachments are included.

(screenshot omitted)

@ShuiRuTian
Owner

Pretty interesting.

It seems Azure treats TRX cases whose outcome is "Timeout" as "not executed". As for the missing console log and error log, that might be related to this too.

I need some time to investigate how the C# test framework handles this...

@ShuiRuTian
Owner

Verified that MSTest (the C# test framework from Microsoft) reports a timeout (set via a C# attribute) as failed in the .trx output.

I will follow this behavior. It's pretty late today, though, so I'll submit a PR tomorrow.
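Concretely, following MSTest's behavior means mapping Playwright's "timedOut" status to the TRX "Failed" outcome. A minimal sketch, assuming hypothetical names (`toTrxOutcome` is illustrative, not the reporter's actual API):

```typescript
// Hypothetical sketch: map a Playwright test result status to a TRX outcome,
// following MSTest's convention of reporting timed-out tests as "Failed".
type PlaywrightStatus = "passed" | "failed" | "timedOut" | "skipped" | "interrupted";
type TrxOutcome = "Passed" | "Failed" | "NotExecuted";

function toTrxOutcome(status: PlaywrightStatus): TrxOutcome {
  switch (status) {
    case "passed":
      return "Passed";
    case "skipped":
      return "NotExecuted";
    // MSTest writes tests that hit their timeout as Failed in the .trx,
    // so "timedOut" maps to "Failed" rather than the TRX "Timeout" outcome.
    case "timedOut":
    case "failed":
    case "interrupted":
    default:
      return "Failed";
  }
}
```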

@marks234
Contributor Author

marks234 commented Feb 6, 2023

Nice. I could try to get the results from the JUnit reporter, if that helps...

@ShuiRuTian
Owner

This issue should be fixed in version 1.0.4.

@marks234
Contributor Author

Sorry, but it doesn't seem to work in 1.0.4 either.

Here is what I get from the JUnit reporter for a test case that timed out, was retried, and timed out again:

(header omitted)

<testsuite name="spec/epic1/feature1.spec.ts" timestamp="1676456550554" hostname="" tests="2" failures="1" skipped="0" time="432.998" errors="0">
  <testcase name="feature1 should work" classname="[Microsoft Edge] › spec/epic1/feature1.spec.ts › epic1 › feature1 should work" time="399.998">
  <properties>
    <property name="reportAttachments">
      <item name="screenshot">
	    ...omitted...
      </item>
      <item name="trace">
	    ...omitted...
      </item>
      <item name="screenshot">
	    ...omitted...
      </item>
    </property>
  </properties>
<failure message="feature1.spec.ts:23:7 feature1 should work" type="FAILURE">
<![CDATA[  [Microsoft Edge] › spec/epic1/feature1.spec.ts:23:7 › epic1 › feature1 should work

    Test timeout of 200000ms exceeded.

    locator.click: Target closed
    =========================== logs ===========================
    waiting for locator('button').filter({ hasText: 'add_circle_outline' })
    ============================================================

       at fixtures/pages/configuration-page.ts:128

      126 |     await this.inputAlternativeText.fill(alternativeText);
      127 |     await this.page.waitForTimeout(1000);
    > 128 |     await this.buttonSave.click();
          |                           ^
      129 |     await this.page.waitForTimeout(6000);
      130 |   }
      131 |

        at ... (stack-trace omitted)

    Pending operations:
      - locator.click at tests/fixtures/pages/configuration-page.ts:128:27


    attachment #1: screenshot (image/png) ----------------------------------------------------------
    path/to/test-failed-1.png
    ------------------------------------------------------------------------------------------------

    Retry #1 ---------------------------------------------------------------------------------------

    Test timeout of 200000ms exceeded.

    locator.click: Target closed
    =========================== logs ===========================
    waiting for locator('button').filter({ hasText: 'add_circle_outline' })
    ============================================================

       at fixtures/pages/configuration-page.ts:128

      126 |     await this.inputAlternativeText.fill(alternativeText);
      127 |     await this.page.waitForTimeout(1000);
    > 128 |     await this.buttonSave.click();
          |                           ^
      129 |     await this.page.waitForTimeout(6000);
      130 |   }
      131 |

        at ... (stack-trace omitted)

    Pending operations:
      - locator.click at tests/fixtures/pages/configuration-page.ts:128:27


    attachment #1: trace (application/zip) ---------------------------------------------------------
    path/to/trace.zip
    Usage:

        npx playwright show-trace path/to/trace.zip

    ------------------------------------------------------------------------------------------------

    attachment #2: screenshot (image/png) ----------------------------------------------------------
    path/to/test-failed-1.png
    ------------------------------------------------------------------------------------------------
]]>
</failure>
<system-err>
<![CDATA[(node:396) Warning: Setting the NODE_TLS_REJECT_UNAUTHORIZED environment variable to '0' makes TLS connections and HTTPS requests insecure by disabling certificate verification.
(Use `node --trace-warnings ...` to show where the warning was created)
]]>
</system-err>
</testcase>
</testsuite>

(other cases omitted)

and that's the trx result of the same run:

(header omitted)
	<ResultSummary outcome="Failed">
        <Counters total="29" executed="29" passed="25" failed="4" error="0" timeout="0" aborted="0" inconclusive="0" passedButRunAborted="0" notRunnable="0" notExecuted="0" disconnected="0" warning="0" completed="0" inProgress="0" pending="0"/>
    </ResultSummary>
	
    <Results>
	
		(...)
	
        <UnitTestResult testName="feature1 should work" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testId="cccfb860-c521-5a3b-b1b9-9a0851f3f28d" executionId="d69f2c00-a1d7-44e9-a773-d4a18ac2ae13" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" outcome="Timeout" computerName="e2e-deployment-f799496b4-98nk2" startTime="2023-02-15T10:29:12.363Z" endTime="2023-02-15T10:32:32.362Z" duration="00:03:19.999">
            <ResultFiles>
                <ResultFile path="path/to/test-failed-1.png"/>
            </ResultFiles>
        </UnitTestResult>
        
        <UnitTestResult testName="feature1 should work" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testId="cccfb860-c521-5a3b-b1b9-9a0851f3f28d" executionId="55978b2b-4597-4af5-8eaf-d7d869a4aeda" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" outcome="Timeout" computerName="e2e-deployment-f799496b4-98nk2" startTime="2023-02-15T10:32:33.867Z" endTime="2023-02-15T10:35:53.866Z" duration="00:03:19.999">
            <Output>
                <StdErr>
                    <![CDATA[(node:396) Warning: Setting the NODE_TLS_REJECT_UNAUTHORIZED environment variable to '0' makes TLS connections and HTTPS requests insecure by disabling certificate verification.
(Use `node --trace-warnings ...` to show where the warning was created)
]]>
                </StdErr>
            </Output>
            <ResultFiles>
                <ResultFile path="path/to/trace.zip"/>
                <ResultFile path="path/to/test-failed-1.png"/>
            </ResultFiles>
        </UnitTestResult>
		
		(...)
		
	</Results>

(others omitted)

Three problems:

  • the stack trace is missing in both tries
  • the counters do not match: the timeout counter is zero
  • the result in Azure is not shown correctly

@ShuiRuTian
Owner

@marks234

For #1: kind of weird, I could not reproduce the issue. I just published 1.0.5; could you please give it a try?

NOTE: This part does not seem to be part of the stack trace. I agree it's pretty useful, and we should include it.

      126 |     await this.inputAlternativeText.fill(alternativeText);
      127 |     await this.page.waitForTimeout(1000);
    > 128 |     await this.buttonSave.click();
          |                           ^
      129 |     await this.page.waitForTimeout(6000);
      130 |   }
      131 |

For #2: this behavior is now expected, as far as I can tell. When a test case timed out, we used to increment the timeout counter. But when we upload the test result to Azure, it treats "Timeout" as "Not executed". So now we increment the fail counter when a test case times out.
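The counter change described here can be sketched as follows (illustrative names, not the reporter's actual code):

```typescript
// Hypothetical sketch: tally TRX <Counters> values from per-test statuses,
// counting timed-out tests under "failed" instead of "timeout", because
// Azure DevOps renders the TRX "Timeout" outcome as "not executed".
interface Counters {
  total: number;
  executed: number;
  passed: number;
  failed: number;
  timeout: number;
}

function tally(statuses: Array<"passed" | "failed" | "timedOut">): Counters {
  const c: Counters = { total: 0, executed: 0, passed: 0, failed: 0, timeout: 0 };
  for (const s of statuses) {
    c.total++;
    c.executed++;
    if (s === "passed") c.passed++;
    else c.failed++; // "timedOut" increments failed, not timeout
  }
  return c;
}
```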

For #3, I opened an issue in the Azure DevOps docs repo: https://github.com/MicrosoftDocs/azure-devops-docs/issues/13085. I have tried many ways, but I still have no idea how to show the result correctly. Do you mean the JUnit result is correct? If so, could you please provide a minimal reproduction repo?

@AndrewCraswell

Is it possible to add a property to the reporter config to output tests that have timed out with a Failed status instead? The "Timeout" status displays in the Azure DevOps report as not executed and hides those tests under the "Other" filter. We'd like to surface them as failures so that ADO tracks historical failures over time.
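Such an option might be wired up like this sketch; `markTimeoutsAsFailed` is only the proposed name, not an existing option, and the reporter package name and output path here are assumptions:

```typescript
// playwright.config.ts — hypothetical sketch of the suggested option.
// "markTimeoutsAsFailed" is a proposed name, not a real reporter option.
import { defineConfig } from "@playwright/test";

export default defineConfig({
  reporter: [
    ["list"],
    ["playwright-trx-reporter", {
      outputFile: "results.trx",
      // Proposed: write timed-out tests with outcome="Failed" so that
      // Azure DevOps counts them as failures instead of "not executed".
      markTimeoutsAsFailed: true,
    }],
  ],
});
```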

@ShuiRuTian
Owner

@AndrewCraswell Thanks for the feedback.

This is a pretty good suggestion; supporting Azure DevOps Pipelines is the main motivation of this package.

And I do think it deserves to be the default behavior, so I changed it in 1.0.5.

Could you please check the version you are using?

@AndrewCraswell

AndrewCraswell commented Feb 26, 2023

I am using version ^1.0.5. The counters have been updated to show failures, but it doesn't seem like Azure DevOps uses the counters in the report at all. ADO is showing: 1 Run(s) Completed (1 Passed, 0 Failed). But I would like each individual test to show as failed, so that it tracks the "failing since" columns and marks the run itself as failed.

Here is the .trx which is generated:

<TestRun
	xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010" id="fa761c49-7f5b-4cdd-888e-ae83d0ce3429" name="fv-az161-350\vsts 2023-02-25T12:09:09.095Z" runUser="fv-az161-350\vsts">
	<ResultSummary outcome="Failed">
		<Counters total="12" executed="12" passed="8" failed="4" error="0" timeout="0" aborted="0" inconclusive="0" passedButRunAborted="0" notRunnable="0" notExecuted="0" disconnected="0" warning="0" completed="0" inProgress="0" pending="0"/>
	</ResultSummary>
	<Results>
		<UnitTestResult testName="<redacted>" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testId="8d5b327f-0ef8-5cc3-a59a-1551f3ee0fc7" executionId="1fd5fa3f-78b5-41bf-85f9-82b3d9e7b236" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" outcome="Passed" computerName="fv-az161-350" startTime="2023-02-25T12:09:09.565Z" endTime="2023-02-25T12:09:33.795Z" duration="00:00:24.230"/>
		<UnitTestResult testName="<redacted>" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testId="6e9c69cc-4edc-5c70-a744-9dbf61bd3674" executionId="e2b315e2-d884-4cde-ae55-a75449c982ba" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" outcome="Passed" computerName="fv-az161-350" startTime="2023-02-25T12:09:38.240Z" endTime="2023-02-25T12:09:58.262Z" duration="00:00:20.022"/>
		<UnitTestResult testName="<redacted>" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testId="99569fc9-6625-53ff-be4d-bd3bddb3fb46" executionId="45b60f3b-8e59-4234-8698-da5cd7ad6398" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" outcome="Passed" computerName="fv-az161-350" startTime="2023-02-25T12:09:58.271Z" endTime="2023-02-25T12:10:14.967Z" duration="00:00:16.696"/>
		<UnitTestResult testName="<redacted>" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testId="d3f3df68-86cd-58b3-8471-267297ba020c" executionId="4bb0c0bf-8a2a-4316-be37-6bcdc254b8b4" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" outcome="Passed" computerName="fv-az161-350" startTime="2023-02-25T12:10:14.969Z" endTime="2023-02-25T12:10:36.828Z" duration="00:00:21.859"/>
		<UnitTestResult testName="<redacted>" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testId="a783324a-6e8a-577a-891f-0317a48c467d" executionId="fb93b602-688a-4dc6-a296-d35a759db67d" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" outcome="Passed" computerName="fv-az161-350" startTime="2023-02-25T12:10:36.836Z" endTime="2023-02-25T12:10:45.989Z" duration="00:00:09.153"/>
		<UnitTestResult testName="<redacted>" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testId="38db5a2d-e2ff-50b7-b70f-40e37cee3d79" executionId="84e2326d-1a5e-4127-a070-09f5b1bc6933" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" outcome="Passed" computerName="fv-az161-350" startTime="2023-02-25T12:10:45.999Z" endTime="2023-02-25T12:10:57.297Z" duration="00:00:11.298"/>
		<UnitTestResult testName="<redacted>" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testId="f2c57e85-eeb9-5689-ae38-f94b89d62b6b" executionId="189d3a1e-c246-424b-8379-cc2a6727f8f2" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" outcome="Passed" computerName="fv-az161-350" startTime="2023-02-25T12:10:57.299Z" endTime="2023-02-25T12:11:23.081Z" duration="00:00:25.782"/>
		<UnitTestResult testName="<redacted>" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testId="9e78ab84-4371-5c08-bc40-a7accf333c15" executionId="6561dbdc-e050-4a0b-828f-3e7e6c900a27" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" outcome="Timeout" computerName="fv-az161-350" startTime="2023-02-25T12:11:23.083Z" endTime="2023-02-25T12:12:23.082Z" duration="00:00:59.999">
			<ResultFiles>
				<ResultFile path="<redacted>"/>
				<ResultFile path="<redacted>"/>
				<ResultFile path="<redacted>"/>
			</ResultFiles>
		</UnitTestResult>
		<UnitTestResult testName="<redacted>" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testId="9e78ab84-4371-5c08-bc40-a7accf333c15" executionId="f497f904-75ae-4ced-a370-db93f7e7f474" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" outcome="Timeout" computerName="fv-az161-350" startTime="2023-02-25T12:12:29.531Z" endTime="2023-02-25T12:13:29.531Z" duration="00:01:00.000">
			<ResultFiles>
				<ResultFile path="<redacted>"/>
				<ResultFile path="<redacted>"/>
				<ResultFile path="<redacted>"/>
			</ResultFiles>
		</UnitTestResult>
		<UnitTestResult testName="<redacted>" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testId="5934937f-0d67-51ac-8377-7accc19b3696" executionId="f66ad522-94ab-40cb-9ffc-fe202f437c5c" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" outcome="Timeout" computerName="fv-az161-350" startTime="2023-02-25T12:13:35.761Z" endTime="2023-02-25T12:14:35.761Z" duration="00:01:00.000">
			<ResultFiles>
				<ResultFile path="<redacted>"/>
				<ResultFile path="<redacted>"/>
				<ResultFile path="<redacted>"/>
			</ResultFiles>
		</UnitTestResult>
		<UnitTestResult testName="<redacted>" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testId="5934937f-0d67-51ac-8377-7accc19b3696" executionId="9c660a76-8c43-487b-b103-9937a0d7b1b2" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" outcome="Timeout" computerName="fv-az161-350" startTime="2023-02-25T12:14:42.468Z" endTime="2023-02-25T12:15:42.468Z" duration="00:01:00.000">
			<ResultFiles>
				<ResultFile path="<redacted>"/>
				<ResultFile path="<redacted>"/>
				<ResultFile path="<redacted>"/>
			</ResultFiles>
		</UnitTestResult>
		<UnitTestResult testName="<redacted>" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testId="8b39e3e2-03bf-5790-91a8-3b1ff775ce62" executionId="1a59e587-b831-4eff-a2d3-904f126e1d54" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" outcome="Passed" computerName="fv-az161-350" startTime="2023-02-25T12:15:48.925Z" endTime="2023-02-25T12:16:03.405Z" duration="00:00:14.480"/>
	</Results>
	<TestDefinitions>
		<UnitTest id="8d5b327f-0ef8-5cc3-a59a-1551f3ee0fc7" name="<redacted>" storage="<redacted>">
			<TestMethod className="Microsoft Edge &gt; createM365GroupsFlow.e2e.ts" codeBase="<redacted>" name="<redacted>"/>
		</UnitTest>
		<UnitTest id="6e9c69cc-4edc-5c70-a744-9dbf61bd3674" name="<redacted>" storage="<redacted>">
			<TestMethod className="Microsoft Edge &gt; createSecurityGroupsFlow.e2e.ts" codeBase="<redacted>" name="<redacted>"/>
		</UnitTest>
		<UnitTest id="99569fc9-6625-53ff-be4d-bd3bddb3fb46" name="<redacted>" storage="<redacted>">
			<TestMethod className="Microsoft Edge &gt; createUsersFlow.e2e.ts" codeBase="<redacted>" name="<redacted>"/>
		</UnitTest>
		<UnitTest id="d3f3df68-86cd-58b3-8471-267297ba020c" name="<redacted>" storage="<redacted>">
			<TestMethod className="Microsoft Edge &gt; createUsersFlow.e2e.ts" codeBase="<redacted>" name="<redacted>"/>
		</UnitTest>
		<UnitTest id="a783324a-6e8a-577a-891f-0317a48c467d" name="<redacted>" storage="<redacted>">
			<TestMethod className="Microsoft Edge &gt; downloadLakeData.e2e.ts" codeBase="<redacted>" name="<redacted>"/>
		</UnitTest>
		<UnitTest id="38db5a2d-e2ff-50b7-b70f-40e37cee3d79" name="<redacted>" storage="<redacted>">
			<TestMethod className="Microsoft Edge &gt; onboardOffboardScenarios.e2e.ts" codeBase="<redacted>" name="<redacted>"/>
		</UnitTest>
		<UnitTest id="f2c57e85-eeb9-5689-ae38-f94b89d62b6b" name="<redacted>" storage="<redacted>">
			<TestMethod className="Microsoft Edge &gt; onboardOffboardScenarios.e2e.ts" codeBase="<redacted>" name="<redacted>"/>
		</UnitTest>
		<UnitTest id="9e78ab84-4371-5c08-bc40-a7accf333c15" name="<redacted>" storage="<redacted>">
			<TestMethod className="Microsoft Edge &gt; onboardOffboardScenarios.e2e.ts" codeBase="<redacted>" name="<redacted>"/>
		</UnitTest>
		<UnitTest id="5934937f-0d67-51ac-8377-7accc19b3696" name="<redacted>" storage="<redacted>">
			<TestMethod className="Microsoft Edge &gt; onboardOffboardScenarios.e2e.ts" codeBase="<redacted>" name="<redacted>"/>
		</UnitTest>
		<UnitTest id="8b39e3e2-03bf-5790-91a8-3b1ff775ce62" name="<redacted>" storage="<redacted>">
			<TestMethod className="Microsoft Edge &gt; onboardOffboardScenarios.e2e.ts" codeBase="<redacted>" name="<redacted>"/>
		</UnitTest>
	</TestDefinitions>
	<TestEntries>
		<TestEntry testId="8d5b327f-0ef8-5cc3-a59a-1551f3ee0fc7" executionId="1fd5fa3f-78b5-41bf-85f9-82b3d9e7b236" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d"/>
		<TestEntry testId="6e9c69cc-4edc-5c70-a744-9dbf61bd3674" executionId="e2b315e2-d884-4cde-ae55-a75449c982ba" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d"/>
		<TestEntry testId="99569fc9-6625-53ff-be4d-bd3bddb3fb46" executionId="45b60f3b-8e59-4234-8698-da5cd7ad6398" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d"/>
		<TestEntry testId="d3f3df68-86cd-58b3-8471-267297ba020c" executionId="4bb0c0bf-8a2a-4316-be37-6bcdc254b8b4" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d"/>
		<TestEntry testId="a783324a-6e8a-577a-891f-0317a48c467d" executionId="fb93b602-688a-4dc6-a296-d35a759db67d" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d"/>
		<TestEntry testId="38db5a2d-e2ff-50b7-b70f-40e37cee3d79" executionId="84e2326d-1a5e-4127-a070-09f5b1bc6933" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d"/>
		<TestEntry testId="f2c57e85-eeb9-5689-ae38-f94b89d62b6b" executionId="189d3a1e-c246-424b-8379-cc2a6727f8f2" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d"/>
		<TestEntry testId="9e78ab84-4371-5c08-bc40-a7accf333c15" executionId="6561dbdc-e050-4a0b-828f-3e7e6c900a27" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d"/>
		<TestEntry testId="9e78ab84-4371-5c08-bc40-a7accf333c15" executionId="f497f904-75ae-4ced-a370-db93f7e7f474" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d"/>
		<TestEntry testId="5934937f-0d67-51ac-8377-7accc19b3696" executionId="f66ad522-94ab-40cb-9ffc-fe202f437c5c" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d"/>
		<TestEntry testId="5934937f-0d67-51ac-8377-7accc19b3696" executionId="9c660a76-8c43-487b-b103-9937a0d7b1b2" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d"/>
		<TestEntry testId="8b39e3e2-03bf-5790-91a8-3b1ff775ce62" executionId="1a59e587-b831-4eff-a2d3-904f126e1d54" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d"/>
	</TestEntries>
	<TestLists>
		<TestList id="8c84fa94-04c1-424b-9868-57a2d4851a1d" name="<redacted>"/>
		<TestList id="19431567-8539-422a-85d7-44ee4e166bda" name="<redacted>"/>
	</TestLists>
	<Times creation="2023-02-25T12:09:09.095Z" finish="2023-02-25T12:16:03.584Z"/>
</TestRun>

And here is how they show in the Test Report on the pipeline:

(screenshot omitted)

@AndrewCraswell

AndrewCraswell commented Feb 26, 2023

It's a hard call. I feel like not everyone will want it categorized the way we do: the tests did time out, but now they are not tracked as timeouts in the counters section. Some people would consider that a bug in the TRX output, which would be technically correct. That's why I wondered if it should be behind a configuration value like markTimeoutsAsFailed: boolean. This would also change the individual status of each test from Timeout to Failed, which would then make the count summary accurate.

@AndrewCraswell

I found some backend pipelines that generate their .trx files using the dotnet CLI. Based on what I saw in their test reports, it doesn't appear that ADO uses the ResultSummary outcome or the Counters section at all. It scans all the UnitTestResult outcome attributes and generates its own counters and test status...
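That observation can be illustrated with a sketch that derives the summary the way ADO appears to, by scanning UnitTestResult outcome attributes and ignoring the Counters element; the regex extraction here is illustrative, not ADO's implementation:

```typescript
// Hypothetical sketch: derive run totals from each <UnitTestResult>'s
// outcome attribute, ignoring <ResultSummary>/<Counters> entirely,
// which appears to match what Azure DevOps does with uploaded TRX files.
function summarize(trxXml: string): { passed: number; failed: number; other: number } {
  const re = /<UnitTestResult\b[^>]*\boutcome="([^"]+)"/g;
  const outcomes: string[] = [];
  let m: RegExpExecArray | null;
  while ((m = re.exec(trxXml)) !== null) outcomes.push(m[1]);
  return {
    passed: outcomes.filter((o) => o === "Passed").length,
    failed: outcomes.filter((o) => o === "Failed").length,
    // Outcomes like "Timeout" land here and show up under ADO's "Other" filter.
    other: outcomes.filter((o) => o !== "Passed" && o !== "Failed").length,
  };
}
```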

@AndrewCraswell

AndrewCraswell commented Feb 26, 2023

Console log and error log are missing. However, attachments are included.

I think this is because playwright-trx-reporter only outputs the stack trace and error if the test is marked with a Failed outcome, isn't it? So because the outcome is Timeout in this case, no stack trace or error message is expected. That actually seems valid to me; the problem is that we still want to categorize Timeout outcomes as Failed instead, which will require updating the outcome attribute on each UnitTestResult.
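A sketch of that interaction, with illustrative names (not playwright-trx-reporter's actual code): error details survive only when the outcome written to the .trx is "Failed", so mapping timed-out tests to "Failed" is what keeps the message and stack trace.

```typescript
// Hypothetical sketch: error details are attached to a result only when its
// outcome is "Failed". A result written with the TRX "Timeout" outcome
// therefore loses its error message and stack trace.
interface ErrorInfo {
  message: string;
  stack: string;
}

function errorInfoFor(outcome: string, error: ErrorInfo | undefined): ErrorInfo | undefined {
  // Writing "Failed" instead of "Timeout" for timed-out tests is what
  // lets the message and stack survive into the .trx output.
  return outcome === "Failed" ? error : undefined;
}
```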

@marks234
Contributor Author

I totally agree. It seems the timeout state isn't handled properly by ADO. Thus, I'd like to see timeouts handled as failed too, but a configuration option would be nice as well.

@ShuiRuTian
Owner

1.0.6 is published, thank you all.

In 1.0.6, I still decided to make this the default behavior. I found that the MSTest test framework for C# also treats "timeout" as "failed", so it seems no one really relies on the "timeout" outcome?

@AndrewCraswell

AndrewCraswell commented Feb 28, 2023

I will bump our version and test over the next few days. My only concern is how retries are marked: if a test fails but a retry passes, I think the test run should be marked as a pass rather than a fail.

I didn't notice the TRX file identifying tests as retried. I was hoping to gather some more information and open a new feature request for that logic, so that we could make use of ADO's native flaky test detection.

@AndrewCraswell

AndrewCraswell commented Feb 28, 2023

Seems like this is partially working: timeouts are being marked as Failed, and subsequent retries can succeed. I still don't see the stack trace or error message, though. Maybe it's best to close this issue, and I can open another for the stack trace piece?

(screenshot omitted)

@ChrystianC

ChrystianC commented Nov 21, 2023

Any progress?

@ShuiRuTian
Owner

@ChrystianC sorry for the late reply; this should already be fixed in the latest version.

If not, please provide a minimal reproduction repo.

@ChrystianC

I tested the package not long ago. I ran into the same issue as @AndrewCraswell: when a test times out, there is no error message or stack trace.

(screenshot omitted)
