Merged
Changes from 2 commits
1 change: 1 addition & 0 deletions .vscode/settings.json
@@ -12,5 +12,6 @@
"[json]": {
"editor.tabSize": 2
},
"dotnet.defaultSolution": "Razor.sln",
Member:
lol, finally!

"omnisharp.defaultLaunchSolution": "Razor.sln"
}
2 changes: 2 additions & 0 deletions azure-pipelines-integration-dartlab.yml
@@ -29,6 +29,8 @@ variables:
value: $(Build.SourcesDirectory)\artifacts\log\$(_configuration)
- name: __VSNeverShowWhatsNew
value: 1
- name: RAZOR_RUN_FLAKY_TESTS
value: 'true'
Member:

I thought you were talking about having a separate pipeline to run all tests? It seems like we're just guaranteeing failures if we turn this on here, which means guaranteeing people will have to investigate them, but we already have issues for these.

I would expect either a different pipeline for all tests, so we can compare the two, or use the FlakyFact attribute only on tests that we know for sure will at least pass some % of the time.

Contributor (author):

I'm open to either.

> I would expect either a different pipeline for all tests, so we can compare the two, or use the FlakyFact attribute only on tests that we know for sure will at least pass some % of the time.

This was my initial thought. Running the test pipeline and having failure rates generated is worth it. If we want a separate pipeline that's fine too, but I don't much see the point. I figure a conditional skip means the test may be enabled at some point: it's the difference between "might work" and "definitely won't work". A test with a 0% success rate should just be disabled or fixed.

Member:

I guess I agree with you, I'm just not confident that our existing skipped tests are all there because of flakiness, and not some other issue. We'll find out! 😀

Contributor (author):

Well, we're off to a good start!

[image]


stages:
- template: \stages\visual-studio\agent.yml@DartLabTemplates
@@ -8,7 +8,7 @@ namespace Microsoft.VisualStudio.Razor.IntegrationTests;

public class DiagnosticTests : AbstractRazorEditorTest
{
-[IdeFact(Skip = "https://github.com/dotnet/razor/issues/8150")]
+[FlakyFact(Issue = "https://github.com/dotnet/razor/issues/8150")]
public async Task Diagnostics_ShowErrors_Razor()
{
// Arrange
@@ -48,7 +48,7 @@ public void Function(){
});
}

-[IdeFact(Skip = "https://github.com/dotnet/razor/issues/8150")]
+[FlakyFact(Issue = "https://github.com/dotnet/razor/issues/8150")]
public async Task Diagnostics_ShowErrors_Html()
{
// Arrange
@@ -80,7 +80,7 @@ await TestServices.Editor.SetTextAsync(@"
});
}

-[IdeFact(Skip = "https://github.com/dotnet/razor/issues/8150")]
+[FlakyFact(Issue = "https://github.com/dotnet/razor/issues/8150")]
public async Task Diagnostics_ShowErrors_CSharp()
{
// Arrange
@@ -112,7 +112,7 @@ await TestServices.Editor.SetTextAsync(@"
});
}

-[IdeFact(Skip = "https://github.com/dotnet/razor/issues/8150")]
+[FlakyFact(Issue = "https://github.com/dotnet/razor/issues/8150")]
public async Task Diagnostics_ShowErrors_CSharp_NoDocType()
{
// Why this test, when we have the above test, and they seem so similar, and we also have Diagnostics_ShowErrors_CSharpAndHtml you ask? Well I'll tell you!
@@ -152,7 +152,7 @@ await TestServices.Editor.SetTextAsync(@"
});
}

-[IdeFact(Skip = "https://github.com/dotnet/razor/issues/8150")]
+[FlakyFact(Issue = "https://github.com/dotnet/razor/issues/8150")]
public async Task Diagnostics_ShowErrors_CSharpAndHtml()
{
// Arrange
@@ -0,0 +1,41 @@
// Copyright (c) .NET Foundation. All rights reserved.
// Licensed under the MIT license. See License.txt in the project root for license information.

using System;
using Xunit;

namespace Microsoft.VisualStudio.Razor.IntegrationTests;

[AttributeUsage(AttributeTargets.Method, AllowMultiple = false)]
public class FlakyFactAttribute : IdeFactAttribute
{
private readonly Lazy<bool> _runFlakyTests = new(() => Environment.GetEnvironmentVariable("RAZOR_RUN_FLAKY_TESTS")?.ToLower() == "true");

public FlakyFactAttribute()
{
}

private string _issue = "";
public string Issue
{
get => _issue;
set
{
_issue = value;

if (!_runFlakyTests.Value)
{
#pragma warning disable CS0618
Skip = _issue;
#pragma warning restore CS0618
}
}
}

[Obsolete("Use Issue instead of Skip")]
public new string Skip
{
get => base.Skip;
set => base.Skip = value;
}
}
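For reference, the attribute's gating reduces to a single environment-variable check. Here is a minimal sketch of that logic, written in Python purely for illustration (the function name is hypothetical, not part of the PR): only the exact string "true", compared case-insensitively, enables flaky tests; anything else, including an unset variable, leaves them skipped.

```python
import os

def should_run_flaky_tests(env=None):
    # Mirror of FlakyFactAttribute's check: RAZOR_RUN_FLAKY_TESTS must be
    # exactly "true" (case-insensitive). Any other value, or no value at
    # all, means flaky tests stay skipped with the linked issue as reason.
    env = os.environ if env is None else env
    return (env.get("RAZOR_RUN_FLAKY_TESTS") or "").lower() == "true"
```

This is why the pipeline change above sets `RAZOR_RUN_FLAKY_TESTS: 'true'`: that one variable is what flips every `[FlakyFact]` test from skipped to running in the DartLab run.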