
Conversation

@RussKie
Contributor

@RussKie RussKie commented Oct 17, 2023

Report code coverage increase/decrease back to the PRs.

Microsoft Reviewers: Open in CodeFlow
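The description doesn't spell out the mechanics. As a rough sketch of the final step, a CI job could post the summary through GitHub's REST API; the `issues/{number}/comments` endpoint is the standard one for PR comments, and the repository coordinates, PR number, and token below are placeholders:

```python
# Illustrative sketch only: post a coverage summary comment on a PR.
# OWNER, REPO, and PR_NUMBER are placeholders; GITHUB_TOKEN must hold a
# token with permission to comment on the repository.
import os
import requests

OWNER, REPO, PR_NUMBER = "dotnet", "extensions", 123  # placeholders

resp = requests.post(
    f"https://api.github.com/repos/{OWNER}/{REPO}/issues/{PR_NUMBER}/comments",
    headers={
        "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
        "Accept": "application/vnd.github+json",
    },
    json={"body": "Coverage summary goes here"},  # the markdown summary
)
resp.raise_for_status()
```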

@ghost ghost assigned RussKie Oct 17, 2023
@RussKie RussKie closed this Oct 30, 2023
@RussKie RussKie reopened this Oct 30, 2023
@RussKie RussKie force-pushed the githubcomment branch 2 times, most recently from aef5f3f to 7dacf50 Compare October 30, 2023 11:26
@dotnet dotnet deleted a comment from dotnet-comment-bot Oct 30, 2023
@dotnet dotnet deleted a comment from dotnet-comment-bot Oct 30, 2023
@dotnet dotnet deleted a comment from dotnet-comment-bot Oct 30, 2023
@dotnet dotnet deleted a comment from dotnet-comment-bot Oct 30, 2023
@dotnet dotnet deleted a comment from dotnet-comment-bot Oct 30, 2023
@dotnet-comment-bot

This comment was marked as outdated.

@dotnet dotnet deleted a comment from dotnet-comment-bot Oct 30, 2023
@RussKie RussKie changed the title from "Test GitHub comment" to "Report code coverage back to GitHub" Oct 30, 2023
@RussKie RussKie marked this pull request as ready for review October 30, 2023 11:56
@dotnet dotnet deleted a comment from dotnet-comment-bot Oct 30, 2023
@sharwell

code coverage increase/decrease

This isn't a helpful metric. For pull request reviews, there are two valuable pieces of information:

  1. What lines of code are touched by the pull request, but are not covered? This helps a reviewer ask better questions about special cases that have been implemented in code.
  2. What changes occurred to lines of code that are not touched by the pull request? This helps a reviewer catch cases where high-level tests might no longer be covering interesting parts of an algorithm due to refactoring or the introduction of special cases. There are also a variety of other situations this can help with, but they are less common.

Other metrics, including any metrics that do not separate lines of code changed by a PR from other lines, are actively harmful due to their misleading nature. I would recommend not posting such metrics anywhere for review, but especially inside a PR.
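A minimal sketch of point 1 above: intersect the lines a PR changed with the lines tests never executed. The names `changed_lines` and `covered_lines` are illustrative; in practice they would be parsed from `git diff -U0` and from a coverage report (e.g. Cobertura XML), respectively.

```python
def uncovered_changes(changed_lines, covered_lines):
    """Per file, return the changed lines that no test executed."""
    result = {}
    for path, changed in changed_lines.items():
        missed = changed - covered_lines.get(path, set())
        if missed:
            result[path] = missed
    return result

# Example: one changed line in Parser.cs was never executed by tests.
changed = {"src/Parser.cs": {10, 11, 12, 40}}
covered = {"src/Parser.cs": {10, 11, 40}}
print(uncovered_changes(changed, covered))  # {'src/Parser.cs': {12}}
```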

@geeknoid
Member

@sharwell Are your recommendations different if the policy for the repo is that all code should have 100% test coverage?

@Tratcher
Member

Tratcher commented Oct 30, 2023

Other metrics, including any metrics that do not separate lines of code changed by a PR from other lines, are actively harmful due to their misleading nature. I would recommend not posting such metrics anywhere for review, but especially inside a PR.

Really? The high-level metrics seem helpful to me; it's good to know whether coverage is improving or regressing before digging into a long report. I would also want easy access to the report for details (e.g. linked from the same comment).

@RussKie
Contributor Author

RussKie commented Oct 30, 2023

code coverage increase/decrease

This isn't a helpful metric.

It is valuable in the context of this repo, which has established guidelines and procedures around code coverage. One of them is that coverage can't go down; if it's down, it's up to the author to figure out what went wrong.
And when it's up, the author should record the new value in the project file.
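In other words, the check amounts to comparing each project's measured coverage against the baseline recorded in its project file. A rough sketch of that comparison, with illustrative names and values (the actual bot implementation isn't shown in this thread):

```python
# `baselines` stands in for the MinCodeCoverage values recorded in the
# project files; `actuals` for the coverage measured in the current build.
baselines = {"Microsoft.Extensions.Http.AutoClient": 98.0}
actuals = {"Microsoft.Extensions.Http.AutoClient": 95.16}

for project, expected in baselines.items():
    actual = actuals[project]
    if actual < expected:
        # Coverage went down: the author has to figure out what went wrong.
        print(f"!! {project}: expected {expected}, actual {actual}")
    elif actual > expected:
        # Coverage went up: record the new value in the project file.
        print(f"{project}: raise MinCodeCoverage from {expected} to {actual}")
```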

I would also want easy access to the report for details (e.g. linked from the same comment).

Good idea. I was too tired last night to implement it.

@dotnet-comment-bot

This comment was marked as outdated.

@Tratcher
Member

Table format needs work 😆

@dotnet-comment-bot

This comment was marked as outdated.

@RussKie
Contributor Author

RussKie commented Oct 30, 2023

Table format needs work 😆

Indeed 😉

@RussKie RussKie marked this pull request as draft October 30, 2023 20:48
@dotnet-comment-bot

This comment was marked as outdated.

@dotnet-comment-bot
Collaborator

dotnet-comment-bot commented Oct 30, 2023

‼️ Found issues ‼️

| Project | Coverage Type | Expected | Actual |
| --- | --- | --- | --- |
| Microsoft.Extensions.Http.AutoClient | Line | 98 | 95.16 🔻 |

🎉 Good job! The coverage increased 🎉
Update MinCodeCoverage in the project files.

| Project | Expected | Actual |
| --- | --- | --- |
| Microsoft.Extensions.Diagnostics.ResourceMonitoring | 98 | 100 |
| Microsoft.Extensions.Diagnostics.Extra | 92 | 93 |

Full code coverage report: https://dev.azure.com/dnceng-public/public/_build/results?buildId=455807&view=codecoverage-tab

@RussKie RussKie changed the base branch from main to release/8.0 October 30, 2023 22:03
@RussKie RussKie marked this pull request as ready for review October 30, 2023 22:03
@RussKie
Contributor Author

RussKie commented Oct 30, 2023

Ready now. Rebased on top of release/8.0.

@dotnet-comment-bot
Collaborator

🎉 Good job! The coverage increased 🎉
Update MinCodeCoverage in the project files.

| Project | Expected | Actual |
| --- | --- | --- |
| Microsoft.Extensions.Telemetry | 92 | 93 |
| Microsoft.Extensions.Diagnostics.ResourceMonitoring | 98 | 100 |
| Microsoft.AspNetCore.Diagnostics.Middleware | 99 | 100 |

Full code coverage report: https://dev.azure.com/dnceng-public/public/_build/results?buildId=455823&view=codecoverage-tab

@RussKie RussKie merged commit 1e2ad7c into dotnet:release/8.0 Oct 31, 2023
@RussKie RussKie deleted the githubcomment branch October 31, 2023 02:46
@sharwell

sharwell commented Oct 31, 2023

@sharwell Are your recommendations different if the policy for the repo is that all code should have 100% test coverage?

"100% test coverage" can have many definitions. Regardless, the question seems more academic than practical. My suggestion was derived based on successful outcomes observed at scale, particularly surrounding the fact that many people have limited historical exposure to code coverage tools and almost always have been incorrectly instructed around aggregate metrics being a relevant indicator of [something] when in reality the differential code coverage is the only report you generally need to see.

Really? The high-level metrics seem helpful to me; it's good to know whether coverage is improving or regressing before digging into a long report. ...

If the report isn't roughly the same size as the pull request, then either you've been pointed to an unnecessarily long report or the pull request significantly changed the coverage of code not touched by the pull request. With filtering correctly applied, a report is only long if it's important.

As the repository grows in size, the coverage changes trend towards 0%, and eventually "improving" or "regressing" is an irrelevantly small number (for example, a fully covered 200-line change in a 500,000-line codebase moves the aggregate by at most 0.04 percentage points). It's best to just ignore it from the start, which both improves focus on items that actually matter and avoids misleading new users into thinking aggregate metrics are useful numbers.

A great example of an incorrectly filtered code coverage report can be viewed by clicking the link in the bot comment.

code coverage increase/decrease

This isn't a helpful metric.

It is valuable in the context of this repo, which has established guidelines and procedures around code coverage. One of them is that coverage can't go down; if it's down, it's up to the author to figure out what went wrong. And when it's up, the author should record the new value in the project file.

This is part of the point of my comments. When it's clear that a repository has problematic code coverage guidelines, it would be better to revisit those guidelines by teaching better ways to produce and consume code coverage information.

Maintaining these policies doesn't just hurt the repository where they are used; it also suggests to the broader community that the guidelines are helpful, and other teams are likely to adopt similar processes without realizing they tend to hurt developer productivity. While the guidelines seen here are not necessarily going to result in a buggy product, they are likely to increase product cost and timelines while reducing the ability to implement features past the MVP deliverable. In other words, the problems relate more to practical engineering efficiency than to theoretical correctness on an unbounded timeline.

@Tratcher
Member

you've been pointed to an unnecessarily long report

The report is very long, though I do like the project-level summary the bot shows now. I agree that some granularity is more meaningful than an overall metric, and the project level seems like a good compromise: it tells me where in the report to look for details.

@ghost ghost locked as resolved and limited conversation to collaborators Nov 30, 2023