Capture and iterate on KEP template feedback #822
/assign @mattfarina
I'd like to add that the relationship between the KEP and the enhancements issue is confusing. In my opinion, the KEP should capture the design, while the issue tracker focuses on status and tracking. With that split, the "Release Signoff Checklist" should be moved to the issue tracker (perhaps a separate one for each release stage?).
Issues go stale after 90d of inactivity. Mark the issue as fresh with /remove-lifecycle stale. Stale issues rot after an additional 30d of inactivity and eventually close. If this issue is safe to close now please do so with /close. Send feedback to sig-testing, kubernetes/test-infra and/or fejta.
Stale issues rot after 30d of inactivity. Mark the issue as fresh with /remove-lifecycle rotten. Rotten issues close after an additional 30d of inactivity. If this issue is safe to close now please do so with /close. Send feedback to sig-testing, kubernetes/test-infra and/or fejta.
/remove-lifecycle rotten
Enhancement issues opened in kubernetes/enhancements should never be marked as frozen. Enhancement Owners can ensure that enhancements stay fresh by consistently updating their states across release cycles.
/remove-lifecycle frozen
I'm planning to also add an "Open Questions" section to KEPs, which we can capture bikesheds in.
Some of the "motivation" sections read like a mix of:
It seems like we want to encourage the first two of those and ensure they're agreed upon; the third slipping in is a bikeshed magnet and the wrong frame of mind, IMHO. Some of the other details probably belong in other existing portions of the template.
I wonder if specific headings like "Background" and "Benefits To The Project / Community", or something along those lines, might be clearer (and also somewhat distinct?).
I think we might want to call out more emphatically that KEPs should explain how this benefits users. IMHO that's the most important part, and it isn't particularly emphasized at the moment.
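To make that concrete, a rough sketch of what those motivation headings could look like in the template (heading names taken from the suggestions above, purely illustrative):

```markdown
## Background

<!-- Context a reviewer needs: the current state, prior art, related KEPs,
     and the limitation being addressed. -->

## Benefits to Users

<!-- The most important part: what concrete problem does this solve for
     users, and how will they notice the improvement? -->

## Benefits to the Project / Community

<!-- Maintainability, consistency, contributor experience, and other
     project-level wins. -->
```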
Making the test plan section of the KEP template more specific would be really helpful. I'd like to see links to testgrid, filtered to the tests for the specific component, in the KEP. (This requires picking a feature keyword ahead of time for e2e/integration tests, etc., but shouldn't be too difficult.) That makes it really easy for someone looking at a given KEP to follow the thread and see how many tests there are for the feature, how flaky they are, and so on.
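As a rough sketch of that suggestion, a more specific test plan section might look like the following (the feature keyword and testgrid dashboard names here are hypothetical placeholders, not real dashboards):

```markdown
## Test Plan

- Feature keyword: `[Feature:FooBar]`, chosen before implementation so that
  e2e and integration tests can be tagged and filtered consistently.
- e2e tests: https://testgrid.k8s.io/sig-foo#e2e-tests (filtered to `FooBar`)
- Integration tests: link to the testgrid tab or triage query filtered by
  the feature keyword.
- Unit tests: coverage expectations for any new packages.
```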
/remove-lifecycle stale
one more: the … cc: @saschagrunert
/remove-lifecycle stale
Recently in Enhancements we've been talking about implementing changes suggested in this issue, including @liggitt's call for greater specificity in the testing plan section. To that end, we'd like to get more feedback/input from SIGs Arch and Testing about what a more detailed testing plan should include. Wondering if we might generate some ideas here?
From the SIG-Arch meeting on October 22nd:
The test-grid link is: https://testgrid.k8s.io/sig-arch-conformance#apisnoop-conformance-gate |
Any updates here? @mattfarina do you still want to be listed as an assignee?
/unassign @mattfarina
Issues go stale after 90d of inactivity. Mark the issue as fresh with /remove-lifecycle stale. Stale issues rot after an additional 30d of inactivity and eventually close. If this issue is safe to close now please do so with /close. Send feedback to sig-contributor-experience at kubernetes/community.
Stale issues rot after 30d of inactivity. Mark the issue as fresh with /remove-lifecycle rotten. Rotten issues close after an additional 30d of inactivity. If this issue is safe to close now please do so with /close. Send feedback to sig-contributor-experience at kubernetes/community.
The Kubernetes project currently lacks enough active contributors to adequately respond to all issues and PRs. This bot triages issues and PRs according to the following rules:
- After 90d of inactivity, lifecycle/stale is applied
- After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
- After 30d of inactivity since lifecycle/rotten was applied, the issue is closed

You can:
- Reopen this issue with /reopen
- Mark this issue as fresh with /remove-lifecycle rotten
- Offer to help out with Issue Triage

Please send feedback to sig-contributor-experience at kubernetes/community.
/close
@k8s-triage-robot: Closing this issue. In response to this:
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
Action items (AIs) from #703:
(@jbeda) "Biggest comment is that this is no longer really a template but rather instructions. Perhaps break out into a new doc and/or update https://github.com/kubernetes/enhancements/blob/master/keps/0001-kubernetes-enhancement-proposal-process.md?
Or merge this into the first kep?"
(@mattfarina) "How do we deal with differing release processes? For example, when kubectl is broken out into its own repo and has a release process and timing that is different from k8s/k8s?"
(@lavalamp) On filing Enhancement Issues, in addition to writing KEPs... "Honestly from the perspective of people wanting to file KEPs this seems like obscure busywork. Why don't we make a bot to file these issues?"
(@pbarker) "Questions have been raised here over adding iterative features. If a KEP is marked as 'implementable' and then a feature is added, it inherits the implementable status. Would be nice to have some structure around these changes if it makes sense to do in this revision."
(@lavalamp) "I still don't really understand why the "approvers" and "reviewers" section of KEPs exist.
If it's about people who are supposed to approve/review the KEP itself, don't we have owners files for that? KEPs are in separate directories now.
If it's about people who are going to help approve and/or review the code changes (e.g., demonstrating that you've lined up sufficient bandwidth in the schedules of busy folks to actually get the changes made), it's probably named wrong.
Actually IMO it's totally ambiguous right now and that section of the KEP could either be renamed or have a comment to make it clear what it means."
(@lavalamp) "Are KEPs design docs or requirements docs?
People have said "both" but I'm not convinced that's a good answer. There is no reason to suppose that the set of people who know what good goals are has a lot of overlap with the set of people who will be good at charting a path to achieving the goals.
I personally feel that it's reasonable to hold a vote on goals (i.e. requirements), as long as they're appropriately phrased (e.g., "is problem X an important problem for the project to solve" NOT "is changing the frobber API to interoperate with the thingy a good idea"). It is not a good idea to vote on solutions (designs), that should be handled by the right technical folks. (I think any formal specification of this distinction can be gamed, unfortunately.)"
(@mattfarina) "Could we explicitly state that a single KEP covers all maturity levels through graduation, and that the graduation criteria should be described for the transition between each level? The examples below share this, but some explicit direction could help clear up what @bgrant0607 noticed and which, I realized, may only be implicitly communicated."
(@justaugustus) "@mattfarina -- I'm wondering if that's a better fit for the front documentation on KEPs (which I'm planning to work on once this lands)?
A KEP captures details of an enhancement and is meant to be the steel thread to capture all implementation states. There should only be one KEP per enhancement.
Happy to add something if that thought isn't really captured well."
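As a sketch of the per-maturity-level graduation criteria @mattfarina describes, such a KEP section might look like this (the specific criteria below are illustrative only, not prescribed by the template):

```markdown
## Graduation Criteria

### Alpha -> Beta

- e2e tests exist, are tagged with the feature keyword, and have been
  stable for at least one release.
- Initial user feedback has been gathered and addressed.

### Beta -> GA

- Feature has been enabled by default for at least one release with no
  major open bug reports.
- Conformance tests added, if applicable.
```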
/sig pm
/assign
/milestone keps-beta