
Don't wait for TaskRun to be observed Running. #4773

Merged
1 commit merged into tektoncd:main on Apr 20, 2022

Conversation

@mattmoor (Member) commented Apr 19, 2022

As explained in detail in the associated issue, there are multiple problems with expecting the test logic to observe the TaskRuns in a Running state, including the possibility that they never actually exist in that state in etcd when the cluster is busy and the controller gets backed up, or the cluster is simply underpowered.

Since observing the TaskRuns in a Running state isn't material to testing timeouts, this removes those intermediate checks and simply verifies that the TaskRuns reach their ultimate timed-out state.
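
To make the change concrete, here is a minimal, self-contained Go sketch of the pattern the tests move to. It does not use the repo's actual test helpers; waitForTerminal, getState, and isTimedOut are illustrative names only. The poller asserts nothing about intermediate states such as Running, only the terminal one, so it cannot flake when the controller moves a resource straight to its terminal state between polls.

package main

import (
	"context"
	"fmt"
	"time"
)

// waitForTerminal polls getState until isTerminal reports true or the context
// expires. It deliberately makes no assertion about intermediate states
// (e.g. "Running"): a busy cluster may never expose those states to a poller.
func waitForTerminal(ctx context.Context, getState func() string, isTerminal func(string) bool) error {
	ticker := time.NewTicker(time.Second)
	defer ticker.Stop()
	for {
		if state := getState(); isTerminal(state) {
			return nil
		}
		select {
		case <-ctx.Done():
			return fmt.Errorf("gave up waiting for terminal state: %w", ctx.Err())
		case <-ticker.C:
		}
	}
}

func main() {
	// Hypothetical stand-in for fetching a TaskRun's status from the cluster;
	// note that "Running" is never observed, which is exactly the situation
	// the old intermediate checks tripped over.
	states := []string{"Pending", "TaskRunTimeout"}
	i := 0
	getState := func() string {
		s := states[i]
		if i < len(states)-1 {
			i++
		}
		return s
	}

	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// Wait only for the terminal (timed-out) state.
	isTimedOut := func(s string) bool { return s == "TaskRunTimeout" }
	if err := waitForTerminal(ctx, getState, isTimedOut); err != nil {
		fmt.Println("flake:", err)
		return
	}
	fmt.Println("TaskRun reached its terminal timed-out state")
}

In the real tests this corresponds to dropping the "wait for Running" checks and keeping only the timed-out assertions.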

I could have sworn I fixed something like this before, but couldn't find it.

Fixes: #4772

/kind bug

Submitter Checklist

As the author of this PR, please check off the items in this checklist:

  • Docs included if any changes are user facing
  • Tests included if any functionality added or changed
  • Follows the commit message standard
  • Meets the Tekton contributor standards (including
    functionality, content, code)
  • Release notes block below has been filled in
    (if there are no user facing changes, use release note "NONE")

Release Notes

NONE

@tekton-robot tekton-robot added release-note-none Denotes a PR that doesn't merit a release note. kind/bug Categorizes issue or PR as related to a bug. size/M Denotes a PR that changes 30-99 lines, ignoring generated files. labels Apr 19, 2022
@mattmoor mentioned this pull request Apr 19, 2022
@mattmoor (Member Author)

Ok, I also just saw this hit on the PipelineRun wait, so I guess I should remove more of these:

timeout_test.go:96: Error waiting for PipelineRun pipeline-run-timeout-hfoiuuaa to be running: "pipeline-run-timeout-hfoiuuaa" already finished

As explained in detail in the associated issue, there are multiple problems with expecting the test logic to observe the `TaskRun`s in a `Running` state, including the possibility that they never actually exist in that state in etcd when the cluster is busy and the controller gets backed up, or the cluster is simply underpowered.

Since observing the `TaskRun`s `Running` isn't material to testing timeouts, this removes those intermediate checks and simply verifies that the `TaskRun`s reach their ultimate timed-out state.

These same issues apply to the `PipelineRun` as well, and I also just observed that flaking, so hopefully this addresses all of them!

Fixes: tektoncd#4772
@tekton-robot tekton-robot added size/L Denotes a PR that changes 100-499 lines, ignoring generated files. and removed size/M Denotes a PR that changes 30-99 lines, ignoring generated files. labels Apr 19, 2022
Comment on lines +94 to +96
t.Logf("Waiting for PipelineRun %s in namespace %s to be timed out", pipelineRun.Name, namespace)
if err := WaitForPipelineRunState(ctx, c, pipelineRun.Name, timeout, FailedWithReason(v1beta1.PipelineRunReasonTimedOut.String(), pipelineRun.Name), "PipelineRunTimedOut"); err != nil {
t.Errorf("Error waiting for PipelineRun %s to finish: %s", pipelineRun.Name, err)
@mattmoor (Member Author)

I moved these checks up from below, so they execute before the taskrunList below is determined, lest we hit another race creating the taskruns 😅

🤞 this fixes these once and for all 🤞
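
A rough sketch of that ordering follows, using hypothetical stand-ins (pipelineRunTimedOut, listTaskRunsFor, and taskRunStatus are illustrative names, not the repo's helpers): wait for the parent PipelineRun to reach its terminal state first, and only then enumerate and assert on its TaskRuns, so the listing cannot race with TaskRuns that are still being created by the controller.

package main

import (
	"context"
	"fmt"
)

// taskRunStatus is a minimal stand-in for a TaskRun's observed status.
type taskRunStatus struct {
	Name     string
	TimedOut bool
}

// pipelineRunTimedOut and listTaskRunsFor are hypothetical stubs standing in
// for the repo's wait helpers and Tekton clients.
func pipelineRunTimedOut(ctx context.Context, name string) error { return nil }

func listTaskRunsFor(ctx context.Context, pipelineRun string) ([]taskRunStatus, error) {
	return []taskRunStatus{{Name: "task-run-timeout", TimedOut: true}}, nil
}

// checkTimeout shows the ordering only: wait for the parent PipelineRun to be
// terminal *before* listing its TaskRuns.
func checkTimeout(ctx context.Context) error {
	// 1. Terminal (timed-out) PipelineRun first: its set of child TaskRuns is
	//    now stable.
	if err := pipelineRunTimedOut(ctx, "pipeline-run-timeout"); err != nil {
		return fmt.Errorf("PipelineRun never timed out: %w", err)
	}
	// 2. Only now enumerate the TaskRuns; doing this before step 1 could
	//    observe a partially created set.
	taskRuns, err := listTaskRunsFor(ctx, "pipeline-run-timeout")
	if err != nil {
		return err
	}
	// 3. Each TaskRun should also have reached its timed-out state.
	for _, tr := range taskRuns {
		if !tr.TimedOut {
			return fmt.Errorf("TaskRun %s did not time out", tr.Name)
		}
	}
	return nil
}

func main() {
	if err := checkTimeout(context.Background()); err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println("all runs timed out in order")
}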

@mattmoor (Member Author)

/test pull-tekton-pipeline-go-coverage

TLS handshake 🙃

@abayer (Contributor) commented Apr 19, 2022

/lgtm

Death to flakes!

@tekton-robot tekton-robot added the lgtm Indicates that a PR is ready to be merged. label Apr 19, 2022
@imjasonh (Member)

/lgtm

@mattmoor (Member Author)

/test pull-tekton-pipeline-integration-tests
/test pull-tekton-pipeline-build-tests

Prow seems to have lost these 🤔

@mattmoor (Member Author)

/test pull-tekton-pipeline-integration-tests

TLS timeouts

@mattmoor (Member Author)

/test pull-tekton-pipeline-integration-tests

sidecar test 🙃

@mattmoor (Member Author)

cc @afrittoli @vdemeester

@tekton-robot (Collaborator)

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: vdemeester

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@tekton-robot tekton-robot added the approved Indicates a PR has been approved by an approver from all required OWNERS files. label Apr 20, 2022
@tekton-robot tekton-robot merged commit 5fa2b68 into tektoncd:main Apr 20, 2022
@mattmoor mattmoor deleted the dont-wait-for-running branch April 20, 2022 14:31
Linked issue: Timeout tests expect to observe all task states (#4772)