Extend e2e-tests: Add resources tests #1342

Closed
elikatsis wants to merge 4 commits from the resource-e2e-test branch

Conversation


@elikatsis elikatsis commented May 16, 2019

I think we should start moving to v2.3 already by using the latest pre-release (v2.3.0-rc3). It comes with a lot of features and bug fixes.

This way we pick up those fixes right away, and I'm sure there is more we can take advantage of. We can also start deprecating features such as the global volumes attribute, in favor of template-local volumes.

I have been using this release extensively and it seems to be working fine.

I'm making this PR to run the E2E tests. I've set the deployment scripts to deploy Argo v2.3.0-rc3.
Then, if all tests are successful, I suggest updating the kubeflow repo's Argo deployment, since the 0.1.20 release fully covers #1250.
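The deployment-script change boils down to pinning the Argo version used by the test install. A minimal sketch; the variable name and manifest path are illustrative, the actual script (test/install-argo.sh) may differ:

```shell
# Hypothetical sketch: pin the Argo version deployed for the E2E tests.
ARGO_VERSION="v2.3.0-rc3"
MANIFEST_URL="https://raw.githubusercontent.com/argoproj/argo/${ARGO_VERSION}/manifests/install.yaml"
echo "${MANIFEST_URL}"
# The install script would then apply it, e.g.:
# kubectl apply -n argo -f "${MANIFEST_URL}"
```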

WDYT?

/cc @Ark-kun @vicaire @hongye-sun

If everything works, this PR closes #1250.



@elikatsis
Member Author

/ok-to-test

@elikatsis
Member Author

/test kubeflow-pipelines-e2e-test

@elikatsis
Member Author

/cc @IronPan @gaoning777

@elikatsis elikatsis force-pushed the resource-e2e-test branch 4 times, most recently from b324cd1 to c47e3a3 Compare May 16, 2019 11:42
@elikatsis
Member Author

elikatsis commented May 16, 2019

Due to pipeline-runner permissions, testing the samples fails.
I need to make some changes to the samples.
I.e. instead of creating a Secret in resourceop_basic.py I'm thinking of making it create a Pod, so that I don't change any other permissions.

[Edit: I'll keep the sample as is because it offers extra value for pvolumes usage.
I opened kubeflow/kubeflow#3294 to extend pipeline-runner's permissions]

Also, I have to cherry-pick kubeflow/kubeflow#2556 into the pipelines branch (I made the PR against master back then).

@elikatsis
Member Author

Could you take a look at kubeflow/kubeflow#3293 ?

@hongye-sun
Contributor

@elikatsis, I'd suggest changing the resourceop_basic sample to demonstrate a more realistic use case, like what the Argo sample does: creating a k8s Job. It doesn't seem right to expose cluster-wide Secret create/delete permissions to the pipeline runner.

You could optionally demo pvolumes usage by orchestrating them with a Job spec, which would be a very useful sample for answering questions like #1345.

WDYT?

@elikatsis
Member Author

@hongye-sun
[I've continued the discussion on Secret permissions in kubeflow/kubeflow#3294.]

I can add a sample with some ContainerOps and a ResourceOp creating a Job, all of them using a PipelineVolume, but it would have to be a little complex and not really an example of "basic usage".
I believe the existing sample is just right to introduce users to the concept.
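For reference, such a sample would center on a batch/v1 Job manifest that a ResourceOp creates, with the Job mounting a PVC provisioned by a preceding VolumeOp. A minimal sketch of the manifest as a plain dict; the resource names, image, and command here are hypothetical, not taken from the actual samples:

```python
# Hypothetical sketch: a batch/v1 Job manifest that a dsl.ResourceOp could
# create, mounting a PVC provisioned earlier in the pipeline by a VolumeOp.
def job_manifest(pvc_name: str) -> dict:
    """Build a k8s Job that writes to the shared volume mounted at /data."""
    return {
        "apiVersion": "batch/v1",
        "kind": "Job",
        "metadata": {"name": "resourceop-job"},
        "spec": {
            "template": {
                "spec": {
                    "containers": [{
                        "name": "main",
                        "image": "library/bash:4.4.23",
                        "command": ["sh", "-c", "echo hello > /data/out.txt"],
                        "volumeMounts": [{"name": "workdir",
                                          "mountPath": "/data"}],
                    }],
                    "volumes": [{
                        "name": "workdir",
                        "persistentVolumeClaim": {"claimName": pvc_name},
                    }],
                    "restartPolicy": "Never",
                }
            },
            "backoffLimit": 2,
        },
    }

if __name__ == "__main__":
    print(job_manifest("my-pvc")["kind"])
```

In the pipeline itself, this dict would be passed as the `k8s_resource` of a `dsl.ResourceOp`, with `pvc_name` coming from the output of a `dsl.VolumeOp`.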

* Change the Argo version deployed from v2.2.0 to v2.3.0-rc3
* Add resourceop test: executes the resourceop_basic.py pipeline.
  Extends run_basic_test.py with a '--params' argument.
* Add volumeop test: executes the volumeop_sequential.py pipeline
* Change access modes from RWM to RWO for the VolumeOp in volumeop_sequential.py
* We cannot use any other sample with VolumeOps because GCP does not
  support the ReadWriteMany access mode. We also cannot use a
  VolumeSnapshotOp sample, because VolumeSnapshots only exist in Alpha
  GCP clusters.
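The '--params' extension mentioned above could be parsed roughly as follows; a sketch only, since the exact flag format and how run_basic_test.py forwards the values are assumptions, not taken from the real script:

```python
# Hypothetical sketch of a '--params' argument for run_basic_test.py:
# a JSON object of pipeline parameters forwarded to the pipeline run.
import argparse
import json

def parse_args(argv=None):
    parser = argparse.ArgumentParser(
        description="Run a sample pipeline as an E2E test.")
    parser.add_argument("--testname", required=True,
                        help="Name of the sample pipeline to run.")
    parser.add_argument(
        "--params",
        type=json.loads,
        default={},
        help='JSON object of pipeline parameters, e.g. \'{"size": "1Gi"}\'.')
    return parser.parse_args(argv)

if __name__ == "__main__":
    args = parse_args(
        ["--testname", "resourceop_basic", "--params", '{"action": "create"}'])
    # The dict would then be passed along when submitting the run.
    print(args.testname, args.params)
```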

Signed-off-by: Ilias Katsakioris <[email protected]>
Signed-off-by: Ilias Katsakioris <[email protected]>
@@ -56,4 +56,11 @@ ${KUBEFLOW_SRC}/scripts/kfctl.sh apply platform
${KUBEFLOW_SRC}/scripts/kfctl.sh generate k8s
${KUBEFLOW_SRC}/scripts/kfctl.sh apply k8s

pushd ks_app

Member Author

Sure, I saw kubeflow/kubeflow#3335 so I pushed the removal of these lines, along with the corresponding change in test/install-argo.sh.

Let's merge this upon the aforementioned PR's approval. Would that be ok?

Signed-off-by: Ilias Katsakioris <[email protected]>
@k8s-ci-robot k8s-ci-robot removed the lgtm label May 22, 2019
@k8s-ci-robot
Contributor

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by: hongye-sun
To fully approve this pull request, please assign additional approvers.
We suggest the following additional approver: ark-kun

If they are not already assigned, you can assign the PR to them by writing /assign @ark-kun in a comment when ready.

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@elikatsis
Member Author

/test kubeflow-pipeline-e2e-test

@elikatsis
Member Author

Would you mind checking the errors? They seem to be related to the deletion of the deployment.

@hongye-sun
Contributor

/retest

@hongye-sun
Contributor

@IronPan, are you aware of any known test issues recently?

@Ark-kun
Contributor

Ark-kun commented May 24, 2019

Would you mind checking the errors? It must be something regarding the deletion of the deployment.

Looks like leaking resources have used up all the quota.

/test kubeflow-pipeline-e2e-test

@elikatsis
Member Author

Cool, it seems that got fixed and the tests succeeded.

@elikatsis
Member Author

@Ark-kun, you mentioned leaking resources; is there something we could do in the tests to prevent that?

@Ark-kun
Contributor

Ark-kun commented Jun 1, 2019

/test kubeflow-pipeline-sample-test

@elikatsis
Member Author

/hold
Until #1433 is merged

@Ark-kun
Contributor

Ark-kun commented Aug 1, 2019

/retest

@Ark-kun
Contributor

Ark-kun commented Aug 1, 2019

Generally LGTM, but the test probably belongs to sample tests. We've recently upgraded sample_test.yaml and it's now easier to add more tests.
/lgtm
/cc @gaoning777

@k8s-ci-robot
Contributor

@Ark-kun: GitHub didn't allow me to request PR reviews from the following users: gaoning777.

Note that only kubeflow members and repo collaborators can review this PR, and authors cannot review their own PRs.

In response to this:

Generally LGTM, but the test probably belongs to sample tests. We've recently upgraded sample_test.yaml and it's now easier to add more tests.
/lgtm
/cc @gaoning777

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.

@k8s-ci-robot
Contributor

k8s-ci-robot commented Aug 1, 2019

@elikatsis: The following test failed, say /retest to rerun them all:

Test name: kubeflow-pipeline-e2e-test
Commit: 105da34
Rerun command: /test kubeflow-pipeline-e2e-test

Full PR test history. Your PR dashboard. Please help us cut down on flakes by linking to an open issue when you hit one in your PR.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository. I understand the commands that are listed here.

@Ark-kun Ark-kun assigned gaoning777 and numerology and unassigned Ark-kun Aug 3, 2019
@elikatsis
Member Author

Closing this since testing has changed a lot.

I opened #2019 to cover a part of it and more PRs will follow.

@elikatsis elikatsis closed this Sep 2, 2019
@elikatsis elikatsis deleted the resource-e2e-test branch March 9, 2020 14:48

Successfully merging this pull request may close these issues.

Issues on Argo 2.3
8 participants