Enable previews in our integration tests for compute engines #1177


Merged

Conversation

hsubbaraj-spiral
Contributor

Describe your changes and why you are making these changes

Enable previews in our integration tests for compute engines

Related issue number (if any)

https://linear.app/aqueducthq/issue/ENG-2720/enable-previews-in-our-integration-tests-to-also-run-on-external

Loom demo (if any)

Checklist before requesting a review

  • I have created a descriptive PR title. The PR title should complete the sentence "This PR...".
  • I have performed a self-review of my code.
  • I have included a small demo of the changes. For the UI, this would be a screenshot or a Loom video.
  • If this is a new feature, I have added unit tests and integration tests.
  • I have run the integration tests locally and they are passing.
  • I have run the linter script locally (See python3 scripts/run_linters.py -h for usage).
  • All features on the UI continue to work correctly.
  • Added one of the following CI labels:
    • run_integration_test: Runs integration tests
    • skip_integration_test: Skips integration tests (Should be used when changes are ONLY documentation/UI)

@hsubbaraj-spiral hsubbaraj-spiral requested a review from kenxu95 April 5, 2023 23:47
@hsubbaraj-spiral hsubbaraj-spiral added the run_integration_test Triggers integration tests label Apr 5, 2023
@hsubbaraj-spiral hsubbaraj-spiral removed the request for review from kenxu95 April 5, 2023 23:56
@hsubbaraj-spiral hsubbaraj-spiral requested a review from kenxu95 April 6, 2023 00:45
assert "compute" in test_credentials, "compute section expected in test-credentials.yml"
assert name in test_credentials["compute"].keys(), "%s not in test-credentials.yml." % name

if "enable_previews" in test_credentials["compute"][name].keys():
Contributor
Maybe we can just return "enable_previews" not in test_credentials["compute"][name].keys(). It should be pretty clear what is being returned from the docstring.

global_config({"engine": request.param, "lazy": lazy_config})
return request.param
else:
return None
Contributor

I was actually thinking we could make a separate autouse fixture for this (so the user doesn't have to explicitly include engine for it to trigger). This fixture would essentially perform this logic here, yield, and then reset the global configuration at the end.

This makes the fixture more understandable I think, since its focus is just to handle the global config lifecycle. It also means that tests that don't include engine can benefit from it.

Contributor

Also, if we don't reset at the end, we may not be able to handle parallelized tests with multiple compute engines (e.g. aqueduct_engine and k8s), since the former must have previews enabled and the latter may not.

Contributor Author

By this, you mean just the global_config({"lazy"}) part? So the new fixture would just focus on lazy execution and I'd keep the logic for the engine here.

Contributor

mmm I was thinking everything, both the engine and config. Essentially, this fixture translates what we have in test-config.yml to the appropriate global_config() state. I think it's better to separate all that from the engine fixture, which has to be included explicitly by the user.

@@ -325,6 +325,24 @@ def _fetch_integration_credentials(section: str, name: str) -> Dict[str, Any]:
return test_credentials[section][name]


def lazy_configuration(name: str) -> bool:
Contributor

The logical jump from enable_previews to "should have lazy execution" actually took me a while to get. I think it would be good to explain in the fixture why external compute engines need to have their execution set to lazy for previews.

I think part of the confusion was that the docstring and name of this function attempt to explain things beyond its scope. I'd prefer if that explanation lived in the caller, and this method focused on just checking whether enable_previews is set - probably renaming it to is_preview_enabled() or something.

@hsubbaraj-spiral hsubbaraj-spiral requested a review from kenxu95 April 6, 2023 22:48
Contributor

@kenxu95 kenxu95 left a comment

Cool! But please address the last few polish comments :)

@@ -132,6 +133,18 @@ def engine(request, pytestconfig):
return request.param if request.param != "aqueduct_engine" else None


@pytest.fixture(scope="function", autouse=True)
def set_global_configs(engine):
Contributor

super nit: set_global_config() sounds better to me since there's technically only one global config?

@pytest.fixture(scope="function", autouse=True)
def set_global_configs(engine):
if engine is not None:
# We set lazy to False if previews are enabled, True otherwise.
Contributor

Can we sort of expand this comment to not just echo the code, but provide the reasoning behind the relationship between lazy and enabled preview? I'd like to know here that we set execution to lazy when external compute is involved and previews are not enabled, because then all the non-publish stuff goes through the local engine, which wouldn't really test the external compute.

I'm ok with more words here if they explain non-intuitive concepts.

@@ -325,6 +325,20 @@ def _fetch_integration_credentials(section: str, name: str) -> Dict[str, Any]:
return test_credentials[section][name]


def is_preview_enabled(name: str) -> bool:
"""
Checks config to see if given compute integration should have lazy execution.
Contributor

let's not mention "lazy execution" here, it's confusing. I'm a fan of this docstring just being:
"Returns whether the given compute integration has enable_previews set."
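Putting the rename and docstring suggestions together, the helper might look like the following sketch. The credentials dict is passed explicitly here to keep the example self-contained; the real function reads it from test-credentials.yml.

```python
from typing import Any, Dict

def is_preview_enabled(test_credentials: Dict[str, Any], name: str) -> bool:
    """Returns whether the given compute integration has enable_previews set."""
    assert "compute" in test_credentials, "compute section expected in test-credentials.yml"
    assert name in test_credentials["compute"], "%s not in test-credentials.yml." % name
    # Mirror the original check: presence of the key is what matters.
    return "enable_previews" in test_credentials["compute"][name]
```

The caller is then responsible for the lazy-execution reasoning, keeping this helper a plain config lookup.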

@hsubbaraj-spiral hsubbaraj-spiral merged commit 1d79fcf into main Apr 6, 2023
@hsubbaraj-spiral hsubbaraj-spiral deleted the eng-2720-enable-previews-in-our-integration-tests branch April 6, 2023 23:43