Enable previews in our integration tests for compute engines #1177
Conversation
integration_tests/sdk/aqueduct_tests/test_functions/simple/model.py
Outdated
assert "compute" in test_credentials, "compute section expected in test-credentials.yml" | ||
assert name in test_credentials["compute"].keys(), "%s not in test-credentials.yml." % name | ||
|
||
if "enable_previews" in test_credentials["compute"][name].keys(): |
Maybe we can just return "enable_previews" not in test_credentials["compute"][name].keys(). It should be pretty clear what is being returned from the docstring.
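For illustration, the suggested one-liner in context would look roughly like the sketch below. The credentials loading is a hypothetical placeholder, not the PR's actual helper:

```python
def lazy_configuration(name: str) -> bool:
    """Returns True if the given compute integration should use lazy execution,
    i.e. when `enable_previews` is not set for it in test-credentials.yml."""
    # _parse_credentials_file() is a hypothetical stand-in for however the
    # integration tests actually load test-credentials.yml.
    test_credentials = _parse_credentials_file()
    assert "compute" in test_credentials, "compute section expected in test-credentials.yml"
    assert name in test_credentials["compute"].keys(), "%s not in test-credentials.yml." % name
    return "enable_previews" not in test_credentials["compute"][name].keys()
```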
integration_tests/sdk/conftest.py
Outdated
global_config({"engine": request.param, "lazy": lazy_config}) | ||
return request.param | ||
else: | ||
return None |
I was actually thinking we could make a separate fixture that is autoused for this (so the user doesn't have to explicitly include engine for it to trigger). This fixture would essentially perform this logic here, yield, and then reset the global configuration at the end.
This makes the fixture more understandable I think, since its focus is just to handle the global config lifecycle. It also means tests that don't include engine can benefit from this.
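A minimal sketch of the kind of autouse fixture being suggested, assuming aqueduct's global_config() accepts the "engine" and "lazy" keys already used in this diff; the fixture name and the reset value at the end are illustrative, not confirmed SDK defaults:

```python
import pytest
from aqueduct import global_config


@pytest.fixture(scope="function", autouse=True)
def set_global_config(engine):
    """Translates the engine requested in test-config.yml into global_config() state,
    then resets that state once the test finishes."""
    if engine is not None:
        # lazy_configuration() is the helper discussed above; it is assumed to be
        # importable here and to return True when previews are NOT enabled.
        global_config({"engine": engine, "lazy": lazy_configuration(engine)})

    yield

    # Reset so that other tests (possibly targeting a different compute engine)
    # start from a clean configuration. The default value here is an assumption.
    global_config({"lazy": False})
```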
Also, if we don't reset at the end, we may not be able to handle parallelized tests with multiple compute engines (e.g. aqueduct_engine and k8s), since the former must have previews enabled while the latter may not.
By this, you mean just the global_config({"lazy"}) part? So the new fixture would just focus on lazy execution and I'd keep the logic for the engine here.
Mmm, I was thinking everything, both the engine and config. Essentially, this fixture translates what we have in test-config.yml to the appropriate global_config() state. I think it's better to separate all that from the engine fixture, which has to be included explicitly by the user.
@@ -325,6 +325,24 @@ def _fetch_integration_credentials(section: str, name: str) -> Dict[str, Any]:
    return test_credentials[section][name]


def lazy_configuration(name: str) -> bool:
The logical jump from enable_previews to "should have lazy execution" actually took me a while to get. I think it would be good to explain in the fixture why external compute engines need their execution set to lazy when previews aren't enabled.
I think part of the confusion was that the docstring and name of this function try to explain things beyond its scope. I'd prefer that explanation to live in the caller, with this method focused on just checking whether enable_previews is set - probably renaming it to is_preview_enabled() or something.
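Combining that with the earlier one-liner suggestion, the renamed helper could be as small as the sketch below (the credentials loading is again a hypothetical placeholder):

```python
def is_preview_enabled(name: str) -> bool:
    """Returns whether the given compute integration has `enable_previews` set."""
    test_credentials = _parse_credentials_file()  # hypothetical loader for test-credentials.yml
    assert "compute" in test_credentials, "compute section expected in test-credentials.yml"
    assert name in test_credentials["compute"].keys(), "%s not in test-credentials.yml." % name
    return "enable_previews" in test_credentials["compute"][name].keys()
```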
Cool! But please address the last few polish comments :)
integration_tests/sdk/conftest.py
Outdated
@@ -132,6 +133,18 @@ def engine(request, pytestconfig):
    return request.param if request.param != "aqueduct_engine" else None


@pytest.fixture(scope="function", autouse=True)
def set_global_configs(engine):
super nit: set_global_config() sounds better to me since there's technically only one global config?
integration_tests/sdk/conftest.py
Outdated
@pytest.fixture(scope="function", autouse=True)
def set_global_configs(engine):
    if engine != None:
        # We set lazy to False if previews are enabled, True otherwise.
Can we expand this comment so that it doesn't just echo the code, but gives the reasoning behind the relationship between lazy execution and enabled previews? I'd like to know here that we set execution to lazy when external compute is involved and previews are not enabled, because otherwise all the non-publish stuff goes through the local engine, which wouldn't really test the external compute.
I'm ok with more words here if they explain non-intuitive concepts.
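One way the expanded comment could read, paraphrasing the reasoning above; the surrounding fixture code is reproduced only for context and is a sketch rather than the PR's exact code:

```python
@pytest.fixture(scope="function", autouse=True)
def set_global_config(engine):
    if engine is not None:
        # Only force lazy execution when previews are NOT enabled for this external
        # compute engine. If execution stayed eager in that case, every non-publish
        # (preview) call would run on the local engine instead, so the test would
        # not actually exercise the external compute.
        global_config({"engine": engine, "lazy": not is_preview_enabled(engine)})
    yield
```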
@@ -325,6 +325,20 @@ def _fetch_integration_credentials(section: str, name: str) -> Dict[str, Any]:
    return test_credentials[section][name]


def is_preview_enabled(name: str) -> bool:
    """
    Checks config to see if given compute integration should have lazy execution.
let's not mention "lazy execution" here, it's confusing. I'm a fan of this docstring just being: "Returns whether the given compute integration has enable_previews set."
Describe your changes and why you are making these changes
Enable previews in our integration tests for compute engines
Related issue number (if any)
https://linear.app/aqueducthq/issue/ENG-2720/enable-previews-in-our-integration-tests-to-also-run-on-external
Loom demo (if any)
Checklist before requesting a review
I have run the linter (see python3 scripts/run_linters.py -h for usage).
run_integration_test: Runs integration tests
skip_integration_test: Skips integration tests (should be used when changes are ONLY documentation/UI)