
[v2-10-test] Backport pull_request_target removal #45527

Merged
merged 1 commit into from
Jan 12, 2025

Conversation


@potiuk potiuk commented Jan 9, 2025

This is a bulk change that synchronizes dev/ci scripts for v2-10-test branch with main #45266 - including follow-ups.

Rather than cherry-picking relevant PRs, this one gets the latest version of the scripts from main and updates the branch with some changes to adapt them to v2-10-test (such as bringing back Python 3.8 support, removing some provider checks after the bulk move of providers, and making sure all tests are passing).

This is far easier than cherry-picking the changes, because for v2-10-test we stopped cherry-picking CI changes, which was deemed unnecessary (we used to do it for all previous branches), but that made it far more difficult (if not impossible) to cherry-pick individual changes.

Fortunately, the CI scripts are maintained in such a way that their latest version should in principle work for a v2-* branch, and hopefully after just a few adjustments we should be able to synchronize the changes from main by updating all relevant CI/dev scripts, Dockerfile images, workflows, pre-commits etc.



@potiuk potiuk force-pushed the apply-workflow-run-removal branch 2 times, most recently from 7fa5856 to 338d156 Compare January 10, 2025 17:31
@potiuk potiuk force-pushed the apply-workflow-run-removal branch 14 times, most recently from 6dbc0af to c6b7c5a Compare January 11, 2025 19:04
@potiuk potiuk marked this pull request as ready for review January 11, 2025 19:07

potiuk commented Jan 11, 2025

Hey there. I know this one is huge and difficult to review, but this was the easiest way I could bring the `pull_request_target` removal to the v2-10-test branch.

Since we stopped cherry-picking breeze changes to v2-10-test and made a LOT of changes in main (removing Python 3.8, moving providers, adding test_sdk and so on), cherry-picking individual commits was not an option. So I chose a different path - I copied the latest breeze, ci scripts, Dockerfiles and .pre-commits and adapted them to v2-10-test - mostly removing stuff that was not needed in v2-10-test (providers, charts, new API etc.) and adding back Python 3.8.

All other changes were the result of fixing the tests.

I think the easiest way to review it is two-fold:

  1. you can compare all the breeze/ci stuff with main - and see the differences (mostly removals of the things above)
  2. then you can compare "airflow" and "tests" with v2-10-test and see that they only changed to accommodate some test script changes.

I know I am asking a lot, but this is the easiest way we can remove the last remnants of `pull_request_target` - which is still a potential security issue.
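For context on the security concern: with `pull_request_target`, GitHub runs the workflow with a read/write token and access to repository secrets even for pull requests from forks, which is why combining it with any checkout of PR code is dangerous. A minimal, hypothetical grep-style check (not part of the actual Airflow tooling) for remaining usages might look like:

```python
from pathlib import Path

# Hypothetical helper: list workflow files that still reference the
# risky "pull_request_target" trigger anywhere in their text.
def workflows_using_pull_request_target(workflows_dir: str) -> list[str]:
    return sorted(
        str(path)
        for path in Path(workflows_dir).glob("*.yml")
        if "pull_request_target" in path.read_text()
    )
```

Running something like this against `.github/workflows` before and after the backport would confirm that no workflow still uses the trigger.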

@potiuk potiuk force-pushed the apply-workflow-run-removal branch from c6b7c5a to 1eed775 Compare January 11, 2025 19:17
@potiuk potiuk force-pushed the apply-workflow-run-removal branch from 1eed775 to 613c3e6 Compare January 11, 2025 20:27

@gopidesupavan gopidesupavan left a comment


Thanks Jarek, nice one. Seems some provider changes were pulled in - I think that's okay?

@potiuk potiuk force-pushed the apply-workflow-run-removal branch from 613c3e6 to 58ccdd7 Compare January 11, 2025 21:30

potiuk commented Jan 11, 2025

Yeah - some test failure fixes needed :(

@potiuk potiuk force-pushed the apply-workflow-run-removal branch from 58ccdd7 to f803a6a Compare January 12, 2025 08:14
This is a bulk change that synchronizes dev/ci scripts for v2-10-test
branch with main #45266 - including follow-ups.

Rather than cherry-picking relevant PRs, this one gets the
latest version of the scripts from main and updates the branch with
some changes to adapt them to v2-10-test (such as bringing back
python 3.8 support, removing some providers checks after the
bulk move of providers and making sure all tests are passing).

This is far easier than cherry-picking the changes, because for
the v2-10-test we stopped cherry-picking CI changes which was
deemed unnecessary (we used to do it for all previous branches)
but this made it far more difficult (if not impossible) to
cherry-pick individual changes.

Fortunately, the CI scripts are maintained in the way that their
latest version **should** in principle work for a v2-* branch and
hopefully after just a few adjustments we should be able to
synchronize the changes from main by updating all relevant
CI/DEV scripts, dockerfile images, workflows, pre-commits etc.

Add actions in codeql workflows to scan github workflow actions (#45534)

* add actions in codeql workflows to scan github workflow actions


CodeQL scanning can run always on all code (#45541)

The CodeQL scanning is fast, and having a custom configuration
to select which scans should be run makes it unnecessarily
complex.

We can just run all CodeQL scans always.

This was suggested by the CodeQL actions scan itself.

Add explicit permissions for all workflow-run workflows (#45548)

Those workflows inherit permissions from the calling workflows,
but it's good to add explicit permissions to indicate what is
needed, and in case we also use the workflows for other purposes
in the future - default permissions for older repos might be
write, so it's best to be explicit about the permissions.

Found by CodeQL scanning
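The kind of finding described above can be approximated with a small, hypothetical lint (not the actual CodeQL query): a reusable workflow that never declares a top-level `permissions:` block falls back to inherited or repository-default permissions, which may be `write` for older repositories.

```python
# Hypothetical sketch: flag workflow YAML that lacks an explicit
# top-level "permissions:" key (top-level keys start at column 0).
def has_explicit_permissions(workflow_text: str) -> bool:
    return any(
        line.startswith("permissions:")
        for line in workflow_text.splitlines()
    )
```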

Remove contents: write permission from generate-constraints (#45558)

The write permission cannot be set for PRs from forks in the
call workflow - so we have to go back to implicit permissions
and handle explicit permissions passing a bit differently.

(cherry picked from commit ae32ebc)

Bump trove-classifiers from 2025.1.7.14 to 2025.1.10.15 (#45561)

Bumps [trove-classifiers](https://github.com/pypa/trove-classifiers) from 2025.1.7.14 to 2025.1.10.15.
- [Release notes](https://github.com/pypa/trove-classifiers/releases)
- [Commits](pypa/trove-classifiers@2025.1.7.14...2025.1.10.15)

---
updated-dependencies:
- dependency-name: trove-classifiers
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
(cherry picked from commit f3fd262)

Add optional --image-file-dir to store loaded files elsewhere (#45564)

While backporting the "pull_request_target" removal to the
v2-10-test branch it turned out that there is not enough disk
space on the public runner to load all 5 images and keep the
file dump at the same time in the same filesystem. This PR
allows choosing where the load/save files will be stored, and
in the GitHub runner environment we store the files in "/mnt",
which is a separate filesystem with 40GB free.

(cherry picked from commit 6628049)
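The disk-space motivation behind `--image-file-dir` can be sketched as follows. This is a hypothetical helper, not the actual breeze implementation, and assumes `/mnt` is the separate ~40GB filesystem available on public runners:

```python
import shutil

# Hypothetical: pick a directory for image save/load dumps based on
# available free space, preferring the separate "/mnt" filesystem on
# GitHub public runners so the root filesystem (which already holds
# the loaded images) is not exhausted.
def pick_image_file_dir(candidates=("/mnt", "/tmp"), needed_gb: float = 30) -> str:
    for path in candidates:
        try:
            free_gb = shutil.disk_usage(path).free / 1024**3
        except (FileNotFoundError, PermissionError):
            continue
        if free_gb >= needed_gb:
            return path
    # Fall back to the default temp location if nothing qualifies.
    return "/tmp"
```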

Fix --from-pr feature for image load and stabilize help

This is a follow-up after #45564 - it fixes `--from-pr` and
`--from-run` to work (they were failing with "file does not exist").

Also found out that gettempdir might return a different directory
depending on which is your designated tmp directory (for example
on MacOS this is a longer path in /var/.....) - so we have
to force the default during help generation to always return
"/tmp" so that the --help images do not change depending on which
system you are on and what your tmp directory is.
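The gettempdir pitfall above can be reproduced directly; the `GENERATING_HELP` flag below is a hypothetical stand-in for however breeze detects help generation:

```python
import os
import tempfile

# tempfile.gettempdir() is platform dependent: typically "/tmp" on
# Linux but a long per-user path under /var on macOS, so help output
# that embeds it as a default would differ between systems.
def default_image_file_dir() -> str:
    # Pin the default while help images are generated (hypothetical flag).
    if os.environ.get("GENERATING_HELP") == "true":
        return "/tmp"
    return tempfile.gettempdir()
```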
@potiuk potiuk force-pushed the apply-workflow-run-removal branch from f803a6a to b00fdff Compare January 12, 2025 11:03

potiuk commented Jan 12, 2025

All problems solved. I also applied the latest sphinx-airflow-theme version limit and compared the produced .whl files for airflow, and the differences are pretty much as expected:

diff ./old ./new -r

diff -r ./old/airflow/api/common/mark_tasks.py ./new/airflow/api/common/mark_tasks.py
413a414,417
>
>     # Mark all task instances of the dag run to success - except for teardown as they need to complete work.
>     normal_tasks = [task for task in dag.tasks if not task.is_teardown]
>
415c419
<     if commit:
---
>     if commit and len(normal_tasks) == len(dag.tasks):
418,419c422
<     # Mark all task instances of the dag run to success.
<     for task in dag.tasks:
---
>     for task in normal_tasks:
422c425
<         tasks=dag.tasks,
---
>         tasks=normal_tasks,
469,472d471
<     # Mark the dag run to failed.
<     if commit:
<         _set_dag_run_state(dag.dag_id, run_id, DagRunState.FAILED, session)
<
481c480
<     tis = session.scalars(
---
>     running_tis: list[TaskInstance] = session.scalars(
488c487
<     )
---
>     ).all()
490c489,490
<     task_ids_of_running_tis = [task_instance.task_id for task_instance in tis]
---
>     # Do not kill teardown tasks
>     task_ids_of_running_tis = [ti.task_id for ti in running_tis if not dag.task_dict[ti.task_id].is_teardown]
492c492
<     tasks = []
---
>     running_tasks = []
496c496
<             tasks.append(task)
---
>             running_tasks.append(task)
499c499
<     tis = session.scalars(
---
>     pending_tis: list[TaskInstance] = session.scalars(
512a513,515
>     # Do not skip teardown tasks
>     pending_normal_tis = [ti for ti in pending_tis if not dag.task_dict[ti.task_id].is_teardown]
>
514c517
<         for ti in tis:
---
>         for ti in pending_normal_tis:
517,518c520,525
<     return tis + set_state(
<         tasks=tasks,
---
>         # Mark the dag run to failed if there is no pending teardown (else this would not be scheduled later).
>         if not any(dag.task_dict[ti.task_id].is_teardown for ti in (running_tis + pending_tis)):
>             _set_dag_run_state(dag.dag_id, run_id, DagRunState.FAILED, session)
>
>     return pending_normal_tis + set_state(
>         tasks=running_tasks,
diff -r ./old/airflow/api_connexion/openapi/v1.yaml ./new/airflow/api_connexion/openapi/v1.yaml
5735a5736
>         format: path
diff -r ./old/airflow/cli/cli_config.py ./new/airflow/cli/cli_config.py
67c67
<         """Override error and use print_instead of print_usage."""
---
>         """Override error and use print_help instead of print_usage."""
diff -r ./old/airflow/datasets/metadata.py ./new/airflow/datasets/metadata.py
19a20
> import warnings
40a42,50
>         if isinstance(target, str):
>             warnings.warn(
>                 (
>                     "Accessing outlet_events using string is deprecated and will be removed in Airflow 3. "
>                     "Please use the Dataset or DatasetAlias object (renamed as Asset and AssetAlias in Airflow 3) directly"
>                 ),
>                 DeprecationWarning,
>                 stacklevel=2,
>             )
45a56,63
>             warnings.warn(
>                 (
>                     "Emitting dataset events using string is deprecated and will be removed in Airflow 3. "
>                     "Please use the Dataset object (renamed as Asset in Airflow 3) directly"
>                 ),
>                 DeprecationWarning,
>                 stacklevel=2,
>             )
diff -r ./old/airflow/executors/executor_loader.py ./new/airflow/executors/executor_loader.py
340c340
<         if engine.dialect.name == "sqlite":
---
>         if engine and engine.dialect.name == "sqlite":
diff -r ./old/airflow/git_version ./new/airflow/git_version
1c1
< .release:c083e456fa02c6cb32cdbe0c9ed3c3b2380beccd
\ No newline at end of file
---
> .release:a9fe36219cd537af06708c9ed2efba86e3449f81
\ No newline at end of file
diff -r ./old/airflow/models/baseoperator.py ./new/airflow/models/baseoperator.py
968d967
<         validate_key(task_id)
973a973,975
>
>         validate_key(self.task_id)
>
diff -r ./old/airflow/models/mappedoperator.py ./new/airflow/models/mappedoperator.py
832a833,834
>             op.downstream_task_ids = self.downstream_task_ids
>             op.upstream_task_ids = self.upstream_task_ids
diff -r ./old/airflow/models/skipmixin.py ./new/airflow/models/skipmixin.py
164,165d163
<         SkipMixin._set_state_to_skipped(dag_run, task_ids_list, session)
<         session.commit()
166a165,169
>         # The following could be applied only for non-mapped tasks
>         if map_index == -1:
>             SkipMixin._set_state_to_skipped(dag_run, task_ids_list, session)
>             session.commit()
>
179a183
>     @staticmethod
181d184
<         self,
diff -r ./old/airflow/models/taskinstance.py ./new/airflow/models/taskinstance.py
28a29
> import traceback
249c250
<         TaskInstance.save_to_db(ti=ti, session=session)
---
>         TaskInstance.save_to_db(ti=ti, session=session, refresh_dag=False)
1243c1244
<         TaskInstance.save_to_db(failure_context["ti"], session)
---
>         TaskInstance.save_to_db(task_instance, session)
3093a3095
>             self.log.error("Stacktrace: \n%s", "".join(traceback.format_stack()))
3396c3398,3402
<     def save_to_db(ti: TaskInstance | TaskInstancePydantic, session: Session = NEW_SESSION):
---
>     def save_to_db(
>         ti: TaskInstance | TaskInstancePydantic, session: Session = NEW_SESSION, refresh_dag: bool = True
>     ):
>         if refresh_dag and isinstance(ti, TaskInstance):
>             ti.get_dagrun().refresh_from_db()
diff -r ./old/airflow/providers_manager.py ./new/airflow/providers_manager.py
534d533
<     @provider_info_cache("hook_lineage_writers")
diff -r ./old/airflow/reproducible_build.yaml ./new/airflow/reproducible_build.yaml
1,2c1,2
< release-notes-hash: 0867869dba7304e7ead28dd0800c5c4b
< source-date-epoch: 1733822937
---
> release-notes-hash: 7be47e2ddbbe1bfbd0d3f572d2b7800a
> source-date-epoch: 1736532824
diff -r ./old/airflow/sensors/base.py ./new/airflow/sensors/base.py
109c109,112
<             TaskReschedule.try_number == try_number,
---
>             # If the first try's record was not saved due to the Exception occurred and the following
>             # transaction rollback, the next available attempt should be taken
>             # to prevent falling in the endless rescheduling
>             TaskReschedule.try_number >= try_number,
256c259
<             # first execution of the task, or the first execution after the task was cleared.)
---
>             # first execution of the task, or the first execution after the task was cleared).
diff -r ./old/airflow/ti_deps/deps/not_previously_skipped_dep.py ./new/airflow/ti_deps/deps/not_previously_skipped_dep.py
21a22
> from airflow.utils.db import LazySelectSequence
41d41
<             SkipMixin,
52,55c52,54
<             if isinstance(parent, SkipMixin):
<                 if parent.task_id not in finished_task_ids:
<                     # This can happen if the parent task has not yet run.
<                     continue
---
>             if parent.task_id not in finished_task_ids:
>                 # This can happen if the parent task has not yet run.
>                 continue
57c56,58
<                 prev_result = ti.xcom_pull(task_ids=parent.task_id, key=XCOM_SKIPMIXIN_KEY, session=session)
---
>             prev_result = ti.xcom_pull(
>                 task_ids=parent.task_id, key=XCOM_SKIPMIXIN_KEY, session=session, map_indexes=ti.map_index
>             )
59,61c60,61
<                 if prev_result is None:
<                     # This can happen if the parent task has not yet run.
<                     continue
---
>             if isinstance(prev_result, LazySelectSequence):
>                 prev_result = next(iter(prev_result))
63,75c63,65
<                 should_skip = False
<                 if (
<                     XCOM_SKIPMIXIN_FOLLOWED in prev_result
<                     and ti.task_id not in prev_result[XCOM_SKIPMIXIN_FOLLOWED]
<                 ):
<                     # Skip any tasks that are not in "followed"
<                     should_skip = True
<                 elif (
<                     XCOM_SKIPMIXIN_SKIPPED in prev_result
<                     and ti.task_id in prev_result[XCOM_SKIPMIXIN_SKIPPED]
<                 ):
<                     # Skip any tasks that are in "skipped"
<                     should_skip = True
---
>             if prev_result is None:
>                 # This can happen if the parent task has not yet run.
>                 continue
77,92c67,84
<                 if should_skip:
<                     # If the parent SkipMixin has run, and the XCom result stored indicates this
<                     # ti should be skipped, set ti.state to SKIPPED and fail the rule so that the
<                     # ti does not execute.
<                     if dep_context.wait_for_past_depends_before_skipping:
<                         past_depends_met = ti.xcom_pull(
<                             task_ids=ti.task_id, key=PAST_DEPENDS_MET, session=session, default=False
<                         )
<                         if not past_depends_met:
<                             yield self._failing_status(
<                                 reason=("Task should be skipped but the past depends are not met")
<                             )
<                             return
<                     ti.set_state(TaskInstanceState.SKIPPED, session)
<                     yield self._failing_status(
<                         reason=f"Skipping because of previous XCom result from parent task {parent.task_id}"
---
>             should_skip = False
>             if (
>                 XCOM_SKIPMIXIN_FOLLOWED in prev_result
>                 and ti.task_id not in prev_result[XCOM_SKIPMIXIN_FOLLOWED]
>             ):
>                 # Skip any tasks that are not in "followed"
>                 should_skip = True
>             elif XCOM_SKIPMIXIN_SKIPPED in prev_result and ti.task_id in prev_result[XCOM_SKIPMIXIN_SKIPPED]:
>                 # Skip any tasks that are in "skipped"
>                 should_skip = True
>
>             if should_skip:
>                 # If the parent SkipMixin has run, and the XCom result stored indicates this
>                 # ti should be skipped, set ti.state to SKIPPED and fail the rule so that the
>                 # ti does not execute.
>                 if dep_context.wait_for_past_depends_before_skipping:
>                     past_depends_met = ti.xcom_pull(
>                         task_ids=ti.task_id, key=PAST_DEPENDS_MET, session=session, default=False
94c86,95
<                     return
---
>                     if not past_depends_met:
>                         yield self._failing_status(
>                             reason="Task should be skipped but the past depends are not met"
>                         )
>                         return
>                 ti.set_state(TaskInstanceState.SKIPPED, session)
>                 yield self._failing_status(
>                     reason=f"Skipping because of previous XCom result from parent task {parent.task_id}"
>                 )
>                 return
diff -r ./old/airflow/ti_deps/deps/trigger_rule_dep.py ./new/airflow/ti_deps/deps/trigger_rule_dep.py
29a30
> from airflow.utils.task_group import MappedTaskGroup
66,67c67
<         :param ti: the ti that we want to calculate deps for
<         :param finished_tis: all the finished tasks of the dag_run
---
>         :param finished_upstreams: all the finished upstreams of the dag_run
145a146,158
>         def _iter_expansion_dependencies(task_group: MappedTaskGroup) -> Iterator[str]:
>             from airflow.models.mappedoperator import MappedOperator
>
>             if isinstance(ti.task, MappedOperator):
>                 for op in ti.task.iter_mapped_dependencies():
>                     yield op.task_id
>             if task_group and task_group.iter_mapped_task_groups():
>                 yield from (
>                     op.task_id
>                     for tg in task_group.iter_mapped_task_groups()
>                     for op in tg.iter_mapped_dependencies()
>                 )
>
158a172,178
>             if isinstance(ti.task.task_group, MappedTaskGroup):
>                 is_fast_triggered = ti.task.trigger_rule in (TR.ONE_SUCCESS, TR.ONE_FAILED, TR.ONE_DONE)
>                 if is_fast_triggered and upstream_id not in set(
>                     _iter_expansion_dependencies(task_group=ti.task.task_group)
>                 ):
>                     return None
>
220c240
<                     yield (TaskInstance.task_id == upstream_id)
---
>                     yield TaskInstance.task_id == upstream_id
240c260
<             Evaluate whether ``ti``'s trigger rule was met.
---
>             Evaluate whether ``ti``'s trigger rule was met as part of the setup constraint.
242,244c262
<             :param ti: Task instance to evaluate the trigger rule of.
<             :param dep_context: The current dependency context.
<             :param session: Database session.
---
>             :param relevant_setups: Relevant setups for the current task instance.
330,336c348
<             """
<             Evaluate whether ``ti``'s trigger rule was met.
<
<             :param ti: Task instance to evaluate the trigger rule of.
<             :param dep_context: The current dependency context.
<             :param session: Database session.
<             """
---
>             """Evaluate whether ``ti``'s trigger rule in direct relatives was met."""
436c448
<                             reason=("Task should be skipped but the past depends are not met")
---
>                             reason="Task should be skipped but the past depends are not met"
diff -r ./old/airflow/utils/context.py ./new/airflow/utils/context.py
361a362
>         "conf": [],
diff -r ./old/airflow/utils/sqlalchemy.py ./new/airflow/utils/sqlalchemy.py
112a113,114
>     should_evaluate_none = True
>
Only in ./old/airflow/www/static/dist: clusterActivity.2ecf4759427048c07368.js
Only in ./old/airflow/www/static/dist: clusterActivity.2ecf4759427048c07368.js.LICENSE.txt
Only in ./new/airflow/www/static/dist: clusterActivity.fff5d3527b4c5eedb340.js
Only in ./new/airflow/www/static/dist: clusterActivity.fff5d3527b4c5eedb340.js.LICENSE.txt
Only in ./old/airflow/www/static/dist: dags.2b495ee52ff9e3b5160e.css
Only in ./old/airflow/www/static/dist: dags.2b495ee52ff9e3b5160e.js
Only in ./old/airflow/www/static/dist: dags.2b495ee52ff9e3b5160e.js.LICENSE.txt
Only in ./new/airflow/www/static/dist: dags.4cb7043334f0e3173c4c.css
Only in ./new/airflow/www/static/dist: dags.4cb7043334f0e3173c4c.js
Only in ./new/airflow/www/static/dist: dags.4cb7043334f0e3173c4c.js.LICENSE.txt
Only in ./new/airflow/www/static/dist: datasets.0bc892295c97e7bfe58d.js
Only in ./new/airflow/www/static/dist: datasets.0bc892295c97e7bfe58d.js.LICENSE.txt
Only in ./old/airflow/www/static/dist: datasets.9af23983e71a1ebcbd80.js
Only in ./old/airflow/www/static/dist: datasets.9af23983e71a1ebcbd80.js.LICENSE.txt
Only in ./old/airflow/www/static/dist: grid.70939cd423edfd7d6e08.js
Only in ./old/airflow/www/static/dist: grid.70939cd423edfd7d6e08.js.LICENSE.txt
Only in ./new/airflow/www/static/dist: grid.9dfc288c631a1f964c7a.js
Only in ./new/airflow/www/static/dist: grid.9dfc288c631a1f964c7a.js.LICENSE.txt
Only in ./old/airflow/www/static/dist: main.8461584ab30f513901c2.css
Only in ./old/airflow/www/static/dist: main.8461584ab30f513901c2.js
Only in ./old/airflow/www/static/dist: main.8461584ab30f513901c2.js.LICENSE.txt
Only in ./new/airflow/www/static/dist: main.fb487bd34c7cd20f02bc.css
Only in ./new/airflow/www/static/dist: main.fb487bd34c7cd20f02bc.js
Only in ./new/airflow/www/static/dist: main.fb487bd34c7cd20f02bc.js.LICENSE.txt
diff -r ./old/airflow/www/static/dist/manifest.json ./new/airflow/www/static/dist/manifest.json
9,10c9,10
<   "dags.css": "dags.2b495ee52ff9e3b5160e.css",
<   "dags.js": "dags.2b495ee52ff9e3b5160e.js",
---
>   "dags.css": "dags.4cb7043334f0e3173c4c.css",
>   "dags.js": "dags.4cb7043334f0e3173c4c.js",
17,18c17,18
<   "main.css": "main.8461584ab30f513901c2.css",
<   "main.js": "main.8461584ab30f513901c2.js",
---
>   "main.css": "main.fb487bd34c7cd20f02bc.css",
>   "main.js": "main.fb487bd34c7cd20f02bc.js",
25c25
<   "taskInstances.js": "taskInstances.a65435400ad9c5e928c1.js",
---
>   "taskInstances.js": "taskInstances.7a19b383b09d370fe8a0.js",
28,31c28,31
<   "grid.js": "grid.70939cd423edfd7d6e08.js",
<   "clusterActivity.js": "clusterActivity.2ecf4759427048c07368.js",
<   "datasets.js": "datasets.9af23983e71a1ebcbd80.js",
<   "trigger.js": "trigger.d972e04a6a32368ffc7e.js",
---
>   "grid.js": "grid.9dfc288c631a1f964c7a.js",
>   "clusterActivity.js": "clusterActivity.fff5d3527b4c5eedb340.js",
>   "datasets.js": "datasets.0bc892295c97e7bfe58d.js",
>   "trigger.js": "trigger.cef24b4966646f363d5a.js",
Only in ./new/airflow/www/static/dist: taskInstances.7a19b383b09d370fe8a0.js
Only in ./new/airflow/www/static/dist: taskInstances.7a19b383b09d370fe8a0.js.LICENSE.txt
Only in ./old/airflow/www/static/dist: taskInstances.a65435400ad9c5e928c1.js
Only in ./old/airflow/www/static/dist: taskInstances.a65435400ad9c5e928c1.js.LICENSE.txt
Only in ./new/airflow/www/static/dist: trigger.cef24b4966646f363d5a.js
Only in ./new/airflow/www/static/dist: trigger.cef24b4966646f363d5a.js.LICENSE.txt
Only in ./old/airflow/www/static/dist: trigger.d972e04a6a32368ffc7e.js
Only in ./old/airflow/www/static/dist: trigger.d972e04a6a32368ffc7e.js.LICENSE.txt
diff -r ./old/airflow/www/static/js/api/useTaskXcom.ts ./new/airflow/www/static/js/api/useTaskXcom.ts
60,67c60,69
<     () =>
<       axios.get<AxiosResponse, API.XCom>(
<         getMetaValue("task_xcom_entry_api")
<           .replace("_DAG_RUN_ID_", dagRunId)
<           .replace("_TASK_ID_", taskId)
<           .replace("_XCOM_KEY_", xcomKey),
<         { params: { map_index: mapIndex, stringify: false } }
<       ),
---
>     () => {
>       const taskXcomEntryApiUrl = getMetaValue("task_xcom_entry_api")
>         .replace("_DAG_RUN_ID_", dagRunId)
>         .replace("_TASK_ID_", taskId)
>         .replace("_XCOM_KEY_", encodeURIComponent(xcomKey));
>
>       return axios.get<AxiosResponse, API.XCom>(taskXcomEntryApiUrl, {
>         params: { map_index: mapIndex, stringify: false },
>       });
>     },
diff -r ./old/airflow/www/static/js/dag/details/taskInstance/ExtraLinks.tsx ./new/airflow/www/static/js/dag/details/taskInstance/ExtraLinks.tsx
60,61c60,67
<     const urlRegex = /^(https?:)/i;
<     return urlRegex.test(url);
---
>     const path = new URL(url, "http://localhost");
>     // Allow Absolute/Relative URL and prevent javascript:() from executing when passed as path.
>     // Example - `javascript:alert("Hi");`. Protocol for absolute and relative urls will either be `http:`/`https:`.
>     // Where as for javascript it will be `javascript:`.
>     if (path.protocol === "http:" || path.protocol === "https:") {
>       return true; // Absolute/Relative URLs are allowed
>     }
>     return false;
diff -r ./old/airflow/www/static/js/main.js ./new/airflow/www/static/js/main.js
288a289,292
>
>   // Turn off autocomplete for login form
>   $("#username:input")[0].autocomplete = "off";
>   $("#password:input")[0].autocomplete = "off";
diff -r ./old/airflow/www/static/js/trigger.js ./new/airflow/www/static/js/trigger.js
62,63d61
<       } else if (elements[i].value.length === 0) {
<         params[keyName] = null;
83a82,83
>       } else if (elements[i].value.length === 0) {
>         params[keyName] = null;
diff -r ./old/airflow/www/templates/airflow/trigger.html ./new/airflow/www/templates/airflow/trigger.html
123c123,125
<         {{- form_details.value | tojson() -}}
---
>         {%- if form_details.value %}
>           {{- form_details.value | tojson() -}}
>         {% endif -%}
diff -r ./old/airflow/www/views.py ./new/airflow/www/views.py
5358a5359
>         "rendered_map_index",
diff -r ./old/airflow/www/yarn.lock ./new/airflow/www/yarn.lock
9021,9023c9021,9023
<   version "3.3.7"
<   resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.3.7.tgz#d0c301a691bc8d54efa0a2226ccf3fe2fd656bd8"
<   integrity sha512-eSRppjcPIatRIMC1U6UngP8XFcz8MQWGQdt1MTBQ7NaAmvXDfvNxbvWV3x2y6CdEUciCSsDHDQZbhYaB8QEo2g==
---
>   version "3.3.8"
>   resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.3.8.tgz#b1be3030bee36aaff18bacb375e5cce521684baf"
>   integrity sha512-WNLf5Sd8oZxOm+TzppcYk8gVOgP+l58xNy58D0nbUnOxOWRWvlcCV4kUF7ltmI6PsrLl/BgKEyS4mqsGChFN0w==
diff -r ./old/apache_airflow-2.10.4.dist-info/METADATA ./new/apache_airflow-2.10.4.dist-info/METADATA
1c1
< Metadata-Version: 2.3
---
> Metadata-Version: 2.4
14c14,17
< Project-URL: Twitter, https://twitter.com/ApacheAirflow
---
> Project-URL: X, https://x.com/ApacheAirflow
> Project-URL: LinkedIn, https://www.linkedin.com/company/apache-airflow/
> Project-URL: Mastodon, https://fosstodon.org/@airflow
> Project-URL: Bluesky, https://bsky.app/profile/apache-airflow.bsky.social
17a21,39
> License-File: 3rd-party-licenses/LICENSE-bootstrap.txt
> License-File: 3rd-party-licenses/LICENSE-bootstrap3-typeahead.txt
> License-File: 3rd-party-licenses/LICENSE-d3-shape.txt
> License-File: 3rd-party-licenses/LICENSE-d3-tip.txt
> License-File: 3rd-party-licenses/LICENSE-d3js.txt
> License-File: 3rd-party-licenses/LICENSE-dagre-d3.txt
> License-File: 3rd-party-licenses/LICENSE-datatables.txt
> License-File: 3rd-party-licenses/LICENSE-elasticmock.txt
> License-File: 3rd-party-licenses/LICENSE-eonasdan-bootstrap-datetimepicker.txt
> License-File: 3rd-party-licenses/LICENSE-flask-kerberos.txt
> License-File: 3rd-party-licenses/LICENSE-hue.txt
> License-File: 3rd-party-licenses/LICENSE-jqclock.txt
> License-File: 3rd-party-licenses/LICENSE-jquery.txt
> License-File: 3rd-party-licenses/LICENSE-moment.txt
> License-File: 3rd-party-licenses/LICENSE-normalize.txt
> License-File: 3rd-party-licenses/LICENSE-pytest-capture-warnings.txt
> License-File: 3rd-party-licenses/LICENSE-reproducible.txt
> License-File: 3rd-party-licenses/LICENSES-ui.txt
> License-File: LICENSE
610c632
< Requires-Dist: sphinx-airflow-theme>=0.0.12; extra == 'devel-ci'
---
> Requires-Dist: sphinx-airflow-theme>=0.0.12,<0.1.0; extra == 'devel-ci'
diff -r ./old/apache_airflow-2.10.4.dist-info/RECORD ./new/apache_airflow-2.10.4.dist-info/RECORD
8c8
< airflow/git_version,sha256=6E3ZBkweW8VFYORN0DNUo8SZM3cAFT0_U0GDmbh1IqE,49
---
> airflow/git_version,sha256=rEn9yIg5n9gUvutSQdG_ngYFd65U0CLp7im84nC_Myo,49
14c14
< airflow/providers_manager.py,sha256=YOfMooQ4scNetaIF0QjutJF8E-hUnRIazU4EGalU9G4,60758
---
> airflow/providers_manager.py,sha256=MvLoakmAbgmrx1uhd-6FFwgW3HCVzDHRrPeakyp4aCU,60709
16c16
< airflow/reproducible_build.yaml,sha256=ZRd7HP66q8TdyVowicdDmugQlzopFTwZcRCYc6m_RdY,83
---
> airflow/reproducible_build.yaml,sha256=x1Y9ilxTlN-x0qa3BFSKRVoTPjL9N7U5rOgykhIDjY4,83
41c41
< airflow/api/common/mark_tasks.py,sha256=SWbzRsKPbt7GUjz1Gcy3bbSTgSfSBjEzRMNyfdDhjlQ,21676
---
> airflow/api/common/mark_tasks.py,sha256=fnk9ufUA6Z8Dh-o_-CuipyD3q18wKKAg8l8Pb0UpLgA,22322
85c85
< airflow/api_connexion/openapi/v1.yaml,sha256=9VycODYcWWFeIxs7ORfgoUwbRHZfT63nPruAalePP0I,185018
---
> airflow/api_connexion/openapi/v1.yaml,sha256=jNu8ahNGZQ3Qcpy2gykobmLC2oqUb-Ks0dKt5abMv1Q,185039
145c145
< airflow/cli/cli_config.py,sha256=GZr5cRH2QriI6igFDDqcPFCtSVyc5MbMukSO2D1arzU,74368
---
> airflow/cli/cli_config.py,sha256=dgktdDrK6ZKOSYxp-7P-KveRlstElP1ffJmeIKJIRF0,74373
198c198
< airflow/datasets/metadata.py,sha256=yUj3_yKabKrthkRA35oQHlmQk0Of0HdELY3jpFkBWrE,1503
---
> airflow/datasets/metadata.py,sha256=rWgsJn1m6AGE9K1fgEUWt1CBe5gPsSa90cHyYK3S0Gk,2276
282c282
< airflow/executors/executor_loader.py,sha256=YrbbpCMPOdzc6cNCxWS_3CD1YlZzsmCEPI7KFBZJBKw,15858
---
> airflow/executors/executor_loader.py,sha256=ZjuG9Gt5kXGSu1C7wuDEChjx5baPG_j6rvt2ew6pQNA,15869
497c497
< airflow/models/baseoperator.py,sha256=MRHmOJYNOFFQrvRxMwSPclXEiOLhgKco6bLBSHLvXRA,84364
---
> airflow/models/baseoperator.py,sha256=oJTH5qcr1BsqxzbXjGyQQhT8eJeobh5iZrTlCM98k4c,84371
513c513
< airflow/models/mappedoperator.py,sha256=qUjnblFhnJkDDUJ6TCGVGAIRy_TvTb-b7peeylEE6TU,36299
---
> airflow/models/mappedoperator.py,sha256=XWQM-eLsm-NLC1yC9vcaJY0NwY-8CY72ounznSRXPbA,36419
519c519
< airflow/models/skipmixin.py,sha256=rg0ME6-44vZabNCroMY5NfxKVyIVOIbhbWgYgIMlIu4,11195
---
> airflow/models/skipmixin.py,sha256=e9mSCLo5tCJRwKr9Qhi0tdek-0hFsr3PoeIougauHnU,11303
522c522
< airflow/models/taskinstance.py,sha256=wS61gKGC15ybUAUaBHkgBeag3Nfoq9X14CXNAcqTfVM,164929
---
> airflow/models/taskinstance.py,sha256=ZanEQ7x-bTR2bs60LzYEJWgmNhdbL69wZnEv3jfUkp8,165182
561c561
< airflow/sensors/base.py,sha256=YjzOnEWNiPIkFpecL33Hq_B8B3SBpwxuGtpztb6vVpM,19422
---
> airflow/sensors/base.py,sha256=6OKk8O9P3G2eevTESzErCHqGOP8KGfGF5POM6UgudVc,19664
618c618
< airflow/ti_deps/deps/not_previously_skipped_dep.py,sha256=naCUyduB7LovCt8KaM2xv2cOeztzvzSOnGXZKI6nj60,3905
---
> airflow/ti_deps/deps/not_previously_skipped_dep.py,sha256=rczgk5AvMDX9Hmq2nKu5tLa4Zjj43xVMX-8wKz0z2wE,3853
625c625
< airflow/ti_deps/deps/trigger_rule_dep.py,sha256=wN_QwBATJFoJbGWxEKhWprU-VdLhmwUIE6erIFmEuRA,27306
---
> airflow/ti_deps/deps/trigger_rule_dep.py,sha256=4fNNN3lW86tLqvGD22B0CxPNHw33xmjAkgZInZQUcsU,28001
652c652
< airflow/utils/context.py,sha256=YWHQchFhHxJe5Qi8tXzdlM5c94q04c9AGAErmE2gSmo,16795
---
> airflow/utils/context.py,sha256=v59xFD7N1QQiV52SoPmTLkMmR1_sJhj-5fb2rZEG8Y4,16815
693c693
< airflow/utils/sqlalchemy.py,sha256=VSjv1HclQSsC5oajvBMVJ95hqkWVIzXvOkKoPu-H-XQ,18499
---
> airflow/utils/sqlalchemy.py,sha256=BvF8UWAYTe3ZdUmBouXrMZ8IRBNeaD5sDFdtEFDvDOg,18532
749c749
< airflow/www/views.py,sha256=psdMP4jCt7VAipAP6DdAvhkdbybUCR89ZlGc0Q-FcvM,226234
---
> airflow/www/views.py,sha256=3Vuikb3pvD8R_xURIqcmxNGhpZHcD_F70ILGt_dVCFk,226264
752c752
< airflow/www/yarn.lock,sha256=xqaLeeDPyrio6vkva3mejgYeIEYUE8fLAm5j1QrfZDY,558144
---
> airflow/www/yarn.lock,sha256=tre3GeiFoWlf-eBw91q3cjwG465l5TXnzcVezBlIrB4,558144
801,802c801,802
< airflow/www/static/dist/clusterActivity.2ecf4759427048c07368.js,sha256=Y1qL_8VihWr3X0H97W8Mye1oNa-_JmE5lCjdlpZKVMY,2998266
< airflow/www/static/dist/clusterActivity.2ecf4759427048c07368.js.LICENSE.txt,sha256=8IF0e7cOf2GdyUFCB0faQVO0wWE3K69GmL9yWX_ouLw,3335
---
> airflow/www/static/dist/clusterActivity.fff5d3527b4c5eedb340.js,sha256=j1FzBv7oRMbZ4rYil1m1vLjo2tT6D-jYwk2r7Zi7Tn4,2998266
> airflow/www/static/dist/clusterActivity.fff5d3527b4c5eedb340.js.LICENSE.txt,sha256=8IF0e7cOf2GdyUFCB0faQVO0wWE3K69GmL9yWX_ouLw,3335
822,826c822,826
< airflow/www/static/dist/dags.2b495ee52ff9e3b5160e.css,sha256=K0nVsWcO_k4EvzAkoH6KBbouNA6_wymJXYhMdwQSSU0,2728
< airflow/www/static/dist/dags.2b495ee52ff9e3b5160e.js,sha256=INoxfCJ-XrElFrn9EuKJ9wDKWNa_F21xzCoU0JaC8oA,94990
< airflow/www/static/dist/dags.2b495ee52ff9e3b5160e.js.LICENSE.txt,sha256=AXbnnahJ1YVvF7cQaDFWtahoDOz70YRel1q39jKviR8,1384
< airflow/www/static/dist/datasets.9af23983e71a1ebcbd80.js,sha256=mOWNxUu98wL-ZbOzbgIWDiFDMlkaxC6HEPW-6hsPkBg,2508113
< airflow/www/static/dist/datasets.9af23983e71a1ebcbd80.js.LICENSE.txt,sha256=SmF_cYmqkr47iujpTjFXSkaqdVDp9pTuQDbPN-GKUZc,4280
---
> airflow/www/static/dist/dags.4cb7043334f0e3173c4c.css,sha256=K0nVsWcO_k4EvzAkoH6KBbouNA6_wymJXYhMdwQSSU0,2728
> airflow/www/static/dist/dags.4cb7043334f0e3173c4c.js,sha256=ocPyMPpTONv5Jm0WCW_CXt3ZwQhnSOR9131N33XLEtg,95076
> airflow/www/static/dist/dags.4cb7043334f0e3173c4c.js.LICENSE.txt,sha256=AXbnnahJ1YVvF7cQaDFWtahoDOz70YRel1q39jKviR8,1384
> airflow/www/static/dist/datasets.0bc892295c97e7bfe58d.js,sha256=g-qNenzwqj9S0s49SkrH9r4-4wl3HXYUxvYgiiefN-A,2508113
> airflow/www/static/dist/datasets.0bc892295c97e7bfe58d.js.LICENSE.txt,sha256=SmF_cYmqkr47iujpTjFXSkaqdVDp9pTuQDbPN-GKUZc,4280
831,832c831,832
< airflow/www/static/dist/grid.70939cd423edfd7d6e08.js,sha256=S_M57tluGp6L_9QDNwR78XCXNss_-eD1gJ2tPGYDEb8,3967483
< airflow/www/static/dist/grid.70939cd423edfd7d6e08.js.LICENSE.txt,sha256=sQtXf7GfCo89PP-yECRigR0zh4q51Ul0rglmaW4nDqs,5245
---
> airflow/www/static/dist/grid.9dfc288c631a1f964c7a.js,sha256=hT5Qwi8bzeMUKcuOKpdLHYVA9MUloqLficGjzpeGiwg,3967582
> airflow/www/static/dist/grid.9dfc288c631a1f964c7a.js.LICENSE.txt,sha256=sQtXf7GfCo89PP-yECRigR0zh4q51Ul0rglmaW4nDqs,5245
843,846c843,846
< airflow/www/static/dist/main.8461584ab30f513901c2.css,sha256=oXiTMwdpRp556KBwH0NCOO_L5H4OIt5EapaujfpXqws,7304
< airflow/www/static/dist/main.8461584ab30f513901c2.js,sha256=Jtgg3qTAAwOpG-EbaDVeqJL2bZIjA-uDF0ZAg4yhq_M,5366
< airflow/www/static/dist/main.8461584ab30f513901c2.js.LICENSE.txt,sha256=FX9Q5lmXWsdQeuAN5eIQlag1BKUFTx3rpUftXrj5Et4,809
< airflow/www/static/dist/manifest.json,sha256=PF2FtfzT9gRPoEwpqdOW_BXmTgUDGjb5vlwtVg3bljI,4419
---
> airflow/www/static/dist/main.fb487bd34c7cd20f02bc.css,sha256=oXiTMwdpRp556KBwH0NCOO_L5H4OIt5EapaujfpXqws,7304
> airflow/www/static/dist/main.fb487bd34c7cd20f02bc.js,sha256=ZjQdt94DGbnkMxJ-EssgDrlNe3KC0SORzgagX85uoVs,5452
> airflow/www/static/dist/main.fb487bd34c7cd20f02bc.js.LICENSE.txt,sha256=FX9Q5lmXWsdQeuAN5eIQlag1BKUFTx3rpUftXrj5Et4,809
> airflow/www/static/dist/manifest.json,sha256=NLyXgQkYDEKMEH8xezZTiFF6w6hMmoFD0s6RBmgMarw,4419
859,860c859,860
< airflow/www/static/dist/taskInstances.a65435400ad9c5e928c1.js,sha256=iM6DYjtsw9vuqc28swtZSJgwn_6zigt1kSLz1j0SSm8,86420
< airflow/www/static/dist/taskInstances.a65435400ad9c5e928c1.js.LICENSE.txt,sha256=AXbnnahJ1YVvF7cQaDFWtahoDOz70YRel1q39jKviR8,1384
---
> airflow/www/static/dist/taskInstances.7a19b383b09d370fe8a0.js,sha256=OdY9NgfXl7nVrAwvcjMlqHnXF2xMotXhiJTOP8aybUU,86506
> airflow/www/static/dist/taskInstances.7a19b383b09d370fe8a0.js.LICENSE.txt,sha256=AXbnnahJ1YVvF7cQaDFWtahoDOz70YRel1q39jKviR8,1384
865,866c865,866
< airflow/www/static/dist/trigger.d972e04a6a32368ffc7e.js,sha256=wvnRSPCc-YLqqcq4lN6xfyC9D8LguDzxd6MDy4RGjAs,4215
< airflow/www/static/dist/trigger.d972e04a6a32368ffc7e.js.LICENSE.txt,sha256=FX9Q5lmXWsdQeuAN5eIQlag1BKUFTx3rpUftXrj5Et4,809
---
> airflow/www/static/dist/trigger.cef24b4966646f363d5a.js,sha256=am6cJ3g7X95X5ePUM4tEkdie7fnUdHR1XCKh9Vo2HDo,4207
> airflow/www/static/dist/trigger.cef24b4966646f363d5a.js.LICENSE.txt,sha256=FX9Q5lmXWsdQeuAN5eIQlag1BKUFTx3rpUftXrj5Et4,809
908c908
< airflow/www/static/js/main.js,sha256=Q2sJr4Hm8p0kmJXRx2yeprSb92Q0tt5TSQvTnJyWpEI,7953
---
> airflow/www/static/js/main.js,sha256=QhBmPQb391AItfUnuPVVv0NTZ_09AGTRH-kBsKsPInQ,8092
914c914
< airflow/www/static/js/trigger.js,sha256=mtoIGCTm5t5DOkrS1wYMfRhf4Sk_jlbFGlz_b2jECjQ,9730
---
> airflow/www/static/js/trigger.js,sha256=GPz9Me2-MpFdriQ7-BOaHhky7c4K_dBiSdcc6rumhHI,9730
955c955
< airflow/www/static/js/api/useTaskXcom.ts,sha256=TEhpgznfJpjsGX2mhYXwq1zbc7hNatu552-R3ijcVBE,2270
---
> airflow/www/static/js/api/useTaskXcom.ts,sha256=UHwS1RUmrdO4YvZFlciZhovbbn76ABfkXRYag4Gbq20,2347
1055c1055
< airflow/www/static/js/dag/details/taskInstance/ExtraLinks.tsx,sha256=GMha9M50_25hpOsOrIChwd2Hmy1HLhKmuvn2up3bWME,2256
---
> airflow/www/static/js/dag/details/taskInstance/ExtraLinks.tsx,sha256=jtmJ_8CWZjMdbHnoZ2ThHOkY_JGmPyDjuGP_kzvqMYM,2659
1142c1142
< airflow/www/templates/airflow/trigger.html,sha256=zM9o5RMnv-L5f6B7-Kqca_nKFVC08wo1pMb_T3fPB0Q,17073
---
> airflow/www/templates/airflow/trigger.html,sha256=bOntLWS6Itsx27H9NNKT4IUwxQPNih2W6aRcYYemgQE,17133
1161,1162c1161,1162
< apache_airflow-2.10.4.dist-info/METADATA,sha256=eu4fno5zrcRyup713uEJ0ykjC9UHcRiY1QDuFAX1ANM,43527
< apache_airflow-2.10.4.dist-info/WHEEL,sha256=C2FUgwZgiLbznR-k0b_5k3Ai_1aASOXDss3lzCUsUug,87
---
> apache_airflow-2.10.4.dist-info/METADATA,sha256=vMGaVysA0MXf5ftvq2Qt72rDbBqm73bpDBPVGWgbc90,44764
> apache_airflow-2.10.4.dist-info/WHEEL,sha256=qtCwoSJWgHk21S1Kb4ihdzI2rlJ1ZKaIurTj_ngOhyQ,87
diff -r ./old/apache_airflow-2.10.4.dist-info/WHEEL ./new/apache_airflow-2.10.4.dist-info/WHEEL
2c2
< Generator: hatchling 1.26.3
---
> Generator: hatchling 1.27.0
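For context, a listing like the one above is typically produced by unpacking the old and new wheels into sibling directories and running a recursive diff. A minimal sketch (the `./old`/`./new` directory names mirror the diff headers above; the file contents here are illustrative stand-ins, not the real wheel contents):

```shell
# Illustrative reproduction of the comparison format shown above:
# unpack each wheel into its own tree, then diff the trees recursively.
mkdir -p old/apache_airflow-2.10.4.dist-info new/apache_airflow-2.10.4.dist-info
echo "Generator: hatchling 1.26.3" > old/apache_airflow-2.10.4.dist-info/WHEEL
echo "Generator: hatchling 1.27.0" > new/apache_airflow-2.10.4.dist-info/WHEEL
# diff exits non-zero when trees differ, so guard it in scripts
diff -r ./old ./new || true
```

In practice the trees would come from `unzip -q <wheel> -d old` / `unzip -q <wheel> -d new`; `diff -r` then reports per-file `NcM` hunks exactly as in the listing.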

@potiuk potiuk merged commit c2311e7 into v2-10-test Jan 12, 2025
115 checks passed
@potiuk potiuk deleted the apply-workflow-run-removal branch January 12, 2025 12:01
@utkarsharma2 utkarsharma2 added the changelog:skip Changes that should be skipped from the changelog (CI, tests, etc..) label Jan 28, 2025
@utkarsharma2 utkarsharma2 added this to the Airflow 2.10.5 milestone Jan 28, 2025