feat(airflow): Update docker.io/apache/airflow Docker tag to v2.10.5 #507
This PR contains the following updates:
docker.io/apache/airflow: ``2.8.2-python3.10`` -> ``2.10.5-python3.10``
Release Notes
apache/airflow (docker.io/apache/airflow)
v2.10.5
Compare Source
Significant Changes
^^^^^^^^^^^^^^^^^^^
Ensure teardown tasks are executed when DAG run is set to failed (#45530)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Previously, when a DAG run was manually set to "failed" or "success", that terminal state was applied to all of its tasks.
This left a gap when setup and teardown tasks were defined: if a teardown task was used to clean up infrastructure
or other resources, it was skipped as well, and resources could stay allocated.
Now, if setup tasks have already executed and the DAG run is manually set to "failed" or "success", the teardown
tasks are executed. Teardown tasks are still skipped if their setup was also skipped.
As a side effect, this means that if the DAG contains teardown tasks, manually marking the DAG run as "failed" or "success"
needs to keep it in the running state so that the teardown tasks can be scheduled. They would not be scheduled
if the DAG run were set directly to "failed" or "success".
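A minimal sketch of a DAG that benefits from this change, assuming the ``@setup``/``@teardown`` TaskFlow decorators; task names and bodies are illustrative only:

```python
from airflow.decorators import dag, setup, task, teardown


@dag(schedule=None)
def cleanup_example():
    @setup
    def create_cluster():
        print("allocating infrastructure")

    @task
    def do_work():
        print("using the infrastructure")

    @teardown
    def delete_cluster():
        # With 2.10.5, this still runs when the DAG run is manually
        # marked "failed" or "success" after create_cluster executed.
        print("releasing infrastructure")

    create_cluster() >> do_work() >> delete_cluster()


cleanup_example()
```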
Bug Fixes
"""""""""
- ``trigger_rule=TriggerRule.ALWAYS`` in a task-generated mapping within bare tasks (#44751)
- (``ONE_DONE``) in a mapped task group (#44937)
- ``FileTaskHandler`` only read from default executor (#46000)
- ``skip_if`` and ``run_if`` decorators before TaskFlow virtualenv tasks are run (#41832) (#45680)
- ``rendered_map_index`` (#45109) (#45122)
- ``max_form_parts``, ``max_form_memory_size`` (#46243) (#45749)
- ``execute`` safeguard mechanism (#44646) (#46280)
Miscellaneous
"""""""""""""
- ``conf`` from Task Context (#44993)
v2.10.4
Compare Source
Significant Changes
^^^^^^^^^^^^^^^^^^^
``TaskInstance`` ``priority_weight`` is capped in 32-bit signed integer ranges (#43611)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Some database engines are limited to 32-bit integer values. As some users reported errors where the
weight rolled over to negative values, we decided to cap the value to the 32-bit integer range. Even
though Python internally supports smaller and larger 64-bit values, ``priority_weight``
is capped and only stores values from -2147483648 to 2147483647.
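Illustratively (this is not the Airflow source, just the arithmetic of the cap):

```python
# Signed 32-bit bounds used for the cap.
INT32_MIN, INT32_MAX = -(2**31), 2**31 - 1  # -2147483648, 2147483647


def capped_priority_weight(value: int) -> int:
    """Clamp a computed priority weight into the storable range."""
    return max(INT32_MIN, min(INT32_MAX, value))


assert capped_priority_weight(10**12) == 2147483647
assert capped_priority_weight(-(10**12)) == -2147483648
assert capped_priority_weight(42) == 42
```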
Bug Fixes
^^^^^^^^^
trigger_rule="always"
in a dynamic mapped task (#43810)trigger_rule=TriggerRule.ALWAYS
in a task-generated mapping within bare tasks (#44751)Doc Only Changes
""""""""""""""""
Miscellaneous
"""""""""""""
v2.10.3
Compare Source
Significant Changes
^^^^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes
"""""""""
- ``stringified`` objects to UI via xcom if pickling is active (#42388) (#42486)
- ``selectinload`` instead of ``joinedload`` (#40487) (#42351)
- ``TrySelector`` for Mapped Tasks in Logs and Details Grid Panel (#43566)
- ``scheduler_loop_duration`` (#42886) (#43544)
Miscellaneous
"""""""""""""
- ``dompurify`` from 2.2.9 to 2.5.6 in /airflow/www (#42263) (#42270)
- 4.5.2 (#43309) (#43318)
Doc Only Changes
""""""""""""""""
v2.10.2
Compare Source
Significant Changes
^^^^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes
"""""""""
- ``renderedTemplates`` as keys to skip ``camelCasing`` (#42206) (#42208)
- ``camelcase`` xcom entries (#42182) (#42187)
Miscellaneous
"""""""""""""
- 0.2.4 as it breaks our integration (#42101)
- ``LibCST`` (#42089)
- ``--tree`` flag for ``tasks list`` cli command (#41965)
Doc Only Changes
""""""""""""""""
- ``security_model.rst`` to clear unauthenticated endpoints exceptions (#42085)
v2.10.1
Compare Source
Significant Changes
^^^^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes
"""""""""
- ``__name__`` (#41699)
- ``tojson`` filter to example_inlet_event_extra example dag (#41890)
Miscellaneous
"""""""""""""
Doc Only Changes
""""""""""""""""
- ``keycloak`` (#41791)
v2.10.0
Compare Source
Significant Changes
^^^^^^^^^^^^^^^^^^^
Scarf based telemetry: Airflow now collects telemetry data (#39510)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Airflow integrates Scarf to collect basic usage data during operation. Deployments can opt out of data collection by
setting the ``[usage_data_collection] enabled`` option to ``False``, or by setting the ``SCARF_ANALYTICS=false``
environment variable.
"""""""""""""""""""""""""""""""""""""""""""""""""
Previously, when a DAG was paused or removed, incoming dataset events would still
trigger it, and the DAG would run once it was unpaused or added back in a DAG
file. This has been changed: a DAG's dataset schedule can now only be satisfied
by events that occur while the DAG is active. While this is a breaking change,
the previous behavior is considered a bug.
The behavior of time-based scheduling is unchanged, including the timetable part
of ``DatasetOrTimeSchedule``.
``try_number`` is no longer incremented during task execution (#39336)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Previously, the try number (``try_number``) was incremented at the beginning of task execution on the worker. This was problematic for many reasons.
For one, it meant that the try number was incremented when it was not supposed to be, namely when resuming from reschedule or deferral. It also resulted in
the try number being "wrong" when the task had not yet started. The workarounds for these two issues caused a lot of confusion.
Now, instead, the try number for a task run is determined at the time the task is scheduled; it does not change in flight, and it is never decremented.
So after the task runs, the observed try number remains the same as it was while the task was running; only when there is a "new try" is the try number incremented again.
One consequence of this change is that if users were "manually" running tasks (e.g. by calling ``ti.run()`` directly, or via the command line ``airflow tasks run``),
the try number will no longer be incremented. Airflow assumes that tasks are always run after being scheduled by the scheduler, so we do not regard this as a breaking change.
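A small probe of the new semantics (the task name is illustrative; ``ti`` is the TaskInstance injected from the task context):

```python
from airflow.decorators import task


@task(retries=2)
def probe_try_number(ti=None):
    # Under 2.10 semantics, ti.try_number is fixed when the task is
    # scheduled, so both prints show the same value within one attempt;
    # it only increases when a genuinely new try is scheduled.
    print(f"start of attempt: try_number={ti.try_number}")
    ...
    print(f"end of attempt:   try_number={ti.try_number}")
```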
``/logout`` endpoint in FAB Auth Manager is now CSRF protected (#40145)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The ``/logout`` endpoint's method in FAB Auth Manager has been changed from ``GET`` to ``POST`` in all existing
AuthViews (``AuthDBView``, ``AuthLDAPView``, ``AuthOAuthView``, ``AuthOIDView``, ``AuthRemoteUserView``), and
it now includes CSRF protection to enhance security and prevent unauthorized logouts.
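For scripted clients this means a bare ``GET /logout`` no longer works. A hedged sketch of the new flow using ``requests``; the hidden ``csrf_token`` form field is an assumption about the FAB login page markup, not a documented API:

```python
import re

import requests

BASE = "http://localhost:8080"  # assumed webserver address
session = requests.Session()

# Flask-WTF (used by FAB) embeds a hidden csrf_token field in its forms.
login_page = session.get(f"{BASE}/login/").text
csrf = re.search(r'name="csrf_token"[^>]*value="([^"]*)"', login_page).group(1)

session.post(
    f"{BASE}/login/",
    data={"username": "admin", "password": "admin", "csrf_token": csrf},
)

# Logging out is now a POST and must carry a valid CSRF token;
# a plain GET of /logout/ is rejected.
response = session.post(f"{BASE}/logout/", data={"csrf_token": csrf})
print(response.status_code)
```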
OpenTelemetry Traces for Apache Airflow (#37948).
"""""""""""""""""""""""""""""""""""""""""""""""""
This new feature adds the capability for Apache Airflow to emit 1) system traces of the scheduler,
triggerer, executor, and processor, and 2) DAG run traces for deployed DAG runs, in OpenTelemetry format. Previously, only metrics could be emitted in OpenTelemetry format.
This feature adds richer data, letting users apply the OpenTelemetry standard to emit and send their trace data to OTLP-compatible endpoints.
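A configuration sketch; the ``[traces]`` option names below mirror the existing OpenTelemetry metrics options and are assumptions to verify against the 2.10 configuration reference:

```ini
# airflow.cfg — emit traces to an OTLP-compatible collector
[traces]
otel_on = True
otel_host = localhost
otel_port = 8889
otel_service = Airflow
```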
Decorator for Task Flow (``@skip_if``, ``@run_if``) to make it simple to apply whether or not to skip a Task (#41116)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
This feature adds decorators to make it simple to skip a Task based on a runtime condition.
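A minimal sketch, assuming (per #41116) that ``run_if`` and ``skip_if`` are importable from ``airflow.decorators`` and wrap a ``@task`` with a predicate over the task context; the predicates here are illustrative:

```python
from airflow.decorators import run_if, skip_if, task


@run_if(lambda context: context["dag_run"].run_type == "manual")
@task
def only_on_manual_runs():
    print("executes only when the DAG run was triggered manually")


@skip_if(lambda context: context["dag_run"].conf.get("dry_run"))
@task
def skipped_on_dry_runs():
    print("skipped when the run was triggered with conf={'dry_run': true}")
```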
Using Multiple Executors Concurrently (#40701)
""""""""""""""""""""""""""""""""""""""""""""""
Previously known as hybrid executors, this new feature allows Airflow to use multiple executors concurrently. DAGs, or even individual tasks, can be configured
to use the specific executor that best suits their needs. A single DAG can contain tasks all using different executors. Note: this feature is still experimental. See the
documentation on executors (https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/executor/index.html#using-multiple-executors-concurrently)
for a more detailed description.
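A hedged sketch of per-task executor selection, assuming two executors are configured (e.g. ``AIRFLOW__CORE__EXECUTOR=LocalExecutor,CeleryExecutor``); the executor names and DAG structure are illustrative:

```python
from airflow.decorators import dag, task
from airflow.operators.bash import BashOperator


@dag(schedule=None)
def mixed_executors():
    @task(executor="LocalExecutor")
    def quick_step():
        print("lightweight work, kept on the local executor")

    heavy_step = BashOperator(
        task_id="heavy_step",
        bash_command="echo 'heavy work'",
        executor="CeleryExecutor",  # routed to the Celery workers
    )

    quick_step() >> heavy_step


mixed_executors()
```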
""""""""""""
- AIP-61 (https://github.com/apache/airflow/pulls?q=is%3Apr+label%3Aarea%3Ahybrid-executors+is%3Aclosed+milestone%3A%22Airflow+2.10.0%22)
- AIP-62 (https://github.com/apache/airflow/pulls?q=is%3Apr+is%3Amerged+label%3AAIP-62+milestone%3A%22Airflow+2.10.0%22)
- AIP-64 (https://github.com/apache/airflow/pulls?q=is%3Apr+is%3Amerged+label%3AAIP-64+milestone%3A%22Airflow+2.10.0%22)
- AIP-44 (https://github.com/apache/airflow/pulls?q=is%3Apr+label%3AAIP-44+milestone%3A%22Airflow+2.10.0%22+is%3Aclosed)
- ``accessors`` to read dataset events defined as inlet (#39367)
- ``dag test`` (#40010)
- ``endDate`` in task instance tooltip (#39547)
- ``accessors`` to read dataset events defined as inlet (#39367, #39893)
- ``run_if`` & ``skip_if`` decorators (#41116)
Improvements
""""""""""""
- ``renderedjson`` component (#40964)
- ``get_extra_dejson`` method with a nested parameter, which allows you to specify whether nested JSON-as-string values should also be deserialized (#39811)
- ``__getattr__`` to task decorator stub (#39425)
- ``RemovedIn20Warning`` in ``airflow task`` command (#39244)
- ``db migrate`` error messages (#39268)
- ``suppress_and_warn`` warning (#39263)
- ``declarative_base`` from ``sqlalchemy.orm`` instead of ``sqlalchemy.ext.declarative`` (#39134)
- ``on_task_instance_failed`` access to the error that caused the failure (#38155)
- ``output_processor`` parameter to ``BashProcessor`` (#40843)
Bug Fixes
"""""""""
- ``never_fail`` in BaseSensor (#40915)
- ``start_date`` (#40878)
- ``external_task_group_id`` to ``WorkflowTrigger`` (#39617)
- ``BaseSensorOperator`` introduce ``skip_policy`` parameter (#40924)
- ``__init__`` (#41086)
Miscellaneous
"""""""""""""
- ``OTel`` Traces (#40874)
- ``pydocstyle`` rules to pyproject.toml (#40569)
- ``pydocstyle`` rule D213 in ruff (#40448, #40464)
- ``Dag.test()`` to run with an executor if desired (#40205)
- ``AirflowInternalRuntimeError`` for raising ``non catchable`` errors (#38778)
- ``pytest`` to 8.0+ (#39450)
- ``back_populates`` between ``DagScheduleDatasetReference.dag`` and ``DagModel.schedule_dataset_references`` (#39392)
- ``B028`` (no-explicit-stacklevel) in core (#39123)
- ``ImportError`` to ``ParseImportError`` to avoid shadowing the builtin exception (#39116)
- ``SubDagOperator`` examples warnings (#39057)
- ``model_dump`` instead of ``dict`` for serializing Pydantic V2 models (#38933)
Configuration
📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).
🚦 Automerge: Enabled.
♻ Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
This PR has been generated by Renovate Bot.