Commits d23216a to 8bae37b
@tm-drtina What do you think of this diff:
diff --git a/pkgs/development/python-modules/apache-airflow/default.nix b/pkgs/development/python-modules/apache-airflow/default.nix
index 671dcc96f8e..20d739863eb 100644
--- a/pkgs/development/python-modules/apache-airflow/default.nix
+++ b/pkgs/development/python-modules/apache-airflow/default.nix
@@ -1,397 +1,314 @@
{ lib
, stdenv
+, python
, buildPythonPackage
-, callPackage
, fetchFromGitHub
-, fetchpatch
-, mkYarnPackage
-, python
-, pythonOlder
-, pythonAtLeast
-
-, apache-airflow-providers-ftp
-, apache-airflow-providers-http
-, apache-airflow-providers-imap
-, apache-airflow-providers-sqlite
-
, alembic
, argcomplete
, attrs
, blinker
, cached-property
, cattrs
-, configparser
-, colorlog_4
+, clickclick
+, colorlog
+, configupdater
, connexion
, cron-descriptor
, croniter
+, cryptography
+, dataclasses
, deprecated
, dill
, flask
+, flask_login
, flask-appbuilder
-, flask-admin
, flask-caching
-, flask-login_0_4
, flask-session
-, flask-swagger
, flask_wtf
-, flask-bcrypt
-, funcsigs
-, future
, GitPython
, graphviz
, gunicorn
, httpx
-, importlib-metadata
+, iso8601
, importlib-resources
+, importlib-metadata
+, inflection
+, itsdangerous
, jinja2
, jsonschema
, lazy-object-proxy
+, linkify-it-py
, lockfile
, markdown
, markupsafe
, marshmallow-oneofschema
-, packaging
+, mdit-py-plugins
+, numpy
+, openapi-spec-validator
+, pandas
, pathspec
, pendulum
-, pluggy
, psutil
, pygments
+, pyjwt
, python-daemon
, python-dateutil
, python-nvd3
, python-slugify
+, python3-openid
+, pythonOlder
+, pyyaml
, rich
, setproctitle
-, snakebite
, sqlalchemy
, sqlalchemy-jsonfield
+, swagger-ui-bundle
, tabulate
, tenacity
, termcolor
, typing-extensions
, unicodecsv
, werkzeug
-
-, beautifulsoup4
-, filelock
-, freezegun
-, jmespath
-, parameterized
-, pytest-asyncio
, pytestCheckHook
-}:
-
-let
- deselectedTestArgs = map (testPath: "--deselect '${testPath}'") deselectedTestPaths;
-
- # Following tests fail
- deselectedTestPaths = [
- "tests/always/test_connection.py::TestConnection::test_connection_test_hook_method_missing"
- "tests/always/test_connection.py::TestConnection::test_dbapi_get_sqlalchemy_engine"
- "tests/always/test_connection.py::TestConnection::test_dbapi_get_uri"
-
- "tests/api_connexion/endpoints/test_dag_run_endpoint.py::TestGetDagRunBatch"
- "tests/api_connexion/endpoints/test_dag_run_endpoint.py::TestGetDagRunBatchPagination"
- "tests/api_connexion/endpoints/test_dag_run_endpoint.py::TestGetDagRunBatchDateFilters"
- "tests/api_connexion/endpoints/test_dag_run_endpoint.py::TestGetDagRunsEndDateFilters"
- "tests/api_connexion/endpoints/test_dag_run_endpoint.py::TestGetDagRunsEndDateFilters"
- "tests/api_connexion/endpoints/test_dag_run_endpoint.py::TestGetDagRunsPaginationFilters"
- "tests/api_connexion/endpoints/test_dag_run_endpoint.py::TestPostDagRun"
- "tests/api_connexion/endpoints/test_mapped_task_instance_endpoint.py::TestGetMappedTaskInstances::test_mapped_task_instances_with_date"
- "tests/api_connexion/endpoints/test_task_instance_endpoint.py::TestGetTaskInstancesBatch"
- "tests/api_connexion/endpoints/test_task_instance_endpoint.py::TestPostClearTaskInstances"
- "tests/api_connexion/endpoints/test_task_instance_endpoint.py::TestGetTaskInstances" # Some fails
- "tests/api_connexion/schemas/test_task_instance_schema.py::TestClearTaskInstanceFormSchema::test_validation_error_1"
-
- "tests/core/test_configuration.py::TestConf::test_config_from_secret_backend"
- "tests/core/test_configuration.py::TestConf::test_config_raise_exception_from_secret_backend_connection_error"
- "tests/core/test_configuration.py::TestConf::test_get_section_should_respect_cmd_env_variable"
- "tests/core/test_configuration.py::TestConf::test_broker_transport_options"
- "tests/core/test_configuration.py::TestConf::test_auth_backends_adds_session"
- "tests/core/test_configuration.py::TestConf::test_enum_default_task_weight_rule_from_conf"
- "tests/core/test_configuration.py::TestConf::test_enum_logging_levels"
- "tests/core/test_configuration.py::TestConf::test_as_dict_works_without_sensitive_cmds"
- "tests/core/test_configuration.py::TestConf::test_as_dict_respects_sensitive_cmds_from_env"
- "tests/core/test_configuration.py::TestDeprecatedConf::test_deprecated_options_cmd"
- "tests/core/test_configuration.py::TestDeprecatedConf::test_conf_as_dict_when_both_conf_and_env_are_empty[True]"
- "tests/core/test_configuration.py::TestDeprecatedConf::test_conf_as_dict_when_both_conf_and_env_are_empty[False]"
- "tests/core/test_configuration.py::TestDeprecatedConf::test_conf_as_dict_when_deprecated_value_in_cmd_config[True]"
- "tests/core/test_configuration.py::TestDeprecatedConf::test_conf_as_dict_when_deprecated_value_in_cmd_config[False]"
- "tests/core/test_configuration.py::TestDeprecatedConf::test_conf_as_dict_when_deprecated_value_in_cmd_env[True]"
- "tests/core/test_configuration.py::TestDeprecatedConf::test_conf_as_dict_when_deprecated_value_in_cmd_env[False]"
- "tests/core/test_configuration.py::TestDeprecatedConf::test_conf_as_dict_when_deprecated_value_in_cmd_disabled_env[True]"
- "tests/core/test_configuration.py::TestDeprecatedConf::test_conf_as_dict_when_deprecated_value_in_cmd_disabled_env[False]"
- "tests/core/test_configuration.py::TestDeprecatedConf::test_conf_as_dict_when_deprecated_value_in_cmd_disabled_config[True]"
- "tests/core/test_configuration.py::TestDeprecatedConf::test_conf_as_dict_when_deprecated_value_in_cmd_disabled_config[False]"
- "tests/core/test_configuration.py::TestDeprecatedConf::test_conf_as_dict_when_deprecated_value_in_secrets_disabled_env[True]"
- "tests/core/test_configuration.py::TestDeprecatedConf::test_conf_as_dict_when_deprecated_value_in_secrets_disabled_env[False]"
- "tests/core/test_configuration.py::TestDeprecatedConf::test_conf_as_dict_when_deprecated_value_in_secrets_disabled_config[True]"
- "tests/core/test_configuration.py::TestDeprecatedConf::test_conf_as_dict_when_deprecated_value_in_secrets_disabled_config[False]"
- "tests/core/test_logging_config.py::TestLoggingSettings::test_loading_remote_logging_with_wasb_handler"
- "tests/core/test_logging_config.py::TestLoggingSettings::test_log_group_arns_remote_logging_with_cloudwatch_handler_0_cloudwatch_arn_aws_logs_aaaa_bbbbb_log_group_ccccc"
- "tests/core/test_logging_config.py::TestLoggingSettings::test_log_group_arns_remote_logging_with_cloudwatch_handler_1_cloudwatch_arn_aws_logs_aaaa_bbbbb_log_group_aws_ccccc"
- "tests/core/test_logging_config.py::TestLoggingSettings::test_log_group_arns_remote_logging_with_cloudwatch_handler_2_cloudwatch_arn_aws_logs_aaaa_bbbbb_log_group_aws_ecs_ccccc"
- "tests/core/test_providers_manager.py::TestProviderManager" # most of them
-
- "tests/decorators/test_python_virtualenv.py::TestPythonVirtualenvDecorator"
- "tests/hooks/test_subprocess.py::TestSubprocessHook"
- "tests/jobs/test_scheduler_job.py::TestSchedulerJob::test_extra_operator_links_not_loaded_in_scheduler_loop"
- "tests/operators/test_python.py::TestPythonVirtualenvOperator"
- "tests/models/test_taskinstance.py::TestMappedTaskInstanceReceiveValue::test_map_in_group"
- "tests/models/test_taskinstance.py::TestTaskInstance::test_render_k8s_pod_yaml"
- "tests/plugins/test_plugins_manager.py::TestPluginsRBAC"
- "tests/plugins/test_plugins_manager.py::TestPluginsManager::test_loads_filesystem_plugins"
- "tests/plugins/test_plugins_manager.py::TestPluginsManager::test_registering_plugin_listeners"
- "tests/plugins/test_plugins_manager.py::TestPluginsDirectorySource::test_should_return_correct_path_name"
- "tests/secrets/test_secrets.py::TestConnectionsFromSecrets::test_backend_fallback_to_env_var"
- "tests/secrets/test_secrets.py::TestConnectionsFromSecrets::test_backends_kwargs"
- "tests/secrets/test_secrets.py::TestConnectionsFromSecrets::test_initialize_secrets_backends"
- "tests/secrets/test_secrets.py::TestVariableFromSecrets::test_backend_variable_order"
- "tests/utils/test_cli_util.py::TestCliUtil::test_metrics_build" # $USER is not set?
- "tests/utils/test_process_utils.py::TestKillChildProcessesByPids" # ps is not on PATH
- "tests/www/api/experimental/test_dag_runs_endpoint.py::TestDagRunsEndpoint" # http 401
- "tests/www/api/experimental/test_endpoints.py::TestPoolApiExperimental" # http 401
- "tests/www/api/experimental/test_endpoints.py::TestApiExperimental" # http 401
- "tests/www/api/experimental/test_endpoints.py::TestLineageApiExperimental" # http 401
- "tests/www/views/test_views.py::test_configuration_expose_config"
- "tests/www/views/test_views.py::test_plugin_should_list_on_page_with_details"
- "tests/www/views/test_views_acl.py::test_dag_autocomplete_success"
- "tests/www/views/test_views_acl.py::test_permission_exist"
- "tests/www/views/test_views_acl.py::test_role_permission_associate"
- "tests/www/views/test_views_extra_links.py::test_global_extra_links_works" # http 404
- "tests/www/views/test_views_extra_links.py::test_extra_link_in_gantt_view" # Missing github?
- "tests/www/views/test_views_extra_links.py::test_operator_extra_link_override_plugin" # 1.10.5 links?
- "tests/www/views/test_views_extra_links.py::test_operator_extra_link_multiple_operators" # 1.10.5 links?
-
- # Missing provider
- "tests/task/task_runner/test_task_runner.py::GetTaskRunner::test_should_have_valid_imports_1_airflow_task_task_runner_cgroup_task_runner_CgroupTaskRunner"
- "tests/api/auth/test_client.py::TestGetCurrentApiClient::test_should_create_google_open_id_client"
-
- # Celery and Kubernetes missing
- "tests/executors/test_executor_loader.py::TestExecutorLoader::test_should_support_executor_from_core_0_CeleryExecutor"
- "tests/executors/test_executor_loader.py::TestExecutorLoader::test_should_support_executor_from_core_1_CeleryKubernetesExecutor"
- "tests/executors/test_executor_loader.py::TestExecutorLoader::test_should_support_executor_from_core_3_KubernetesExecutor"
- ] ++ lib.optionals stdenv.isDarwin [
- "tests/dag_processing/test_manager.py::TestDagFileProcessorAgent::test_parse_once"
- "tests/dag_processing/test_manager.py::TestDagFileProcessorManager::test_dag_with_system_exit"
- "tests/executors/test_local_executor.py::TestLocalExecutor::test_execution_limited_parallelism_fork"
- "tests/executors/test_local_executor.py::TestLocalExecutor::test_execution_subprocess_limited_parallelism"
- "tests/executors/test_local_executor.py::TestLocalExecutor::test_execution_subprocess_unlimited_parallelism"
- "tests/executors/test_local_executor.py::TestLocalExecutor::test_execution_unlimited_parallelism_fork"
- "tests/jobs/test_scheduler_job.py::TestSchedulerJob::test_dagrun_fail"
- "tests/jobs/test_scheduler_job.py::TestSchedulerJob::test_dagrun_success"
- "tests/jobs/test_scheduler_job.py::TestSchedulerJob::test_dagrun_root_fail"
- "tests/jobs/test_scheduler_job.py::TestSchedulerJob::test_dagrun_root_fail_unfinished"
- "tests/jobs/test_scheduler_job.py::TestSchedulerJob::test_dagrun_root_after_dagrun_unfinished"
- "tests/jobs/test_scheduler_job.py::TestSchedulerJob::test_dagrun_deadlock_ignore_depends_on_past_advance_ex_date"
- "tests/jobs/test_scheduler_job.py::TestSchedulerJob::test_dagrun_deadlock_ignore_depends_on_past"
- "tests/jobs/test_scheduler_job.py::TestSchedulerJob::test_scheduler_start_date[configs0]"
- "tests/jobs/test_scheduler_job.py::TestSchedulerJob::test_scheduler_start_date[configs1]"
- "tests/jobs/test_scheduler_job.py::TestSchedulerJob::test_scheduler_task_start_date[configs0]"
- "tests/jobs/test_scheduler_job.py::TestSchedulerJob::test_scheduler_task_start_date[configs1]"
- "tests/jobs/test_scheduler_job.py::TestSchedulerJob::test_scheduler_multiprocessing[configs0]"
- "tests/jobs/test_scheduler_job.py::TestSchedulerJob::test_scheduler_multiprocessing[configs1]"
- "tests/jobs/test_scheduler_job.py::TestSchedulerJob::test_retry_still_in_executor"
- "tests/operators/test_bash.py::TestBashOperator::test_bash_operator_kill"
- "tests/utils/test_process_utils.py::TestCheckIfPidfileProcessIsRunning::test_remove_if_no_process"
- "tests/utils/test_dates.py::TestDates::test_days_ago"
- ];
+, freezegun
+, mkYarnPackage
+, writeScript
+# Extra airflow providers to enable
+, enabledProviders ? []
+}:
+let
+ version = "2.4.1";
+ airflow-src = fetchFromGitHub rec {
+ owner = "apache";
+ repo = "airflow";
+ rev = "refs/tags/${version}";
+ # Required because the GitHub archive tarballs don't appear to include tests
+ leaveDotGit = true;
+ sha256 = "sha256-HpPL/ocV7hRhYXsjfXMYvlP83Vh15kXyjBgubsaqaE8=";
+ };
+
+ # airflow bundles a web interface, which is built using webpack by an undocumented shell script in airflow's source tree.
+ # This derivation replicates that shell script, fixing bugs in yarn.lock and package.json
+ airflow-frontend = mkYarnPackage {
+ name = "airflow-frontend";
+ src = "${airflow-src}/airflow/www";
+ packageJSON = ./package.json;
+ yarnLock = ./yarn.lock;
+ yarnNix = ./yarn.nix;
+ distPhase = "true";
+
+ # The webpack license plugin tries to create /licenses when given the
+ # original relative path
+ postPatch = ''
+ sed -i 's!../../../../licenses/LICENSES-ui.txt!licenses/LICENSES-ui.txt!' webpack.config.js
+ '';
+
+ configurePhase = ''
+ cp -r $node_modules node_modules
+ '';
+
+ buildPhase = ''
+ yarn --offline build
+ find package.json yarn.lock static/css static/js -type f | sort | xargs md5sum > static/dist/sum.md5
+ '';
+
+ installPhase = ''
+ mkdir -p $out/static/
+ cp -r static/dist $out/static
+ '';
+ };
+
+ # Import generated file with metadata for provider dependencies and imports.
+ # Enable additional providers using enabledProviders above.
+ providers = import ./providers.nix;
+ getProviderDeps = provider: map (dep: python.pkgs.${dep}) providers.${provider}.deps;
+ getProviderImports = provider: providers.${provider}.imports;
+ providerDependencies = lib.concatMap getProviderDeps enabledProviders;
+ providerImports = lib.concatMap getProviderImports enabledProviders;
in
buildPythonPackage rec {
pname = "apache-airflow";
- version = "2.3.2";
- disabled = pythonOlder "3.7" || pythonAtLeast "3.11";
-
- src = fetchFromGitHub rec {
- owner = "apache";
- repo = "airflow";
- rev = version;
- sha256 = "sha256-tI6dX2JVviEFDNKxDHRulE+kOFr/X2lakk0M3z1CRjs=";
-
- # HACK: To zip sources doesn't have tests folder.
- # This forces git fetch instead of zip fetch.
- fetchSubmodules = true;
- };
-
- patches = [
- (fetchpatch {
- url = "https://github.com/apache/airflow/commit/9f58e823329d525c0e2b3950ada7e0e047ee7cfd.patch";
- sha256 = "sha256-RrqFuFvLSNX+30kKsWrNW/BaRi3+g+P6STlKaMhWKng=";
- })
- ];
+ inherit version;
+ src = airflow-src;
+ disabled = pythonOlder "3.6";
propagatedBuildInputs = [
- apache-airflow-providers-ftp
- apache-airflow-providers-http
- apache-airflow-providers-imap
- apache-airflow-providers-sqlite
-
alembic
argcomplete
attrs
blinker
+ cached-property
cattrs
- colorlog_4
+ clickclick
+ colorlog
+ configupdater
connexion
cron-descriptor
croniter
+ cryptography
deprecated
dill
flask
flask-appbuilder
flask-caching
- flask-login_0_4
flask-session
flask_wtf
+ flask_login
GitPython
graphviz
gunicorn
httpx
+ iso8601
+ importlib-resources
+ inflection
+ itsdangerous
jinja2
jsonschema
lazy-object-proxy
+ linkify-it-py
lockfile
markdown
markupsafe
marshmallow-oneofschema
- packaging
+ mdit-py-plugins
+ numpy
+ openapi-spec-validator
+ pandas
pathspec
pendulum
- pluggy
psutil
pygments
+ pyjwt
python-daemon
python-dateutil
python-nvd3
python-slugify
+ python3-openid
+ pyyaml
rich
setproctitle
sqlalchemy
sqlalchemy-jsonfield
+ swagger-ui-bundle
tabulate
tenacity
termcolor
typing-extensions
unicodecsv
werkzeug
- ] ++ lib.optional (pythonOlder "3.8") [
- cached-property
- ] ++ lib.optional (pythonOlder "3.9") [
+ ] ++ lib.optionals (pythonOlder "3.7") [
+ dataclasses
+ ] ++ lib.optionals (pythonOlder "3.9") [
importlib-metadata
- importlib-resources
+ ] ++ providerDependencies;
+
+ buildInputs = [
+ airflow-frontend
];
checkInputs = [
- beautifulsoup4
- filelock
freezegun
- jmespath
- parameterized
- pytest-asyncio
pytestCheckHook
];
- # allow for gunicorn processes to have access to python packages
- makeWrapperArgs = [ "--prefix PYTHONPATH : $PYTHONPATH" ];
-
- pytestFlagsArray = [
- "--disable-pytest-warnings"
- ] ++ deselectedTestArgs;
-
- ## Following tests fail while init due to import error
- disabledTestPaths = [
- # Requires celery
- "tests/cli/"
- "tests/executors/test_celery_executor.py"
- "tests/executors/test_celery_kubernetes_executor.py"
- "tests/executors/test_kubernetes_executor.py"
- "tests/executors/test_local_kubernetes_executor.py"
- "tests/www/views/test_views_dagrun.py"
- "tests/www/views/test_views_tasks.py"
-
- # Requires docker
- "docker_tests/"
-
- # Requires kubernetes
- "kubernetes_tests/"
- "tests/kubernetes/"
- "tests/serialization/test_dag_serialization.py"
-
- # Requires dask
- "tests/executors/test_dask_executor.py"
-
- # Requires cgroupspy
- "tests/task/task_runner/test_cgroup_task_runner.py"
-
- # Requires psycopg
- "tests/operators/test_generic_transfer.py"
- "tests/operators/test_sql.py"
-
- # Requires statsd
- "tests/core/test_stats.py"
-
- # Requires numpy
- "tests/utils/test_json.py"
-
- # Requires pandas
- "tests/sensors/test_sql_sensor.py"
-
- # Requires sentry_sdk
- "tests/core/test_sentry.py"
-
- # Most of them requires helm/kubernetes
- "tests/charts/"
-
- # Requires google
- "tests/api_connexion/endpoints/test_extra_link_endpoint.py"
-
- # Requires kerberos
- "tests/api/auth/backend/test_kerberos_auth.py"
-
- # Requires breeze
- "dev/breeze/tests/"
-
- # Uses all providers, but not all are installed
- "tests/always/test_deprecations.py"
- "tests/always/test_example_dags.py"
- "tests/providers/"
- "tests/system/providers/"
- ];
+ # By default, source code of providers is included but unusable due to missing
+ # transitive dependencies. To enable a provider, add it to enabledProviders
+ # above
+ INSTALL_PROVIDERS_FROM_SOURCES = "true";
+
+ postPatch = ''
+ substituteInPlace setup.cfg \
+ --replace "colorlog>=4.0.2, <5.0" "colorlog" \
+ --replace "pathspec~=0.9.0" "pathspec"
+ '' + lib.optionalString stdenv.isDarwin ''
+ # Fix failing test on Hydra
+ substituteInPlace airflow/utils/db.py \
+ --replace "/tmp/sqlite_default.db" "$TMPDIR/sqlite_default.db"
+ '';
+
+ # allow for gunicorn processes to have access to Python packages
+ makeWrapperArgs = [
+ "--prefix PYTHONPATH : $PYTHONPATH"
+ ];
+
+ postInstall = ''
+ cp -rv ${airflow-frontend}/static/dist $out/lib/${python.libPrefix}/site-packages/airflow/www/static
+ # Needed for pythonImportsCheck below
+ export HOME=$(mktemp -d)
+ '';
+
+ pythonImportsCheck = [
+ "airflow"
+ ] ++ providerImports;
preCheck = ''
- export HOME=$(mktemp -d)
export AIRFLOW_HOME=$HOME
export AIRFLOW__CORE__UNIT_TEST_MODE=True
+ export AIRFLOW_DB="$HOME/airflow.db"
export PATH=$PATH:$out/bin
- airflow version
- airflow db init
- airflow db reset -y
+ airflow version
+ airflow db init
+ airflow db reset -y
'';
- postInstall = let
- frontend = callPackage ./frontend.nix {
- inherit mkYarnPackage;
- src = "${src}/airflow/www";
- };
- in ''
- cp -rv ${frontend}/static/dist $out/${python.sitePackages}/airflow/www/static
- '';
+ pytestFlagsArray = [
+ "tests/core/test_core.py"
+ ];
+
+ disabledTests = lib.optionals stdenv.isDarwin [
+ "bash_operator_kill" # psutil.AccessDenied
+ ];
+
+ # Updates yarn.lock and package.json
+ passthru.updateScript = writeScript "update.sh" ''
+ #!/usr/bin/env nix-shell
+ #!nix-shell -i bash -p common-updater-scripts curl pcre "python3.withPackages (ps: with ps; [ pyyaml ])" yarn2nix
+
+ set -euo pipefail
+
+ # Get new version
+ new_version="$(curl -s https://airflow.apache.org/docs/apache-airflow/stable/release_notes.html |
+ pcregrep -o1 'Airflow ([0-9.]+).' | head -1)"
+ update-source-version ${pname} "$new_version"
+
+ # Update frontend
+ cd ./pkgs/development/python-modules/apache-airflow
+ curl -O https://raw.githubusercontent.com/apache/airflow/$new_version/airflow/www/yarn.lock
+ curl -O https://raw.githubusercontent.com/apache/airflow/$new_version/airflow/www/package.json
+ # Note: for 2.3.4 a manual change was needed to get a fully resolved URL for
+ # caniuse-lite@1.0.30001312 (with the sha after the #); yarn otherwise failed
+ # with 'Can't make a request in offline mode'. The install was corrected by
+ # manually running yarn add caniuse-lite@1.0.30001312 and copying the
+ # requisite section from the generated yarn.lock.
+ yarn2nix > yarn.nix
+
+ # update provider dependencies
+ ./update-providers.py
+ '';
+
+ # Note on testing the web UI:
+ # You can (manually) test the web UI as follows:
+ #
+ # nix shell .#python3Packages.apache-airflow
+ # airflow db init
+ # airflow db reset -y # WARNING: this will wipe any existing db state you might have!
+ # airflow standalone
+ #
+ # Then navigate to the localhost URL using the credentials printed, try
+ # triggering the 'example_bash_operator' DAG and see if it reports success.
meta = with lib; {
description = "Programmatically author, schedule and monitor data pipelines";
- homepage = "http://airflow.apache.org/";
+ homepage = "https://airflow.apache.org/";
license = licenses.asl20;
- maintainers = with maintainers; [ bhipple costrouc ingenieroariel ];
+ maintainers = with maintainers; [ bhipple gbpdt ingenieroariel ];
};
}
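
For reviewers wanting to try the new `enabledProviders` argument: here is a minimal usage sketch. The `.override` call and the concrete provider names are my assumptions based on the `providers.nix` in this diff, not something added by the PR itself.

```nix
# Hypothetical usage sketch: build airflow with two extra providers enabled.
# Assumes the package is reachable as python3.pkgs.apache-airflow and that
# the callPackage-level .override is available; provider names must match
# attribute names in the generated providers.nix.
python3.pkgs.apache-airflow.override {
  enabledProviders = [ "amazon" "cncf_kubernetes" ];
}
```

With this, `providerDependencies` pulls each provider's Python deps into `propagatedBuildInputs` and `providerImports` extends `pythonImportsCheck`, so a broken provider fails the build at import-check time.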
diff --git a/pkgs/development/python-modules/apache-airflow/frontend.nix b/pkgs/development/python-modules/apache-airflow/frontend.nix
deleted file mode 100644
index 32c4f6034fb..00000000000
--- a/pkgs/development/python-modules/apache-airflow/frontend.nix
+++ /dev/null
@@ -1,31 +0,0 @@
-{ src, mkYarnPackage }:
-
-let
- # HACK: these fields are missing in upstream package.json, but are required by mkYarnPackage
- additionalFields = {
- name = "airflow-frontend";
- version = "1.0.0";
- };
- packageJSON = builtins.fromJSON (builtins.readFile "${src}/package.json");
- patchedPackageJSON = builtins.toFile "package.json" (builtins.toJSON (packageJSON // additionalFields));
-in
-mkYarnPackage {
- name = "airflow-frontend";
- inherit src;
- packageJSON = patchedPackageJSON;
- packageResolutions = packageJSON.resolutions;
- doDist = false;
-
- configurePhase = ''
- cp -r $node_modules node_modules
- '';
-
- buildPhase = ''
- yarn --offline build
- '';
-
- installPhase = ''
- mkdir -p $out/static/
- cp -r static/dist $out/static
- '';
-}
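
For context on what the deleted frontend.nix did: upstream's package.json lacks the `name` and `version` fields that mkYarnPackage requires, so it merged them in at evaluation time. A condensed sketch of that (now removed) trick, with an illustrative `src` path:

```nix
# Sketch of the removed workaround: merge the missing fields into the
# upstream package.json before handing it to mkYarnPackage.
# The src path here is illustrative; the real one came from airflow's tree.
let
  src = ./airflow/www;
  packageJSON = builtins.fromJSON (builtins.readFile "${src}/package.json");
  patched = packageJSON // { name = "airflow-frontend"; version = "1.0.0"; };
in
builtins.toFile "package.json" (builtins.toJSON patched)
```

The new approach instead checks a patched package.json in next to the Nix expression, which is easier to pin reproducibly but relies on the updateScript to keep it in sync with upstream.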
diff --git a/pkgs/development/python-modules/apache-airflow/providers.nix b/pkgs/development/python-modules/apache-airflow/providers.nix
new file mode 100644
index 00000000000..3c8205cfcb6
--- /dev/null
+++ b/pkgs/development/python-modules/apache-airflow/providers.nix
@@ -0,0 +1,311 @@
+# Warning: generated by update-providers.py, do not update manually
+{
+ airbyte = {
+ deps = [ "requests" "requests-toolbelt" ];
+ imports = [ "airflow.providers.airbyte.hooks.airbyte" "airflow.providers.airbyte.operators.airbyte" ];
+ };
+ alibaba = {
+ deps = [ "oss2" ];
+ imports = [ "airflow.providers.alibaba.cloud.hooks.oss" "airflow.providers.alibaba.cloud.operators.oss" ];
+ };
+ amazon = {
+ deps = [ "apache-beam" "azure-batch" "azure-cosmos" "azure-datalake-store" "azure-identity" "azure-keyvault-secrets" "azure-mgmt-containerinstance" "azure-mgmt-datafactory" "azure-mgmt-datalake-store" "azure-mgmt-resource" "azure-servicebus" "azure-storage-blob" "azure-storage-common" "azure-storage-file" "azure-synapse-spark" "boto3" "cassandra-driver" "cryptography" "dnspython" "google-api-core" "google-api-python-client" "google-auth" "google-auth-httplib2" "google-cloud-automl" "google-cloud-bigquery-datatransfer" "google-cloud-bigtable" "google-cloud-container" "google-cloud-datacatalog" "google-cloud-dataproc" "google-cloud-dlp" "google-cloud-kms" "google-cloud-language" "google-cloud-logging" "google-cloud-monitoring" "google-cloud-pubsub" "google-cloud-redis" "google-cloud-secret-manager" "google-cloud-spanner" "google-cloud-speech" "google-cloud-storage" "google-cloud-tasks" "google-cloud-texttospeech" "google-cloud-translate" "google-cloud-videointelligence" "google-cloud-vision" "grpcio-gcp" "httpx" "json-merge-patch" "jsonpath-ng" "kubernetes" "mysqlclient" "pandas" "paramiko" "proto-plus" "protobuf" "psycopg2" "pymongo" "pyopenssl" "pysftp" "simple-salesforce" "smbprotocol" "sshtunnel" "thrift" "vertica-python" ];
+ imports = [ "airflow.providers.amazon.aws.hooks.appflow" "airflow.providers.amazon.aws.hooks.athena" "airflow.providers.amazon.aws.hooks.base_aws" "airflow.providers.amazon.aws.hooks.batch_client" "airflow.providers.amazon.aws.hooks.batch_waiters" "airflow.providers.amazon.aws.hooks.cloud_formation" "airflow.providers.amazon.aws.hooks.datasync" "airflow.providers.amazon.aws.hooks.dms" "airflow.providers.amazon.aws.hooks.dynamodb" "airflow.providers.amazon.aws.hooks.ec2" "airflow.providers.amazon.aws.hooks.ecs" "airflow.providers.amazon.aws.hooks.eks" "airflow.providers.amazon.aws.hooks.elasticache_replication_group" "airflow.providers.amazon.aws.hooks.emr" "airflow.providers.amazon.aws.hooks.emr" "airflow.providers.amazon.aws.hooks.glacier" "airflow.providers.amazon.aws.hooks.glue" "airflow.providers.amazon.aws.hooks.glue_catalog" "airflow.providers.amazon.aws.hooks.glue_crawler" "airflow.providers.amazon.aws.hooks.kinesis" "airflow.providers.amazon.aws.hooks.lambda_function" "airflow.providers.amazon.aws.hooks.logs" "airflow.providers.amazon.aws.hooks.quicksight" "airflow.providers.amazon.aws.hooks.rds" "airflow.providers.amazon.aws.hooks.redshift_cluster" "airflow.providers.amazon.aws.hooks.redshift_data" "airflow.providers.amazon.aws.hooks.redshift_sql" "airflow.providers.amazon.aws.hooks.s3" "airflow.providers.amazon.aws.hooks.sagemaker" "airflow.providers.amazon.aws.hooks.secrets_manager" "airflow.providers.amazon.aws.hooks.ses" "airflow.providers.amazon.aws.hooks.sns" "airflow.providers.amazon.aws.hooks.sqs" "airflow.providers.amazon.aws.hooks.step_function" "airflow.providers.amazon.aws.hooks.sts" "airflow.providers.amazon.aws.operators.appflow" "airflow.providers.amazon.aws.operators.athena" "airflow.providers.amazon.aws.operators.aws_lambda" "airflow.providers.amazon.aws.operators.batch" "airflow.providers.amazon.aws.operators.cloud_formation" "airflow.providers.amazon.aws.operators.datasync" "airflow.providers.amazon.aws.operators.dms" 
+ "airflow.providers.amazon.aws.operators.ec2" "airflow.providers.amazon.aws.operators.ecs" "airflow.providers.amazon.aws.operators.eks" "airflow.providers.amazon.aws.operators.emr" "airflow.providers.amazon.aws.operators.emr" "airflow.providers.amazon.aws.operators.glacier" "airflow.providers.amazon.aws.operators.glue" "airflow.providers.amazon.aws.operators.glue_crawler" "airflow.providers.amazon.aws.operators.lambda_function" "airflow.providers.amazon.aws.operators.quicksight" "airflow.providers.amazon.aws.operators.rds" "airflow.providers.amazon.aws.operators.redshift_cluster" "airflow.providers.amazon.aws.operators.redshift_data" "airflow.providers.amazon.aws.operators.redshift_sql" "airflow.providers.amazon.aws.operators.s3" "airflow.providers.amazon.aws.operators.sagemaker" "airflow.providers.amazon.aws.operators.sns" "airflow.providers.amazon.aws.operators.sqs" "airflow.providers.amazon.aws.operators.step_function" ];
+ };
+ apache_beam = {
+ deps = [ "apache-beam" "azure-batch" "azure-cosmos" "azure-datalake-store" "azure-identity" "azure-keyvault-secrets" "azure-mgmt-containerinstance" "azure-mgmt-datafactory" "azure-mgmt-datalake-store" "azure-mgmt-resource" "azure-servicebus" "azure-storage-blob" "azure-storage-common" "azure-storage-file" "azure-synapse-spark" "boto3" "cassandra-driver" "cryptography" "dnspython" "google-api-core" "google-api-python-client" "google-auth" "google-auth-httplib2" "google-cloud-automl" "google-cloud-bigquery-datatransfer" "google-cloud-bigtable" "google-cloud-container" "google-cloud-datacatalog" "google-cloud-dataproc" "google-cloud-dlp" "google-cloud-kms" "google-cloud-language" "google-cloud-logging" "google-cloud-monitoring" "google-cloud-pubsub" "google-cloud-redis" "google-cloud-secret-manager" "google-cloud-spanner" "google-cloud-speech" "google-cloud-storage" "google-cloud-tasks" "google-cloud-texttospeech" "google-cloud-translate" "google-cloud-videointelligence" "google-cloud-vision" "grpcio-gcp" "httpx" "json-merge-patch" "jsonpath-ng" "kubernetes" "mysqlclient" "pandas" "paramiko" "proto-plus" "protobuf" "psycopg2" "pymongo" "pyopenssl" "pysftp" "simple-salesforce" "smbprotocol" "sshtunnel" "thrift" "vertica-python" ];
+ imports = [ "airflow.providers.apache.beam.hooks.beam" "airflow.providers.apache.beam.operators.beam" ];
+ };
+ apache_cassandra = {
+ deps = [ "cassandra-driver" ];
+ imports = [ "airflow.providers.apache.cassandra.hooks.cassandra" ];
+ };
+ apache_drill = {
+ deps = [ ];
+ imports = [ "airflow.providers.apache.drill.hooks.drill" "airflow.providers.apache.drill.operators.drill" ];
+ };
+ apache_druid = {
+ deps = [ "apache-beam" "azure-batch" "azure-cosmos" "azure-datalake-store" "azure-identity" "azure-keyvault-secrets" "azure-mgmt-containerinstance" "azure-mgmt-datafactory" "azure-mgmt-datalake-store" "azure-mgmt-resource" "azure-servicebus" "azure-storage-blob" "azure-storage-common" "azure-storage-file" "azure-synapse-spark" "boto3" "cassandra-driver" "cryptography" "dnspython" "google-api-core" "google-api-python-client" "google-auth" "google-auth-httplib2" "google-cloud-automl" "google-cloud-bigquery-datatransfer" "google-cloud-bigtable" "google-cloud-container" "google-cloud-datacatalog" "google-cloud-dataproc" "google-cloud-dlp" "google-cloud-kms" "google-cloud-language" "google-cloud-logging" "google-cloud-monitoring" "google-cloud-pubsub" "google-cloud-redis" "google-cloud-secret-manager" "google-cloud-spanner" "google-cloud-speech" "google-cloud-storage" "google-cloud-tasks" "google-cloud-texttospeech" "google-cloud-translate" "google-cloud-videointelligence" "google-cloud-vision" "grpcio-gcp" "httpx" "json-merge-patch" "jsonpath-ng" "kubernetes" "mysqlclient" "pandas" "paramiko" "proto-plus" "protobuf" "psycopg2" "pymongo" "pyopenssl" "pysftp" "simple-salesforce" "smbprotocol" "sshtunnel" "thrift" "vertica-python" ];
+ imports = [ "airflow.providers.apache.druid.hooks.druid" "airflow.providers.apache.druid.operators.druid" "airflow.providers.apache.druid.operators.druid_check" ];
+ };
+ apache_hdfs = {
+ deps = [ ];
+ imports = [ "airflow.providers.apache.hdfs.hooks.hdfs" "airflow.providers.apache.hdfs.hooks.webhdfs" ];
+ };
+ apache_hive = {
+ deps = [ "apache-beam" "azure-batch" "azure-cosmos" "azure-datalake-store" "azure-identity" "azure-keyvault-secrets" "azure-mgmt-containerinstance" "azure-mgmt-datafactory" "azure-mgmt-datalake-store" "azure-mgmt-resource" "azure-servicebus" "azure-storage-blob" "azure-storage-common" "azure-storage-file" "azure-synapse-spark" "boto3" "cassandra-driver" "cryptography" "dnspython" "google-api-core" "google-api-python-client" "google-auth" "google-auth-httplib2" "google-cloud-automl" "google-cloud-bigquery-datatransfer" "google-cloud-bigtable" "google-cloud-container" "google-cloud-datacatalog" "google-cloud-dataproc" "google-cloud-dlp" "google-cloud-kms" "google-cloud-language" "google-cloud-logging" "google-cloud-monitoring" "google-cloud-pubsub" "google-cloud-redis" "google-cloud-secret-manager" "google-cloud-spanner" "google-cloud-speech" "google-cloud-storage" "google-cloud-tasks" "google-cloud-texttospeech" "google-cloud-translate" "google-cloud-videointelligence" "google-cloud-vision" "grpcio-gcp" "httpx" "json-merge-patch" "jsonpath-ng" "kubernetes" "mysqlclient" "pandas" "paramiko" "proto-plus" "protobuf" "psycopg2" "pymongo" "pyopenssl" "pysftp" "simple-salesforce" "smbprotocol" "sshtunnel" "thrift" "vertica-python" ];
+ imports = [ "airflow.providers.apache.hive.hooks.hive" "airflow.providers.apache.hive.operators.hive" "airflow.providers.apache.hive.operators.hive_stats" ];
+ };
+ apache_kylin = {
+ deps = [ ];
+ imports = [ "airflow.providers.apache.kylin.hooks.kylin" "airflow.providers.apache.kylin.operators.kylin_cube" ];
+ };
+ apache_livy = {
+ deps = [ "requests" "requests-toolbelt" ];
+ imports = [ "airflow.providers.apache.livy.hooks.livy" "airflow.providers.apache.livy.operators.livy" ];
+ };
+ apache_pig = {
+ deps = [ ];
+ imports = [ "airflow.providers.apache.pig.hooks.pig" "airflow.providers.apache.pig.operators.pig" ];
+ };
+ apache_pinot = {
+ deps = [ "ciso8601" ];
+ imports = [ "airflow.providers.apache.pinot.hooks.pinot" ];
+ };
+ apache_spark = {
+ deps = [ "pyspark" ];
+ imports = [ "airflow.providers.apache.spark.hooks.spark_jdbc" "airflow.providers.apache.spark.hooks.spark_jdbc_script" "airflow.providers.apache.spark.hooks.spark_sql" "airflow.providers.apache.spark.hooks.spark_submit" "airflow.providers.apache.spark.operators.spark_jdbc" "airflow.providers.apache.spark.operators.spark_sql" "airflow.providers.apache.spark.operators.spark_submit" ];
+ };
+ apache_sqoop = {
+ deps = [ ];
+ imports = [ "airflow.providers.apache.sqoop.hooks.sqoop" "airflow.providers.apache.sqoop.operators.sqoop" ];
+ };
+ arangodb = {
+ deps = [ ];
+ imports = [ "airflow.providers.arangodb.hooks.arangodb" "airflow.providers.arangodb.operators.arangodb" ];
+ };
+ asana = {
+ deps = [ "asana" ];
+ imports = [ "airflow.providers.asana.hooks.asana" "airflow.providers.asana.operators.asana_tasks" ];
+ };
+ atlassian_jira = {
+ deps = [ "jira" ];
+ imports = [ "airflow.providers.atlassian.jira.hooks.jira" "airflow.providers.atlassian.jira.operators.jira" ];
+ };
+ celery = {
+ deps = [ "celery" "flower" ];
+ imports = [ ];
+ };
+ cloudant = {
+ deps = [ ];
+ imports = [ "airflow.providers.cloudant.hooks.cloudant" ];
+ };
+ cncf_kubernetes = {
+ deps = [ "cryptography" "kubernetes" ];
+ imports = [ "airflow.providers.cncf.kubernetes.hooks.kubernetes" "airflow.providers.cncf.kubernetes.operators.kubernetes_pod" "airflow.providers.cncf.kubernetes.operators.spark_kubernetes" ];
+ };
+ common_sql = {
+ deps = [ "sqlparse" ];
+ imports = [ "airflow.providers.common.sql.hooks.sql" "airflow.providers.common.sql.operators.sql" ];
+ };
+ databricks = {
+ deps = [ "aiohttp" "databricks-sql-connector" "requests" ];
+ imports = [ "airflow.providers.databricks.hooks.databricks" "airflow.providers.databricks.hooks.databricks_base" "airflow.providers.databricks.hooks.databricks_sql" "airflow.providers.databricks.operators.databricks" "airflow.providers.databricks.operators.databricks_repos" "airflow.providers.databricks.operators.databricks_sql" ];
+ };
+ datadog = {
+ deps = [ "datadog" ];
+ imports = [ "airflow.providers.datadog.hooks.datadog" ];
+ };
+ dbt_cloud = {
+ deps = [ "requests" "requests-toolbelt" ];
+ imports = [ "airflow.providers.dbt.cloud.hooks.dbt" "airflow.providers.dbt.cloud.operators.dbt" ];
+ };
+ dingding = {
+ deps = [ "requests" "requests-toolbelt" ];
+ imports = [ "airflow.providers.dingding.hooks.dingding" "airflow.providers.dingding.operators.dingding" ];
+ };
+ discord = {
+ deps = [ "requests" "requests-toolbelt" ];
+ imports = [ "airflow.providers.discord.hooks.discord_webhook" "airflow.providers.discord.operators.discord_webhook" ];
+ };
+ docker = {
+ deps = [ "docker" ];
+ imports = [ "airflow.providers.docker.hooks.docker" "airflow.providers.docker.operators.docker" "airflow.providers.docker.operators.docker_swarm" ];
+ };
+ elasticsearch = {
+ deps = [ "elasticsearch" "elasticsearch-dsl" ];
+ imports = [ "airflow.providers.elasticsearch.hooks.elasticsearch" ];
+ };
+ exasol = {
+ deps = [ "pandas" ];
+ imports = [ "airflow.providers.exasol.hooks.exasol" "airflow.providers.exasol.operators.exasol" ];
+ };
+ facebook = {
+ deps = [ ];
+ imports = [ "airflow.providers.facebook.ads.hooks.ads" ];
+ };
+ ftp = {
+ deps = [ ];
+ imports = [ "airflow.providers.ftp.hooks.ftp" ];
+ };
+ github = {
+ deps = [ "PyGithub" ];
+ imports = [ "airflow.providers.github.hooks.github" "airflow.providers.github.operators.github" ];
+ };
+ google = {
+ deps = [ "apache-beam" "azure-batch" "azure-cosmos" "azure-datalake-store" "azure-identity" "azure-keyvault-secrets" "azure-mgmt-containerinstance" "azure-mgmt-datafactory" "azure-mgmt-datalake-store" "azure-mgmt-resource" "azure-servicebus" "azure-storage-blob" "azure-storage-common" "azure-storage-file" "azure-synapse-spark" "boto3" "cassandra-driver" "cryptography" "dnspython" "google-api-core" "google-api-python-client" "google-auth" "google-auth-httplib2" "google-cloud-automl" "google-cloud-bigquery-datatransfer" "google-cloud-bigtable" "google-cloud-container" "google-cloud-datacatalog" "google-cloud-dataproc" "google-cloud-dlp" "google-cloud-kms" "google-cloud-language" "google-cloud-logging" "google-cloud-monitoring" "google-cloud-pubsub" "google-cloud-redis" "google-cloud-secret-manager" "google-cloud-spanner" "google-cloud-speech" "google-cloud-storage" "google-cloud-tasks" "google-cloud-texttospeech" "google-cloud-translate" "google-cloud-videointelligence" "google-cloud-vision" "grpcio-gcp" "httpx" "json-merge-patch" "jsonpath-ng" "kubernetes" "mysqlclient" "pandas" "paramiko" "proto-plus" "protobuf" "psycopg2" "pymongo" "pyopenssl" "pysftp" "simple-salesforce" "smbprotocol" "sshtunnel" "thrift" "vertica-python" ];
+    imports = [ "airflow.providers.google.ads.hooks.ads" "airflow.providers.google.ads.operators.ads" "airflow.providers.google.cloud.hooks.automl" "airflow.providers.google.cloud.hooks.bigquery" "airflow.providers.google.cloud.hooks.bigquery_dts" "airflow.providers.google.cloud.hooks.bigtable" "airflow.providers.google.cloud.hooks.cloud_build" "airflow.providers.google.cloud.hooks.cloud_composer" "airflow.providers.google.cloud.hooks.cloud_memorystore" "airflow.providers.google.cloud.hooks.cloud_sql" "airflow.providers.google.cloud.hooks.cloud_storage_transfer_service" "airflow.providers.google.cloud.hooks.compute" "airflow.providers.google.cloud.hooks.compute_ssh" "airflow.providers.google.cloud.hooks.datacatalog" "airflow.providers.google.cloud.hooks.dataflow" "airflow.providers.google.cloud.hooks.dataform" "airflow.providers.google.cloud.hooks.datafusion" "airflow.providers.google.cloud.hooks.dataplex" "airflow.providers.google.cloud.hooks.dataprep" "airflow.providers.google.cloud.hooks.dataproc" "airflow.providers.google.cloud.hooks.dataproc_metastore" "airflow.providers.google.cloud.hooks.datastore" "airflow.providers.google.cloud.hooks.dlp" "airflow.providers.google.cloud.hooks.functions" "airflow.providers.google.cloud.hooks.gcs" "airflow.providers.google.cloud.hooks.gdm" "airflow.providers.google.cloud.hooks.kms" "airflow.providers.google.cloud.hooks.kubernetes_engine" "airflow.providers.google.cloud.hooks.life_sciences" "airflow.providers.google.cloud.hooks.looker" "airflow.providers.google.cloud.hooks.mlengine" "airflow.providers.google.cloud.hooks.natural_language" "airflow.providers.google.cloud.hooks.os_login" "airflow.providers.google.cloud.hooks.pubsub" "airflow.providers.google.cloud.hooks.secret_manager" "airflow.providers.google.cloud.hooks.spanner" "airflow.providers.google.cloud.hooks.speech_to_text" "airflow.providers.google.cloud.hooks.stackdriver" "airflow.providers.google.cloud.hooks.tasks" "airflow.providers.google.cloud.hooks.text_to_speech" "airflow.providers.google.cloud.hooks.translate" "airflow.providers.google.cloud.hooks.vertex_ai.auto_ml" "airflow.providers.google.cloud.hooks.vertex_ai.batch_prediction_job" "airflow.providers.google.cloud.hooks.vertex_ai.custom_job" "airflow.providers.google.cloud.hooks.vertex_ai.dataset" "airflow.providers.google.cloud.hooks.vertex_ai.endpoint_service" "airflow.providers.google.cloud.hooks.vertex_ai.hyperparameter_tuning_job" "airflow.providers.google.cloud.hooks.vertex_ai.model_service" "airflow.providers.google.cloud.hooks.video_intelligence" "airflow.providers.google.cloud.hooks.vision" "airflow.providers.google.cloud.hooks.workflows" "airflow.providers.google.cloud.operators.automl" "airflow.providers.google.cloud.operators.bigquery" "airflow.providers.google.cloud.operators.bigquery_dts" "airflow.providers.google.cloud.operators.bigtable" "airflow.providers.google.cloud.operators.cloud_build" "airflow.providers.google.cloud.operators.cloud_composer" "airflow.providers.google.cloud.operators.cloud_memorystore" "airflow.providers.google.cloud.operators.cloud_sql" "airflow.providers.google.cloud.operators.cloud_storage_transfer_service" "airflow.providers.google.cloud.operators.compute" "airflow.providers.google.cloud.operators.datacatalog" "airflow.providers.google.cloud.operators.dataflow" "airflow.providers.google.cloud.operators.dataform" "airflow.providers.google.cloud.operators.datafusion" "airflow.providers.google.cloud.operators.dataplex" "airflow.providers.google.cloud.operators.dataprep" "airflow.providers.google.cloud.operators.dataproc" "airflow.providers.google.cloud.operators.dataproc_metastore" "airflow.providers.google.cloud.operators.datastore" "airflow.providers.google.cloud.operators.dlp" "airflow.providers.google.cloud.operators.functions" "airflow.providers.google.cloud.operators.gcs" "airflow.providers.google.cloud.operators.kubernetes_engine" "airflow.providers.google.cloud.operators.life_sciences" "airflow.providers.google.cloud.operators.looker" "airflow.providers.google.cloud.operators.mlengine" "airflow.providers.google.cloud.operators.natural_language" "airflow.providers.google.cloud.operators.pubsub" "airflow.providers.google.cloud.operators.spanner" "airflow.providers.google.cloud.operators.speech_to_text" "airflow.providers.google.cloud.operators.stackdriver" "airflow.providers.google.cloud.operators.tasks" "airflow.providers.google.cloud.operators.text_to_speech" "airflow.providers.google.cloud.operators.translate" "airflow.providers.google.cloud.operators.translate_speech" "airflow.providers.google.cloud.operators.translate_speech" "airflow.providers.google.cloud.operators.vertex_ai.auto_ml" "airflow.providers.google.cloud.operators.vertex_ai.batch_prediction_job" "airflow.providers.google.cloud.operators.vertex_ai.custom_job" "airflow.providers.google.cloud.operators.vertex_ai.dataset" "airflow.providers.google.cloud.operators.vertex_ai.endpoint_service" "airflow.providers.google.cloud.operators.vertex_ai.hyperparameter_tuning_job" "airflow.providers.google.cloud.operators.vertex_ai.model_service" "airflow.providers.google.cloud.operators.video_intelligence" "airflow.providers.google.cloud.operators.vision" "airflow.providers.google.cloud.operators.workflows" "airflow.providers.google.common.hooks.base_google" "airflow.providers.google.common.hooks.discovery_api" "airflow.providers.google.firebase.hooks.firestore" "airflow.providers.google.firebase.operators.firestore" "airflow.providers.google.leveldb.hooks.leveldb" "airflow.providers.google.leveldb.operators.leveldb" "airflow.providers.google.marketing_platform.hooks.analytics" "airflow.providers.google.marketing_platform.hooks.campaign_manager" "airflow.providers.google.marketing_platform.hooks.display_video" "airflow.providers.google.marketing_platform.hooks.search_ads" "airflow.providers.google.marketing_platform.operators.analytics" "airflow.providers.google.marketing_platform.operators.campaign_manager" "airflow.providers.google.marketing_platform.operators.display_video" "airflow.providers.google.marketing_platform.operators.search_ads" "airflow.providers.google.suite.hooks.calendar" "airflow.providers.google.suite.hooks.drive" "airflow.providers.google.suite.hooks.sheets" "airflow.providers.google.suite.operators.sheets" ];
+ };
+ grpc = {
+ deps = [ "google-auth" "google-auth-httplib2" "grpcio" ];
+ imports = [ "airflow.providers.grpc.hooks.grpc" "airflow.providers.grpc.operators.grpc" ];
+ };
+ hashicorp = {
+ deps = [ "apache-beam" "azure-batch" "azure-cosmos" "azure-datalake-store" "azure-identity" "azure-keyvault-secrets" "azure-mgmt-containerinstance" "azure-mgmt-datafactory" "azure-mgmt-datalake-store" "azure-mgmt-resource" "azure-servicebus" "azure-storage-blob" "azure-storage-common" "azure-storage-file" "azure-synapse-spark" "boto3" "cassandra-driver" "cryptography" "dnspython" "google-api-core" "google-api-python-client" "google-auth" "google-auth-httplib2" "google-cloud-automl" "google-cloud-bigquery-datatransfer" "google-cloud-bigtable" "google-cloud-container" "google-cloud-datacatalog" "google-cloud-dataproc" "google-cloud-dlp" "google-cloud-kms" "google-cloud-language" "google-cloud-logging" "google-cloud-monitoring" "google-cloud-pubsub" "google-cloud-redis" "google-cloud-secret-manager" "google-cloud-spanner" "google-cloud-speech" "google-cloud-storage" "google-cloud-tasks" "google-cloud-texttospeech" "google-cloud-translate" "google-cloud-videointelligence" "google-cloud-vision" "grpcio-gcp" "httpx" "hvac" "json-merge-patch" "jsonpath-ng" "kubernetes" "mysqlclient" "pandas" "paramiko" "proto-plus" "protobuf" "psycopg2" "pymongo" "pyopenssl" "pysftp" "simple-salesforce" "smbprotocol" "sshtunnel" "thrift" "vertica-python" ];
+ imports = [ "airflow.providers.hashicorp.hooks.vault" ];
+ };
+ http = {
+ deps = [ "requests" "requests-toolbelt" ];
+ imports = [ "airflow.providers.http.hooks.http" "airflow.providers.http.operators.http" ];
+ };
+ imap = {
+ deps = [ ];
+ imports = [ "airflow.providers.imap.hooks.imap" ];
+ };
+ influxdb = {
+ deps = [ "influxdb-client" "requests" ];
+ imports = [ "airflow.providers.influxdb.hooks.influxdb" "airflow.providers.influxdb.operators.influxdb" ];
+ };
+ jdbc = {
+ deps = [ "JayDeBeApi" ];
+ imports = [ "airflow.providers.jdbc.hooks.jdbc" "airflow.providers.jdbc.operators.jdbc" ];
+ };
+ jenkins = {
+ deps = [ "python-jenkins" ];
+ imports = [ "airflow.providers.jenkins.hooks.jenkins" "airflow.providers.jenkins.operators.jenkins_job_trigger" ];
+ };
+ jira = {
+ deps = [ "jira" ];
+ imports = [ "airflow.providers.jira.hooks.jira" "airflow.providers.jira.operators.jira" ];
+ };
+ microsoft_azure = {
+ deps = [ "apache-beam" "azure-batch" "azure-cosmos" "azure-datalake-store" "azure-identity" "azure-keyvault-secrets" "azure-mgmt-containerinstance" "azure-mgmt-datafactory" "azure-mgmt-datalake-store" "azure-mgmt-resource" "azure-servicebus" "azure-storage-blob" "azure-storage-common" "azure-storage-file" "azure-synapse-spark" "boto3" "cassandra-driver" "cryptography" "dnspython" "google-api-core" "google-api-python-client" "google-auth" "google-auth-httplib2" "google-cloud-automl" "google-cloud-bigquery-datatransfer" "google-cloud-bigtable" "google-cloud-container" "google-cloud-datacatalog" "google-cloud-dataproc" "google-cloud-dlp" "google-cloud-kms" "google-cloud-language" "google-cloud-logging" "google-cloud-monitoring" "google-cloud-pubsub" "google-cloud-redis" "google-cloud-secret-manager" "google-cloud-spanner" "google-cloud-speech" "google-cloud-storage" "google-cloud-tasks" "google-cloud-texttospeech" "google-cloud-translate" "google-cloud-videointelligence" "google-cloud-vision" "grpcio-gcp" "httpx" "json-merge-patch" "jsonpath-ng" "kubernetes" "mysqlclient" "pandas" "paramiko" "proto-plus" "protobuf" "psycopg2" "pymongo" "pyopenssl" "pysftp" "simple-salesforce" "smbprotocol" "sshtunnel" "thrift" "vertica-python" ];
+ imports = [ "airflow.providers.microsoft.azure.hooks.adx" "airflow.providers.microsoft.azure.hooks.asb" "airflow.providers.microsoft.azure.hooks.azure_batch" "airflow.providers.microsoft.azure.hooks.azure_container_instance" "airflow.providers.microsoft.azure.hooks.azure_container_registry" "airflow.providers.microsoft.azure.hooks.azure_container_volume" "airflow.providers.microsoft.azure.hooks.azure_cosmos" "airflow.providers.microsoft.azure.hooks.azure_data_factory" "airflow.providers.microsoft.azure.hooks.azure_data_lake" "airflow.providers.microsoft.azure.hooks.azure_fileshare" "airflow.providers.microsoft.azure.hooks.base_azure" "airflow.providers.microsoft.azure.hooks.batch" "airflow.providers.microsoft.azure.hooks.container_instance" "airflow.providers.microsoft.azure.hooks.container_registry" "airflow.providers.microsoft.azure.hooks.container_volume" "airflow.providers.microsoft.azure.hooks.cosmos" "airflow.providers.microsoft.azure.hooks.data_factory" "airflow.providers.microsoft.azure.hooks.data_lake" "airflow.providers.microsoft.azure.hooks.fileshare" "airflow.providers.microsoft.azure.hooks.synapse" "airflow.providers.microsoft.azure.hooks.wasb" "airflow.providers.microsoft.azure.operators.adls" "airflow.providers.microsoft.azure.operators.adls_delete" "airflow.providers.microsoft.azure.operators.adls_list" "airflow.providers.microsoft.azure.operators.adx" "airflow.providers.microsoft.azure.operators.asb" "airflow.providers.microsoft.azure.operators.azure_batch" "airflow.providers.microsoft.azure.operators.azure_container_instances" "airflow.providers.microsoft.azure.operators.azure_cosmos" "airflow.providers.microsoft.azure.operators.batch" "airflow.providers.microsoft.azure.operators.container_instances" "airflow.providers.microsoft.azure.operators.cosmos" "airflow.providers.microsoft.azure.operators.data_factory" "airflow.providers.microsoft.azure.operators.synapse" "airflow.providers.microsoft.azure.operators.wasb_delete_blob" ];
+ };
+ microsoft_mssql = {
+ deps = [ ];
+ imports = [ "airflow.providers.microsoft.mssql.hooks.mssql" "airflow.providers.microsoft.mssql.operators.mssql" ];
+ };
+ microsoft_psrp = {
+ deps = [ "pypsrp" ];
+ imports = [ "airflow.providers.microsoft.psrp.hooks.psrp" "airflow.providers.microsoft.psrp.operators.psrp" ];
+ };
+ microsoft_winrm = {
+ deps = [ "pywinrm" ];
+ imports = [ "airflow.providers.microsoft.winrm.hooks.winrm" "airflow.providers.microsoft.winrm.operators.winrm" ];
+ };
+ mongo = {
+ deps = [ "dnspython" "pymongo" ];
+ imports = [ "airflow.providers.mongo.hooks.mongo" ];
+ };
+ mysql = {
+ deps = [ "apache-beam" "azure-batch" "azure-cosmos" "azure-datalake-store" "azure-identity" "azure-keyvault-secrets" "azure-mgmt-containerinstance" "azure-mgmt-datafactory" "azure-mgmt-datalake-store" "azure-mgmt-resource" "azure-servicebus" "azure-storage-blob" "azure-storage-common" "azure-storage-file" "azure-synapse-spark" "boto3" "cassandra-driver" "cryptography" "dnspython" "google-api-core" "google-api-python-client" "google-auth" "google-auth-httplib2" "google-cloud-automl" "google-cloud-bigquery-datatransfer" "google-cloud-bigtable" "google-cloud-container" "google-cloud-datacatalog" "google-cloud-dataproc" "google-cloud-dlp" "google-cloud-kms" "google-cloud-language" "google-cloud-logging" "google-cloud-monitoring" "google-cloud-pubsub" "google-cloud-redis" "google-cloud-secret-manager" "google-cloud-spanner" "google-cloud-speech" "google-cloud-storage" "google-cloud-tasks" "google-cloud-texttospeech" "google-cloud-translate" "google-cloud-videointelligence" "google-cloud-vision" "grpcio-gcp" "httpx" "json-merge-patch" "jsonpath-ng" "kubernetes" "mysqlclient" "pandas" "paramiko" "proto-plus" "protobuf" "psycopg2" "pymongo" "pyopenssl" "pysftp" "simple-salesforce" "smbprotocol" "sshtunnel" "thrift" "vertica-python" ];
+ imports = [ "airflow.providers.mysql.hooks.mysql" "airflow.providers.mysql.operators.mysql" ];
+ };
+ neo4j = {
+ deps = [ "neo4j" ];
+ imports = [ "airflow.providers.neo4j.hooks.neo4j" "airflow.providers.neo4j.operators.neo4j" ];
+ };
+ odbc = {
+ deps = [ "pyodbc" ];
+ imports = [ "airflow.providers.odbc.hooks.odbc" ];
+ };
+ openfaas = {
+ deps = [ ];
+ imports = [ "airflow.providers.openfaas.hooks.openfaas" ];
+ };
+ opsgenie = {
+ deps = [ ];
+ imports = [ "airflow.providers.opsgenie.hooks.opsgenie" "airflow.providers.opsgenie.hooks.opsgenie_alert" "airflow.providers.opsgenie.operators.opsgenie" "airflow.providers.opsgenie.operators.opsgenie_alert" ];
+ };
+ oracle = {
+ deps = [ ];
+ imports = [ "airflow.providers.oracle.hooks.oracle" "airflow.providers.oracle.operators.oracle" ];
+ };
+ pagerduty = {
+ deps = [ ];
+ imports = [ "airflow.providers.pagerduty.hooks.pagerduty" "airflow.providers.pagerduty.hooks.pagerduty_events" ];
+ };
+ papermill = {
+ deps = [ ];
+ imports = [ "airflow.providers.papermill.operators.papermill" ];
+ };
+ plexus = {
+ deps = [ "arrow" ];
+ imports = [ "airflow.providers.plexus.hooks.plexus" "airflow.providers.plexus.operators.job" ];
+ };
+ postgres = {
+ deps = [ "apache-beam" "azure-batch" "azure-cosmos" "azure-datalake-store" "azure-identity" "azure-keyvault-secrets" "azure-mgmt-containerinstance" "azure-mgmt-datafactory" "azure-mgmt-datalake-store" "azure-mgmt-resource" "azure-servicebus" "azure-storage-blob" "azure-storage-common" "azure-storage-file" "azure-synapse-spark" "boto3" "cassandra-driver" "cryptography" "dnspython" "google-api-core" "google-api-python-client" "google-auth" "google-auth-httplib2" "google-cloud-automl" "google-cloud-bigquery-datatransfer" "google-cloud-bigtable" "google-cloud-container" "google-cloud-datacatalog" "google-cloud-dataproc" "google-cloud-dlp" "google-cloud-kms" "google-cloud-language" "google-cloud-logging" "google-cloud-monitoring" "google-cloud-pubsub" "google-cloud-redis" "google-cloud-secret-manager" "google-cloud-spanner" "google-cloud-speech" "google-cloud-storage" "google-cloud-tasks" "google-cloud-texttospeech" "google-cloud-translate" "google-cloud-videointelligence" "google-cloud-vision" "grpcio-gcp" "httpx" "json-merge-patch" "jsonpath-ng" "kubernetes" "mysqlclient" "pandas" "paramiko" "proto-plus" "protobuf" "psycopg2" "pymongo" "pyopenssl" "pysftp" "simple-salesforce" "smbprotocol" "sshtunnel" "thrift" "vertica-python" ];
+ imports = [ "airflow.providers.postgres.hooks.postgres" "airflow.providers.postgres.operators.postgres" ];
+ };
+ presto = {
+ deps = [ "apache-beam" "azure-batch" "azure-cosmos" "azure-datalake-store" "azure-identity" "azure-keyvault-secrets" "azure-mgmt-containerinstance" "azure-mgmt-datafactory" "azure-mgmt-datalake-store" "azure-mgmt-resource" "azure-servicebus" "azure-storage-blob" "azure-storage-common" "azure-storage-file" "azure-synapse-spark" "boto3" "cassandra-driver" "cryptography" "dnspython" "google-api-core" "google-api-python-client" "google-auth" "google-auth-httplib2" "google-cloud-automl" "google-cloud-bigquery-datatransfer" "google-cloud-bigtable" "google-cloud-container" "google-cloud-datacatalog" "google-cloud-dataproc" "google-cloud-dlp" "google-cloud-kms" "google-cloud-language" "google-cloud-logging" "google-cloud-monitoring" "google-cloud-pubsub" "google-cloud-redis" "google-cloud-secret-manager" "google-cloud-spanner" "google-cloud-speech" "google-cloud-storage" "google-cloud-tasks" "google-cloud-texttospeech" "google-cloud-translate" "google-cloud-videointelligence" "google-cloud-vision" "grpcio-gcp" "httpx" "json-merge-patch" "jsonpath-ng" "kubernetes" "mysqlclient" "pandas" "paramiko" "proto-plus" "protobuf" "psycopg2" "pymongo" "pyopenssl" "pysftp" "simple-salesforce" "smbprotocol" "sshtunnel" "thrift" "vertica-python" ];
+ imports = [ "airflow.providers.presto.hooks.presto" ];
+ };
+ qubole = {
+ deps = [ "qds_sdk" ];
+ imports = [ "airflow.providers.qubole.hooks.qubole" "airflow.providers.qubole.hooks.qubole_check" "airflow.providers.qubole.operators.qubole" "airflow.providers.qubole.operators.qubole_check" ];
+ };
+ redis = {
+ deps = [ "redis" ];
+ imports = [ "airflow.providers.redis.hooks.redis" "airflow.providers.redis.operators.redis_publish" ];
+ };
+ salesforce = {
+ deps = [ "pandas" "simple-salesforce" ];
+ imports = [ "airflow.providers.salesforce.hooks.salesforce" "airflow.providers.salesforce.operators.bulk" "airflow.providers.salesforce.operators.salesforce_apex_rest" ];
+ };
+ samba = {
+ deps = [ "smbprotocol" ];
+ imports = [ "airflow.providers.samba.hooks.samba" ];
+ };
+ segment = {
+ deps = [ ];
+ imports = [ "airflow.providers.segment.hooks.segment" "airflow.providers.segment.operators.segment_track_event" ];
+ };
+ sendgrid = {
+ deps = [ "sendgrid" ];
+ imports = [ ];
+ };
+ sftp = {
+ deps = [ "paramiko" "pysftp" "sshtunnel" ];
+ imports = [ "airflow.providers.sftp.hooks.sftp" "airflow.providers.sftp.operators.sftp" ];
+ };
+ singularity = {
+ deps = [ ];
+ imports = [ "airflow.providers.singularity.operators.singularity" ];
+ };
+ slack = {
+ deps = [ "requests" "requests-toolbelt" "slack-sdk" ];
+ imports = [ "airflow.providers.slack.hooks.slack" "airflow.providers.slack.hooks.slack_webhook" "airflow.providers.slack.operators.slack" "airflow.providers.slack.operators.slack_webhook" ];
+ };
+ snowflake = {
+ deps = [ "requests" "requests-toolbelt" "slack-sdk" "snowflake-connector-python" "snowflake-sqlalchemy" ];
+ imports = [ "airflow.providers.snowflake.hooks.snowflake" "airflow.providers.snowflake.operators.snowflake" ];
+ };
+ sqlite = {
+ deps = [ ];
+ imports = [ "airflow.providers.sqlite.hooks.sqlite" "airflow.providers.sqlite.operators.sqlite" ];
+ };
+ ssh = {
+ deps = [ "paramiko" "sshtunnel" ];
+ imports = [ "airflow.providers.ssh.hooks.ssh" "airflow.providers.ssh.operators.ssh" ];
+ };
+ tableau = {
+ deps = [ ];
+ imports = [ "airflow.providers.tableau.hooks.tableau" "airflow.providers.tableau.operators.tableau" "airflow.providers.tableau.operators.tableau_refresh_workbook" ];
+ };
+ tabular = {
+ deps = [ ];
+ imports = [ "airflow.providers.tabular.hooks.tabular" ];
+ };
+ telegram = {
+ deps = [ "python-telegram-bot" ];
+ imports = [ "airflow.providers.telegram.hooks.telegram" "airflow.providers.telegram.operators.telegram" ];
+ };
+ trino = {
+ deps = [ "apache-beam" "azure-batch" "azure-cosmos" "azure-datalake-store" "azure-identity" "azure-keyvault-secrets" "azure-mgmt-containerinstance" "azure-mgmt-datafactory" "azure-mgmt-datalake-store" "azure-mgmt-resource" "azure-servicebus" "azure-storage-blob" "azure-storage-common" "azure-storage-file" "azure-synapse-spark" "boto3" "cassandra-driver" "cryptography" "dnspython" "google-api-core" "google-api-python-client" "google-auth" "google-auth-httplib2" "google-cloud-automl" "google-cloud-bigquery-datatransfer" "google-cloud-bigtable" "google-cloud-container" "google-cloud-datacatalog" "google-cloud-dataproc" "google-cloud-dlp" "google-cloud-kms" "google-cloud-language" "google-cloud-logging" "google-cloud-monitoring" "google-cloud-pubsub" "google-cloud-redis" "google-cloud-secret-manager" "google-cloud-spanner" "google-cloud-speech" "google-cloud-storage" "google-cloud-tasks" "google-cloud-texttospeech" "google-cloud-translate" "google-cloud-videointelligence" "google-cloud-vision" "grpcio-gcp" "httpx" "json-merge-patch" "jsonpath-ng" "kubernetes" "mysqlclient" "pandas" "paramiko" "proto-plus" "protobuf" "psycopg2" "pymongo" "pyopenssl" "pysftp" "simple-salesforce" "smbprotocol" "sshtunnel" "thrift" "vertica-python" ];
+ imports = [ "airflow.providers.trino.hooks.trino" "airflow.providers.trino.operators.trino" ];
+ };
+ vertica = {
+ deps = [ "vertica-python" ];
+ imports = [ "airflow.providers.vertica.hooks.vertica" "airflow.providers.vertica.operators.vertica" ];
+ };
+ yandex = {
+ deps = [ ];
+ imports = [ "airflow.providers.yandex.hooks.yandex" "airflow.providers.yandex.hooks.yandexcloud_dataproc" "airflow.providers.yandex.operators.yandexcloud_dataproc" ];
+ };
+ zendesk = {
+ deps = [ ];
+ imports = [ "airflow.providers.zendesk.hooks.zendesk" ];
+ };
+}
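Presumably the `deps` and `imports` fields above feed per-provider derivations; here is a hypothetical sketch (the function arguments and attribute names are illustrative, not the actual nixpkgs wiring) of how the `imports` list could drive `pythonImportsCheck`:

```nix
# Hypothetical consumer of the providers set above; names are illustrative.
{ lib, python3, apache-airflow, providers }:

lib.mapAttrs
  (name: provider:
    apache-airflow.overridePythonAttrs (old: {
      pname = "apache-airflow-provider-${name}";
      # Resolve each extra dependency by its (lowercased) nixpkgs attribute name.
      propagatedBuildInputs = old.propagatedBuildInputs
        ++ map (d: python3.pkgs.${lib.toLower d}) provider.deps;
      # Smoke-test that every module the provider advertises is importable.
      pythonImportsCheck = provider.imports;
    }))
  providers
```

In reality the dependency-name-to-attribute mapping is messier (e.g. `qds_sdk` vs. the nixpkgs attribute name), so a manual override table would likely be needed on top of this.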
tm-drtina left a comment
All patches to the Airflow itself are merged, so they should be already included.
However I would like to switch the flask-appbuilder-wtf3 patch to the one that got merged (the first one was broken)
Also added some PRs/patches that are not yet merged and we can use instead of jailbreaks
# https://github.com/dpgaspar/Flask-AppBuilder/pull/1734
name = "flask-appbuilder-wtf3.patch";
url = "https://github.com/dpgaspar/Flask-AppBuilder/commit/bccb3d719cd3ceb872fe74a9ab304d74664fbf43.patch";
sha256 = "sha256-24mlS3HIs77wKOlwdHah5oks31OOmCBHmcafZT2ITOc=";
excludes = [
  "requirements.txt"
  "setup.py"
  "examples/employees/app/views.py"
];
I'm a little bit worried about this patch. The original PR was broken, so I've created another PR, which resolved the issues. Should we switch to the other patch? (In the next release the fix is included.)
PR: dpgaspar/Flask-AppBuilder#1904
The squashed commit: dpgaspar/Flask-AppBuilder@b010cde
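If we do switch, a sketch of what the replacement `fetchpatch` might look like, assuming the squashed commit b010cde and reusing the same excludes; the hash below is a placeholder (`lib.fakeSha256`), to be replaced with the real hash after a first failed build:

```nix
(fetchpatch {
  # Merged follow-up to dpgaspar/Flask-AppBuilder#1734; squashed commit b010cde.
  name = "flask-appbuilder-wtf3-merged.patch";
  url = "https://github.com/dpgaspar/Flask-AppBuilder/commit/b010cde.patch";
  sha256 = lib.fakeSha256;  # placeholder: fill in the real hash on first build
  excludes = [
    "requirements.txt"
    "setup.py"
    "examples/employees/app/views.py"
  ];
})
```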
I've noticed new versions are out, so the patch can be dropped entirely if we update
- Airflow to 2.4.2 apache/airflow@2.4.1...2.4.2#diff-fa602a8a75dc9dcc92261bac5f533c2a85e34fcceaff63b3a3a81d9acde2fc52L111
- Flask-AppBuilder to 4.1.4 dpgaspar/Flask-AppBuilder@v4.1.3...v4.1.4#diff-60f61ab7a8d1910d86d9fda2261620314edcae5894d5aaa236b821c7256badd7
But these changes should probably land in master first.
That sounds like a plan. If you haven't already done the team programming session, bumping to Airflow 2.4.2 and applying the appbuilder patches would be an awesome contribution (both for us and upstream).
--replace "colorlog>=4.0.2, <5.0" "colorlog" \
--replace "pathspec~=0.9.0" "pathspec"
Notes:
The colorlog change shouldn't (hopefully) affect us.
Not sure why the jailbreak of pathspec is here; we already include a version that satisfies the pin.
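One way to double-check claims like these is to test the packaged version against the upstream pin using PEP 440 specifiers via the `packaging` library (the versions below are illustrative, not the exact ones in nixpkgs):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version


def needs_jailbreak(spec: str, available: str) -> bool:
    """True if the packaged version falls outside the upstream version pin."""
    return Version(available) not in SpecifierSet(spec)


# pathspec 0.9.x already satisfies "~=0.9.0", so that jailbreak is redundant
print(needs_jailbreak("~=0.9.0", "0.9.0"))        # False
# colorlog 6.x violates ">=4.0.2, <5.0", so that pin does need relaxing
print(needs_jailbreak(">=4.0.2, <5.0", "6.7.0"))  # True
```

Running this before adding a `--replace` line would tell us which jailbreaks are actually load-bearing.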
This change also seems like a good one to apply upstream. You have a really encyclopedic knowledge of airflow by now. I'm sure upstream would love to have your input!
--replace "Flask-WTF>=0.14.2, <1.0.0" "Flask-WTF" \
--replace "WTForms<3.0.0" "WTForms" \
--replace "marshmallow-sqlalchemy>=0.22.0, <0.27.0" "marshmallow-sqlalchemy" \
--replace "prison>=0.2.1, <1.0.0" "prison"
Nit: If we want to avoid jailbreaking too many packages, here are some relevant patches/PRs:
apispec: dpgaspar/Flask-AppBuilder#1903
Flask-WTF & WTForms: should be covered by the patch above (or we can include the "correct" version ranges from that commit)
marshmallow-sqlalchemy: dpgaspar/Flask-AppBuilder#1905
prison: I had a patch in my first attempt at upgrading Airflow: https://github.com/tm-drtina/nixpkgs/blob/8fcdb40f383c71894c05dfc5ab1a815d00589c97/pkgs/development/python-modules/prison/default.nix
I would love if those got merged upstream but I can also see applying patches in the interim.
(cherry picked from commit a02ab4d) Reason: pyjwt-2.4.0 is not affected by CVE-2022-29217.
needed for NixOps
by disabling broken test
As far as I can tell, this can cause compile failures when building the vendored abseil_cpp if choosing a target arch that does not support SSE. This might be possible to determine programmatically, but it is more flexible to let the user decide.
For which no reliable default target architecture is provided upstream.
Co-authored-by: Sandro <sandro.jaeckel@gmail.com>
This unbreaks the NixOS module and tests that rely on the value in order to map the desired kafka version to a supported jre
As NixOS#157879 points out, this attribute appears unused. Fixes NixOS#157879
As recommended in the discussion at NixOS#157883
@tm-drtina I think your suggestions are good, but I want to keep this PR restricted to backports from upstream that aren't in release-21.11. Do you think this is ok to merge?
tm-drtina left a comment
I think it is ok to merge (regarding Airflow related things). Some of my patches are slowly getting into respective packages, but will take a while before it propagates through the whole chain :)
The patches are nice to have, but not blockers, so I don't want to block this
To get: