Hera makes Python code easy to orchestrate on Argo Workflows through native Python integrations. It lets you construct and submit your Workflows entirely in Python.
See the Quick Start guide to start using Hera to orchestrate your Argo Workflows!
> The Argo was constructed by the shipwright Argus, and its crew were specially protected by the goddess Hera.
```python
from hera.workflows import Steps, Workflow, script


@script()
def echo(message: str):
    print(message)


with Workflow(
    generate_name="single-script-",
    entrypoint="steps",
) as w:
    with Steps(name="steps") as s:
        echo(name="A", arguments={"message": "I'm a step"})
        with s.parallel():
            echo(name="B", arguments={"message": "We're steps"})
            echo(name="C", arguments={"message": "in parallel!"})
        echo(name="D", arguments={"message": "I'm another step!"})

w.create()
```
```python
from hera.workflows import DAG, Workflow, script


@script()
def echo(message: str):
    print(message)


with Workflow(
    generate_name="dag-diamond-",
    entrypoint="diamond",
) as w:
    with DAG(name="diamond"):
        A = echo(name="A", arguments={"message": "A"})
        B = echo(name="B", arguments={"message": "B"})
        C = echo(name="C", arguments={"message": "C"})
        D = echo(name="D", arguments={"message": "D"})
        A >> [B, C] >> D

w.create()
```
See the examples for a collection of Argo Workflows constructed and submitted via Hera!
Hera requires an Argo server to be deployed to a Kubernetes cluster. Currently, Hera assumes that the Argo server sits behind an authentication layer that can authenticate workflow submission requests by using the Bearer token on the request. To learn how to deploy Argo to your own Kubernetes cluster, you can follow the Argo Workflows guide!
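For example, if you already hold a Bearer token for your Argo server, a minimal sketch of pointing Hera at it looks like the following; the host URL and the `ARGO_TOKEN` environment variable are placeholders for this illustration, not values Hera defines:

```python
import os

from hera.shared import global_config

# A minimal sketch, assuming you already have a valid Bearer token for your
# Argo server. The host URL and the ARGO_TOKEN variable are placeholders.
global_config.host = "https://my-argo-server.example.com:2746"
global_config.token = os.environ["ARGO_TOKEN"]  # sent as the Bearer token on requests
```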
Another option for workflow submission without the authentication layer is to use port forwarding to your Argo server deployment and submit workflows to `localhost:2746` (2746 is the default port, but you are free to change it). Please refer to the Argo Workflows documentation for the port-forward command.
Note: Since tokens are no longer automatically created for ServiceAccounts, and Argo uses Bearer tokens in their place, it is necessary to use `--auth=server` and/or `--auth=client` when setting up Argo Workflows on Kubernetes v1.24+ in order for Hera to communicate with the Argo Server.
There are a few ways to authenticate in Hera - read more in the authentication walkthrough. For now, with the `argo` CLI tool installed, this example will get you up and running:
```python
from hera.workflows import Workflow, Container
from hera.shared import global_config
from hera.auth import ArgoCLITokenGenerator

global_config.host = "http://localhost:2746"
global_config.token = ArgoCLITokenGenerator

with Workflow(generate_name="local-test-", entrypoint="c") as w:
    Container(name="c", image="docker/whalesay", command=["cowsay", "hello"])

w.create()
```
| Source | Command |
|---|---|
| PyPI | `pip install hera` |
| GitHub repo | `python -m pip install git+https://github.com/argoproj-labs/hera --ignore-installed` |
Note: Hera went through a name change, from `hera-workflows` to `hera`. This is reflected in the published Python package. If you'd like to install versions prior to `5.0.0`, you should do `pip install hera-workflows<5`. Hera currently publishes releases to both `hera` and `hera-workflows` for backwards compatibility purposes.
- Install via `hera[yaml]`.
- PyYAML is required for the `yaml` output format, which is accessible via `hera.workflows.Workflow.to_yaml(*args, **kwargs)`. This enables GitOps practices and easier debugging (see the sketch after this list).
- Install via `hera[cli]`. The `[cli]` option installs the extra dependency Cappa, required for the CLI.
- The CLI aims to enable GitOps practices, easier debugging, and a more seamless experience with Argo Workflows.
- The CLI is an experimental feature and subject to change! At the moment it only supports generating YAML files from Workflows via `hera generate yaml`. See `hera generate yaml --help` for more information.
- Install via `hera[experimental]`. The `[experimental]` option adds dependencies required for experimental features that have not yet graduated into stable features.
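As a small, hedged illustration of the `yaml` extra, the sketch below builds a Workflow (reusing the container from the authentication example) and prints its manifest with `to_yaml()`; the `generate_name` is a placeholder made up for this example.

```python
from hera.workflows import Container, Workflow

# A minimal sketch, assuming the hera[yaml] extra (PyYAML) is installed.
# The generate_name below is a placeholder for this example.
with Workflow(generate_name="yaml-demo-", entrypoint="c") as w:
    Container(name="c", image="docker/whalesay", command=["cowsay", "hello"])

# Render the Workflow manifest as YAML instead of submitting it, e.g. to
# commit it to a Git repository (GitOps) or to inspect it while debugging.
print(w.to_yaml())
```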
- KubeCon/ArgoCon EU 2024 - Orchestrating Python Functions Natively in Argo Using Hera
- CNCF TAG App-Delivery @ KubeCon NA 2023 - Automating the Deployment of Data Workloads to Kubernetes with ArgoCD, Argo Workflows, and Hera
- KubeCon/ArgoCon NA 2023 - How to Train an LLM with Argo Workflows and Hera
- KubeCon/ArgoCon EU 2023 - Scaling gene therapy with Argo Workflows and Hera
- DoKC Town Hall #2 - Unsticking ourselves from Glue - Migrating PayIt's Data Pipelines to Argo Workflows and Hera
- Argo Workflows and Events Community Meeting 15 June 2022 - Hera project update
- Argo Workflows and Events Community Meeting 20 Oct 2021 - Hera introductory presentation
- How To Get the Most out of Hera for Data Science
- Data Validation with Great Expectations and Argo Workflows
- Hera introduction and motivation
- Dyno is scaling gene therapy research with cloud-native tools like Argo Workflows and Hera
See the contributing guide!