feat: more work on Python DSL
alexec committed Jun 8, 2021
1 parent e306739 commit 8170a88
Showing 27 changed files with 46 additions and 46 deletions.
2 changes: 1 addition & 1 deletion Makefile
@@ -155,7 +155,7 @@ config/stan/single-server-stan.yml:
examples: $(shell find examples -name '*-pipeline.yaml' | sort) docs/EXAMPLES.md

examples/%-pipeline.yaml: examples/%-pipeline.py dsls/python/__init__.py
- PYTHONPATH=. python3 examples/$*-pipeline.py
+ cd examples && PYTHONPATH=.. python3 $*-pipeline.py

.PHONY: test-examples
test-examples: examples
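Each `examples/%-pipeline.yaml` manifest is generated by running the matching `examples/%-pipeline.py` DSL script, so `make examples` rebuilds any manifest whose script (or the DSL itself) changed; the new recipe runs the script from inside `examples/` so the YAML lands next to it. Below is a minimal sketch of such a script, assuming the `pipeline(...).describe(...)` pattern visible in the hunks further down; the terminal `.save()` call is a hypothetical placeholder, since the end of the fluent chain is outside the visible diff.

```python
# Hypothetical sketch of an examples/*-pipeline.py DSL script.
# Only `pipeline` and `.describe(...)` are visible in this diff;
# `.save()` is an assumed stand-in for whatever call actually
# emits the YAML at the end of the fluent chain.
from dsls.python import pipeline

if __name__ == '__main__':
    (pipeline("101-hello")
     .describe("""This is the hello world of pipelines.""")
     .save())  # assumed to write 101-hello-pipeline.yaml beside the script
```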
40 changes: 20 additions & 20 deletions docs/EXAMPLES.md
@@ -10,7 +10,7 @@ It uses a cron schedule as a source and then just cat the message to a log
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/101-hello-pipeline.yaml
```

- ### [two-node](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/101-two-node-pipeline.yaml)
+ ### [101-two-node](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/101-two-node-pipeline.yaml)

This example shows a pipeline with two nodes.

@@ -20,7 +20,7 @@ While they read from Kafka, they are connected by a NATS Streaming subject.
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/101-two-node-pipeline.yaml
```

- ### [two-node](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/102-filter-pipeline.yaml)
+ ### [102-filter](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/102-filter-pipeline.yaml)

This is an example of built-in filtering.

@@ -34,15 +34,15 @@ They have a single variable, `msg`, which is a byte array.
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/102-filter-pipeline.yaml
```
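As a hedged sketch of how the filter above might be declared through the Python DSL this commit works on: the `.filter(...)` method name and the `string(...)` helper are assumptions, while the `filter:` step kind and the boolean-expression-over-`msg` contract come from the docs and manifest hunks below.

```python
# Hypothetical sketch: declaring a filter step via the Python DSL.
# The .filter() method name and string() helper are assumptions; the
# expression must evaluate to a boolean over `msg` (a byte array).
from dsls.python import pipeline

if __name__ == '__main__':
    (pipeline("102-filter")
     .describe("""This is an example of built-in filtering.""")
     .filter("string(msg) contains 'good'"))  # keep only matching messages
```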

- ### [flatten-expand](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/102-flatten-expand-pipeline.yaml)
+ ### [102-flatten-expand](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/102-flatten-expand-pipeline.yaml)

This is an example of built-in flattening and expanding.

```
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/102-flatten-expand-pipeline.yaml
```

- ### [map](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/102-map-pipeline.yaml)
+ ### [102-map](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/102-map-pipeline.yaml)

This is an example of built-in mapping.

@@ -56,7 +56,7 @@ They have a single variable, `msg`, which is a byte array.
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/102-map-pipeline.yaml
```
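Analogously, a map step transforms `msg` instead of dropping it. In the sketch below the `.map(...)` method name and the `bytes()`/`string()` built-ins are assumptions; the `map:` step kind itself appears in the manifest diff further down.

```python
# Hypothetical sketch: declaring a map step via the Python DSL.
# The .map() method name and bytes()/string() built-ins are assumptions.
from dsls.python import pipeline

if __name__ == '__main__':
    (pipeline("102-map")
     .describe("""This is an example of built-in mapping.""")
     .map("bytes('hello ' + string(msg))"))  # prepend a prefix to each message
```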

- ### [autoscaling](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/103-autoscaling-pipeline.yaml)
+ ### [103-autoscaling](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/103-autoscaling-pipeline.yaml)

This is an example of having multiple replicas for a single step.

@@ -92,7 +92,7 @@ of replicas re-calculated.
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/103-autoscaling-pipeline.yaml
```

- ### [scaling](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/103-scaling-pipeline.yaml)
+ ### [103-scaling](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/103-scaling-pipeline.yaml)

This is an example of having multiple replicas for a single step.

@@ -106,7 +106,7 @@ kubectl scale step/scaling-main --replicas 3
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/103-scaling-pipeline.yaml
```

- ### [go1-16](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/104-go1-16-pipeline.yaml)
+ ### [104-go1-16](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/104-go1-16-pipeline.yaml)

This example is of the Go 1.16 handler.

@@ -116,7 +116,7 @@ This example of Go 1.16 handler.
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/104-go1-16-pipeline.yaml
```

- ### [java-16](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/104-java16-pipeline.yaml)
+ ### [104-java16](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/104-java16-pipeline.yaml)

This example is of the Java 16 handler.

@@ -126,7 +126,7 @@ This example is of the Java 16 handler.
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/104-java16-pipeline.yaml
```

- ### [python3-9](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/104-python3-9-pipeline.yaml)
+ ### [104-python3-9](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/104-python3-9-pipeline.yaml)

This example is of the Python 3.9 handler.

@@ -136,7 +136,7 @@ This example is of the Python 3.9 handler.
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/104-python3-9-pipeline.yaml
```

- ### [git](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/106-git-pipeline.yaml)
+ ### [106-git](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/106-git-pipeline.yaml)

This is an example of a pipeline using Git.

@@ -149,7 +149,7 @@ your code when the step starts.
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/106-git-pipeline.yaml
```

- ### [completion](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/107-completion-pipeline.yaml)
+ ### [107-completion](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/107-completion-pipeline.yaml)

This example shows a pipeline running to completion.

@@ -182,7 +182,7 @@ This example showcases container options.
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/108-container-pipeline.yaml
```

- ### [fifos](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/108-fifos-pipeline.yaml)
+ ### [108-fifos](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/108-fifos-pipeline.yaml)

This example uses named pipes to send and receive messages.

@@ -197,7 +197,7 @@ You MUST escape new lines.
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/108-fifos-pipeline.yaml
```
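The contract is: read one message per line from the input pipe, write one message per line to the output pipe, escaping newlines. A minimal sketch in plain Python follows; the `out` path appears verbatim in the 108-fifos manifest hunk below, while the matching `in` path is an assumption mirroring it.

```python
# Minimal sketch of a main container speaking the FIFO protocol above.
# OUT_FIFO is taken from the 108-fifos manifest; IN_FIFO is an assumed
# mirror of it.
IN_FIFO = "/var/run/argo-dataflow/in"
OUT_FIFO = "/var/run/argo-dataflow/out"

with open(IN_FIFO) as infile, open(OUT_FIFO, "w") as outfile:
    for line in infile:                 # one incoming message per line
        msg = line.rstrip("\n")
        reply = "hello " + msg
        # each line written MUST be a single message, so escape newlines
        outfile.write(reply.replace("\n", "\\n") + "\n")
        outfile.flush()
```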

- ### [group](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/109-group-pipeline.yaml)
+ ### [109-group](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/109-group-pipeline.yaml)

This is an example of built-in grouping.

@@ -223,15 +223,15 @@ Storage can either be:
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/109-group-pipeline.yaml
```

- ### [vet](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/201-vetinary-pipeline.yaml)
+ ### [201-vetinary](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/201-vetinary-pipeline.yaml)

This pipeline processes pets (cats and dogs).

```
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/201-vetinary-pipeline.yaml
```

- ### [word-count](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/201-word-count-pipeline.yaml)
+ ### [201-word-count](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/201-word-count-pipeline.yaml)

This pipeline counts the number of words in a document, not the count of each word as you might expect.

@@ -241,7 +241,7 @@ This pipeline count the number of words in a document, not the number of count o
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/201-word-count-pipeline.yaml
```

- ### [cron-log](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/301-cron-log-pipeline.yaml)
+ ### [301-cron-log](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/301-cron-log-pipeline.yaml)

This example uses a cron source and a log sink.

@@ -274,7 +274,7 @@ This example creates errors randomly
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/301-erroring-pipeline.yaml
```

- ### [http](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/301-http-pipeline.yaml)
+ ### [301-http](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/301-http-pipeline.yaml)

This example uses HTTP sources and sinks.

@@ -288,23 +288,23 @@ messages between steps.
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/301-http-pipeline.yaml
```

- ### [kafka](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/301-kafka-pipeline.yaml)
+ ### [301-kafka](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/301-kafka-pipeline.yaml)

This example shows reading from and writing to a Kafka topic.

```
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/301-kafka-pipeline.yaml
```

- ### [parallel](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/301-parallel-pipeline.yaml)
+ ### [301-parallel](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/301-parallel-pipeline.yaml)

This example uses parallel processing to double the amount of data it processes.

```
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/301-parallel-pipeline.yaml
```

- ### [stan](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/301-stan-pipeline.yaml)
+ ### [301-stan](https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/301-stan-pipeline.yaml)

This example shows reading from and writing to a STAN subject.

2 changes: 1 addition & 1 deletion examples/101-two-node-pipeline.yaml
@@ -6,7 +6,7 @@ metadata:
This example shows a pipeline with two nodes.
While they read from Kafka, they are connected by a NATS Streaming subject.
- name: two-node
+ name: 101-two-node
spec:
steps:
- cat: {}
2 changes: 1 addition & 1 deletion examples/102-filter-pipeline.py
@@ -1,7 +1,7 @@
from dsls.python import kafka, pipeline

if __name__ == '__main__':
(pipeline("102-two-node")
(pipeline("102-filter")
.describe("""This is an example of built-in filtering.
Filters are written using expression syntax and must return a boolean.
2 changes: 1 addition & 1 deletion examples/102-filter-pipeline.yaml
@@ -10,7 +10,7 @@ metadata:
They have a single variable, `msg`, which is a byte array.
[Learn about expressions](../docs/EXPRESSIONS.md)
- name: two-node
+ name: 102-filter
spec:
steps:
- filter: |-
2 changes: 1 addition & 1 deletion examples/102-flatten-expand-pipeline.yaml
@@ -4,7 +4,7 @@ metadata:
annotations:
dataflow.argoproj.io/description: This is an example of built-in flattening and
expanding.
- name: flatten-expand
+ name: 102-flatten-expand
spec:
steps:
- map: |-
2 changes: 1 addition & 1 deletion examples/102-map-pipeline.yaml
@@ -10,7 +10,7 @@ metadata:
They have a single variable, `msg`, which is a byte array.
[Learn about expressions](../docs/EXPRESSIONS.md)
- name: map
+ name: 102-map
spec:
steps:
- map: |-
2 changes: 1 addition & 1 deletion examples/103-autoscaling-pipeline.yaml
@@ -32,7 +32,7 @@ metadata:
You can scale to zero by setting `minReplicas: 0`. The number of replicas will start at zero, and periodically be scaled
to 1 so it can "peek" at the message queue. The number of pending messages is measured and the target number
of replicas re-calculated.
- name: autoscaling
+ name: 103-autoscaling
spec:
steps:
- cat: {}
2 changes: 1 addition & 1 deletion examples/103-scaling-pipeline.py
@@ -1,7 +1,7 @@
from dsls.python import pipeline, kafka

if __name__ == '__main__':
(pipeline("scaling")
(pipeline("103-scaling")
.describe(""" This is an example of having multiple replicas for a single step.
Steps can be manually scaled using `kubectl`:
2 changes: 1 addition & 1 deletion examples/103-scaling-pipeline.yaml
@@ -10,7 +10,7 @@ metadata:
```
kubectl scale step/scaling-main --replicas 3
```
- name: scaling
+ name: 103-scaling
spec:
steps:
- cat: {}
2 changes: 1 addition & 1 deletion examples/104-go1-16-pipeline.yaml
@@ -6,7 +6,7 @@ metadata:
This example is of the Go 1.16 handler.
[Learn about handlers](../docs/HANDLERS.md)
- name: go1-16
+ name: 104-go1-16
spec:
steps:
- handler:
2 changes: 1 addition & 1 deletion examples/104-java16-pipeline.py
@@ -6,7 +6,7 @@ def handler(msg):


if __name__ == '__main__':
(pipeline("104-java-16")
(pipeline("104-java16")
.describe("""This example is of the Java 16 handler.
[Learn about handlers](../docs/HANDLERS.md)""")
2 changes: 1 addition & 1 deletion examples/104-java16-pipeline.yaml
@@ -6,7 +6,7 @@ metadata:
This example is of the Java 16 handler.
[Learn about handlers](../docs/HANDLERS.md)
- name: java-16
+ name: 104-java16
spec:
steps:
- handler:
2 changes: 1 addition & 1 deletion examples/104-python3-9-pipeline.yaml
@@ -7,7 +7,7 @@ metadata:
[Learn about handlers](../docs/HANDLERS.md)
dataflow.argoproj.io/timeout: 2m
- name: python3-9
+ name: 104-python3-9
spec:
steps:
- handler:
2 changes: 1 addition & 1 deletion examples/106-git-pipeline.yaml
@@ -9,7 +9,7 @@ metadata:
your code when the step starts.
[Learn about Git steps](../docs/GIT.md)
- name: git
+ name: 106-git
spec:
steps:
- git:
2 changes: 1 addition & 1 deletion examples/107-completion-pipeline.yaml
@@ -12,7 +12,7 @@ metadata:
* Every step exits successfully (i.e. with exit code 0).
* One step exits successfully, and is marked with `terminator: true`. When this happens, all other steps are killed.
dataflow.argoproj.io/wait-for: Completed
- name: completion
+ name: 107-completion
spec:
steps:
- container:
2 changes: 1 addition & 1 deletion examples/108-fifos-pipeline.py
@@ -1,7 +1,7 @@
from dsls.python import pipeline, kafka

if __name__ == '__main__':
(pipeline("fifos")
(pipeline("108-fifos")
.describe("""This example use named pipe to send and receive messages.
Two named pipes are made available:
2 changes: 1 addition & 1 deletion examples/108-fifos-pipeline.yaml
@@ -11,7 +11,7 @@ metadata:
* The container can write to `/var/run/argo-dataflow/out`. Each line MUST be a single message.
You MUST escape new lines.
- name: fifos
+ name: 108-fifos
spec:
steps:
- container:
2 changes: 1 addition & 1 deletion examples/109-group-pipeline.yaml
@@ -22,7 +22,7 @@ metadata:
* An ephemeral volume - you don't mind losing some or all messages (e.g. development or pre-production).
* A persistent volume - you want to be able to recover (e.g. production).
- name: group
+ name: 109-group
spec:
steps:
- group:
2 changes: 1 addition & 1 deletion examples/201-vetinary-pipeline.py
@@ -1,7 +1,7 @@
from dsls.python import pipeline, kafka, stan

if __name__ == '__main__':
(pipeline("201-vet")
(pipeline("201-vetinary")
.describe("""This pipeline processes pets (cats and dogs).""")
.annotate("dataflow.argoproj.io/test", "false")
.annotate("dataflow.argoproj.io/needs", "pets-configmap.yaml")
2 changes: 1 addition & 1 deletion examples/201-vetinary-pipeline.yaml
@@ -5,7 +5,7 @@ metadata:
dataflow.argoproj.io/description: This pipeline processes pets (cats and dogs).
dataflow.argoproj.io/needs: pets-configmap.yaml
dataflow.argoproj.io/test: 'false'
- name: vet
+ name: 201-vetinary
spec:
steps:
- container:
2 changes: 1 addition & 1 deletion examples/201-word-count-pipeline.yaml
@@ -8,7 +8,7 @@ metadata:
It also shows an example of a pipeline terminating based on a single step's status.
dataflow.argoproj.io/needs: word-count-input-configmap.yaml
dataflow.argoproj.io/test: 'false'
- name: word-count
+ name: 201-word-count
spec:
steps:
- container:
2 changes: 1 addition & 1 deletion examples/301-cron-log-pipeline.yaml
@@ -19,7 +19,7 @@ metadata:
## Log
This logs the message.
- name: cron-log
+ name: 301-cron-log
spec:
steps:
- cat: {}
2 changes: 1 addition & 1 deletion examples/301-http-pipeline.yaml
@@ -8,7 +8,7 @@ metadata:
\ *unreliable* because it is possible for messages to not get delivered when\
\ the receiving service is down.\n"
dataflow.argoproj.io/test: 'true'
- name: http
+ name: 301-http
spec:
steps:
- cat: {}
2 changes: 1 addition & 1 deletion examples/301-kafka-pipeline.yaml
@@ -5,7 +5,7 @@ metadata:
dataflow.argoproj.io/description: This example shows reading from and writing to a
Kafka topic
dataflow.argoproj.io/test: 'true'
- name: kafka
+ name: 301-kafka
spec:
steps:
- cat: {}
