This example features a very busy blogging platform, with thousands of messages showing up on your feed.
There are two separate applications (microservices) integrating over a Kafka topic. The producer generates thousands of "posts" and publishes them to the topic. The consumer subscribes to this topic and displays each post on the standard output.
The consumer has throttling middleware enabled, so you have a chance to actually read the posts.
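For reference, the consumer side can be sketched as a Watermill router with throttle middleware in front of a handler that prints each post. This is a minimal, hypothetical sketch rather than the example's actual code: the broker address, topic name, consumer group, and the watermill-kafka version in the import path are assumptions.

```go
package main

import (
	"context"
	"fmt"
	"time"

	"github.com/ThreeDotsLabs/watermill"
	"github.com/ThreeDotsLabs/watermill-kafka/v2/pkg/kafka"
	"github.com/ThreeDotsLabs/watermill/message"
	"github.com/ThreeDotsLabs/watermill/message/router/middleware"
)

func main() {
	logger := watermill.NewStdLogger(false, false)

	// Kafka subscriber reading the posts topic from the broker started by
	// docker-compose (broker address, topic, and group names are assumptions).
	subscriber, err := kafka.NewSubscriber(
		kafka.SubscriberConfig{
			Brokers:       []string{"kafka:9092"},
			Unmarshaler:   kafka.DefaultMarshaler{},
			ConsumerGroup: "posts_reader",
		},
		logger,
	)
	if err != nil {
		panic(err)
	}

	router, err := message.NewRouter(message.RouterConfig{}, logger)
	if err != nil {
		panic(err)
	}

	// Throttle middleware: at most one message per second reaches the handler,
	// which keeps the feed readable.
	router.AddMiddleware(middleware.NewThrottle(1, time.Second).Middleware)

	// Print every post to the standard output; no message is published back.
	router.AddNoPublisherHandler(
		"print_posts",
		"posts_published",
		subscriber,
		func(msg *message.Message) error {
			fmt.Printf("new post: %s\n", string(msg.Payload))
			return nil
		},
	)

	if err := router.Run(context.Background()); err != nil {
		panic(err)
	}
}
```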
To understand the background and internals, see the Getting Started guide.
To run this example you will need Docker and docker-compose installed. See the installation guide.

```bash
docker-compose up
```
You should see the live feed of posts on the standard output.
- Peek into the posts counter published on the `posts_count` topic (a Go sketch of doing the same programmatically follows this list):

  ```bash
  docker-compose exec consumer mill kafka consume -b kafka:9092 -t posts_count
  ```
- Add persistent storage for incoming posts in the consumer service, instead of displaying them. Consider using the SQL Publisher (see the sketch below).
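If you prefer to peek at the counter programmatically instead of through the mill CLI, a bare Kafka subscriber is enough. A minimal sketch, reusing the assumed broker address and import paths from the earlier snippet; the consumer group name is made up.

```go
package main

import (
	"context"
	"fmt"

	"github.com/ThreeDotsLabs/watermill"
	"github.com/ThreeDotsLabs/watermill-kafka/v2/pkg/kafka"
)

func main() {
	logger := watermill.NewStdLogger(false, false)

	subscriber, err := kafka.NewSubscriber(
		kafka.SubscriberConfig{
			Brokers:       []string{"kafka:9092"},
			Unmarshaler:   kafka.DefaultMarshaler{},
			ConsumerGroup: "counter_peek", // assumed group name
		},
		logger,
	)
	if err != nil {
		panic(err)
	}

	messages, err := subscriber.Subscribe(context.Background(), "posts_count")
	if err != nil {
		panic(err)
	}

	// Print each counter value as it arrives; messages must be acked so the
	// subscriber keeps delivering.
	for msg := range messages {
		fmt.Printf("posts count: %s\n", string(msg.Payload))
		msg.Ack()
	}
}
```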
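For the persistent-storage suggestion, one possible shape is a router handler that forwards each incoming post to a watermill-sql Publisher, which stores published messages as rows in a database table. This is only a sketch under assumptions: the MySQL DSN, topic and table names, consumer group, and module versions in the import paths are placeholders, and the exact `PublisherConfig` fields can differ between watermill-sql versions.

```go
package main

import (
	"context"
	stdSQL "database/sql"

	_ "github.com/go-sql-driver/mysql"

	"github.com/ThreeDotsLabs/watermill"
	"github.com/ThreeDotsLabs/watermill-kafka/v2/pkg/kafka"
	wmsql "github.com/ThreeDotsLabs/watermill-sql/v2/pkg/sql"
	"github.com/ThreeDotsLabs/watermill/message"
)

func main() {
	logger := watermill.NewStdLogger(false, false)

	// Kafka subscriber for the posts topic (broker address and topic name are assumptions).
	subscriber, err := kafka.NewSubscriber(
		kafka.SubscriberConfig{
			Brokers:       []string{"kafka:9092"},
			Unmarshaler:   kafka.DefaultMarshaler{},
			ConsumerGroup: "posts_archive",
		},
		logger,
	)
	if err != nil {
		panic(err)
	}

	// MySQL connection for the SQL Publisher; the DSN is a placeholder and would
	// normally come from the docker-compose environment.
	db, err := stdSQL.Open("mysql", "root:secret@tcp(mysql:3306)/watermill")
	if err != nil {
		panic(err)
	}

	// The SQL Publisher inserts every published message as a row; the publish
	// "topic" maps to a table via the schema adapter.
	publisher, err := wmsql.NewPublisher(
		db,
		wmsql.PublisherConfig{
			SchemaAdapter:        wmsql.DefaultMySQLSchema{},
			AutoInitializeSchema: true,
		},
		logger,
	)
	if err != nil {
		panic(err)
	}

	router, err := message.NewRouter(message.RouterConfig{}, logger)
	if err != nil {
		panic(err)
	}

	// Forward each incoming post unchanged, so it is stored in SQL instead of
	// being printed.
	router.AddHandler(
		"store_posts",
		"posts_published", // subscribe topic (assumed name)
		subscriber,
		"posts", // target "topic", i.e. the table name (assumed)
		publisher,
		func(msg *message.Message) ([]*message.Message, error) {
			stored := message.NewMessage(watermill.NewUUID(), msg.Payload)
			return []*message.Message{stored}, nil
		},
	)

	if err := router.Run(context.Background()); err != nil {
		panic(err)
	}
}
```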