The sink connector will read messages from topic _gaming-player-activity_ and store them in the S3 bucket _gaming-player-activity-bucket_, using _io.confluent.connect.s3.format.avro.AvroFormat_ as the format class.

The sink connector will generate a new object storage entry every 100 messages (_flush_size_).
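
As a quick sanity check (a sketch assuming the Kafka Connect REST API is reachable on localhost:8083, its default port, which may differ in this setup), you can list the registered connectors and inspect the sink's status; _<s3-sink-connector-name>_ is a placeholder for whatever name the connector was registered under:

[source,bash]
----
# List connectors registered on the Connect worker (port 8083 is an assumption)
curl -s http://localhost:8083/connectors

# Inspect the status of the S3 sink; <s3-sink-connector-name> is a placeholder
curl -s http://localhost:8083/connectors/<s3-sink-connector-name>/status
----
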
To generate random records for topic _gaming-player-activity_ we will use the link:https://github.com/ugol/jr[jr] tool.

Send 1000 messages to the _gaming-player-activity_ topic using jr:

[source,bash]
----
docker exec -it -w /home/jr/.jr jr jr template run gaming_player_activity -n 1000 -o kafka -t gaming-player-activity -s --serializer avro-generic
----

Verify that 10 entries (1000 messages with a _flush_size_ of 100) are stored in MinIO in the _gaming-player-activity-bucket_ bucket by connecting to the MinIO web console at http://localhost:9000 (admin/minioadmin).
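
As an alternative to the web console, the bucket can also be listed from the command line; this is a sketch assuming the MinIO S3 API is exposed on localhost:9000 with the same credentials and that the AWS CLI is installed locally:

[source,bash]
----
# List the objects the sink connector wrote to the bucket
# (endpoint URL, region, and credentials are assumptions about this setup)
export AWS_ACCESS_KEY_ID=admin AWS_SECRET_ACCESS_KEY=minioadmin AWS_DEFAULT_REGION=us-east-1
aws --endpoint-url http://localhost:9000 s3 ls s3://gaming-player-activity-bucket --recursive
----
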
Same example, but this time the sink connector will read Avro messages from topic _gaming-player-activity_ and store them in the S3 bucket _gaming-player-activity-bucket_, using _io.confluent.connect.s3.format.parquet.ParquetFormat_ as the format class.

The data stored in MinIO will be in Parquet format.

Run the example:

[source,bash]
----
scripts/bootstrap-connect-sink-s3-parquet.sh
----

An S3 sink connector will be created with this link:kafka-connect-sink-s3/config/s3_parquet_sink.json[config].
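
The linked file is not reproduced here, but as a rough illustration of what the bootstrap script presumably registers against the Kafka Connect REST API, a Parquet S3 sink config looks roughly like the sketch below; the connector name, _store.url_, region, and port are assumptions, and the authoritative settings are in _s3_parquet_sink.json_:

[source,bash]
----
# Illustrative sketch only -- see kafka-connect-sink-s3/config/s3_parquet_sink.json for the real config
curl -s -X POST -H "Content-Type: application/json" http://localhost:8083/connectors -d '{
  "name": "s3-parquet-sink",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "topics": "gaming-player-activity",
    "s3.bucket.name": "gaming-player-activity-bucket",
    "s3.region": "us-east-1",
    "store.url": "http://minio:9000",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.parquet.ParquetFormat",
    "flush.size": "100"
  }
}'
----
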
Send 1000 messages to the _gaming-player-activity_ topic using jr:

[source,bash]
----
docker exec -it -w /home/jr/.jr jr jr template run gaming_player_activity -n 1000 -o kafka -t gaming-player-activity -s --serializer avro-generic
----

Verify that 10 entries, now in Parquet format, are stored in MinIO in the _gaming-player-activity-bucket_ bucket by connecting to the MinIO web console at http://localhost:9000 (admin/minioadmin).
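
For a spot check beyond the console listing, one of the objects can be downloaded and its Parquet schema printed locally; this sketch assumes the MinIO S3 API on localhost:9000, the AWS CLI, and a local Python with pyarrow installed, and _<object-key>_ stands in for one of the keys shown in the bucket:

[source,bash]
----
# Download a single object and print its Parquet schema
# (endpoint, credentials, and the <object-key> placeholder are assumptions)
export AWS_ACCESS_KEY_ID=admin AWS_SECRET_ACCESS_KEY=minioadmin AWS_DEFAULT_REGION=us-east-1
aws --endpoint-url http://localhost:9000 s3 cp "s3://gaming-player-activity-bucket/<object-key>" /tmp/sample.parquet
python3 -c "import pyarrow.parquet as pq; print(pq.read_table('/tmp/sample.parquet').schema)"
----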