Merged
12 changes: 6 additions & 6 deletions docs/apis/_index.md
@@ -40,7 +40,7 @@ Kafka exposes all its functionality over a language-independent protocol which h

The Producer API allows applications to send streams of data to topics in the Kafka cluster.

Examples of using the producer are shown in the [javadocs](/43/javadoc/index.html?org/apache/kafka/clients/producer/KafkaProducer.html "Kafka 4.3 Javadoc").
Examples of using the producer are shown in the [javadocs](/{version}/javadoc/index.html?org/apache/kafka/clients/producer/KafkaProducer.html "Kafka 4.3 Javadoc").

To use the producer, add the following Maven dependency to your project:
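For reference (not part of this change), a minimal producer sketch might look like the following; the bootstrap address and topic name are placeholders:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ProducerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // try-with-resources closes the producer, flushing any buffered records
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("my-topic", "key", "value"));
        }
    }
}
```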

@@ -55,7 +55,7 @@ To use the producer, add the following Maven dependency to your project:

The Consumer API allows applications to read streams of data from topics in the Kafka cluster.

Examples of using the consumer are shown in the [javadocs](/43/javadoc/index.html?org/apache/kafka/clients/consumer/KafkaConsumer.html "Kafka 4.3 Javadoc").
Examples of using the consumer are shown in the [javadocs](/{version}/javadoc/index.html?org/apache/kafka/clients/consumer/KafkaConsumer.html "Kafka 4.3 Javadoc").

To use the consumer, add the following Maven dependency to your project:
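For reference (not part of this change), a minimal consumer sketch; the bootstrap address, group id, and topic are placeholders:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ConsumerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder
        props.put("group.id", "my-group");                // placeholder
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("my-topic"));
            // a real application polls in a loop; one poll shown for brevity
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("%s=%s%n", record.key(), record.value());
            }
        }
    }
}
```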

@@ -70,7 +70,7 @@ To use the consumer, add the following Maven dependency to your project:

The Share Consumer API enables applications in a share group to cooperatively consume and process data from Kafka topics.

Examples of using the share consumer are shown in the [javadocs](/43/javadoc/index.html?org/apache/kafka/clients/consumer/KafkaShareConsumer.html "Kafka 4.3 Javadoc").
Examples of using the share consumer are shown in the [javadocs](/{version}/javadoc/index.html?org/apache/kafka/clients/consumer/KafkaShareConsumer.html "Kafka 4.3 Javadoc").

To use the share consumer, add the following Maven dependency to your project:
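For reference (not part of this change), a rough share-consumer sketch; this is a best-effort illustration of the share group API, and the exact method names and acknowledgement semantics should be checked against the javadoc linked above:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaShareConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ShareConsumerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder
        props.put("group.id", "my-share-group");          // the share group id
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaShareConsumer<String, String> consumer = new KafkaShareConsumer<>(props)) {
            consumer.subscribe(List.of("my-topic"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
            for (ConsumerRecord<String, String> record : records) {
                System.out.println(record.value());
            }
            // in the default (implicit) acknowledgement mode, delivered records
            // are acknowledged when the consumer commits or polls again
            consumer.commitSync();
        }
    }
}
```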

@@ -85,7 +85,7 @@ To use the share consumer, add the following Maven dependency to your project:

The [Streams](/43/documentation/streams) API allows transforming streams of data from input topics to output topics.

Examples of using this library are shown in the [javadocs](/43/javadoc/index.html?org/apache/kafka/streams/KafkaStreams.html "Kafka 4.3 Javadoc").
Examples of using this library are shown in the [javadocs](/{version}/javadoc/index.html?org/apache/kafka/streams/KafkaStreams.html "Kafka 4.3 Javadoc").

Additional documentation on using the Streams API is available [here](/43/documentation/streams).
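For reference (not part of this change), a minimal Streams topology that copies one topic to another; the application id, bootstrap address, and topic names are placeholders:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;

public class StreamsExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app"); // placeholder
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("input-topic").to("output-topic"); // pass-through topology

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        // close the topology cleanly on JVM shutdown
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```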

@@ -115,7 +115,7 @@ The Connect API allows implementing connectors that continually pull from some s

Many users of Connect won't need to use this API directly; they can use pre-built connectors without writing any code. Additional information on using Connect is available [here](/documentation.html#connect).

Those who want to implement custom connectors can see the [javadoc](/43/javadoc/index.html?org/apache/kafka/connect "Kafka 4.3 Javadoc").
Those who want to implement custom connectors can see the [javadoc](/{version}/javadoc/index.html?org/apache/kafka/connect "Kafka 4.3 Javadoc").
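For reference (not part of this change), the shape of a custom source connector; the class names are illustrative, and a real connector also implements a matching `SourceTask` (referenced here as a hypothetical `ExampleSourceTask`):

```java
import java.util.List;
import java.util.Map;
import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.connect.connector.Task;
import org.apache.kafka.connect.source.SourceConnector;

public class ExampleSourceConnector extends SourceConnector {
    private Map<String, String> props;

    @Override
    public void start(Map<String, String> props) {
        this.props = props; // remember connector config for the tasks
    }

    @Override
    public Class<? extends Task> taskClass() {
        return ExampleSourceTask.class; // hypothetical SourceTask implementation
    }

    @Override
    public List<Map<String, String>> taskConfigs(int maxTasks) {
        // a single task that receives the connector's own config
        return List.of(props);
    }

    @Override
    public void stop() {}

    @Override
    public ConfigDef config() {
        return new ConfigDef(); // declare connector config keys here
    }

    @Override
    public String version() {
        return "0.0.1";
    }
}
```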

# Admin API

@@ -130,4 +130,4 @@ To use the Admin API, add the following Maven dependency to your project:
<version>4.3.0</version>
</dependency>

For more information about the Admin APIs, see the [javadoc](/43/javadoc/index.html?org/apache/kafka/clients/admin/Admin.html "Kafka 4.3 Javadoc").
For more information about the Admin APIs, see the [javadoc](/{version}/javadoc/index.html?org/apache/kafka/clients/admin/Admin.html "Kafka 4.3 Javadoc").
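For reference (not part of this change), a minimal Admin client sketch; the bootstrap address, topic name, and partition/replication counts are placeholders:

```java
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.NewTopic;

public class AdminExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder

        try (Admin admin = Admin.create(props)) {
            // create a topic with 3 partitions and replication factor 1
            admin.createTopics(List.of(new NewTopic("my-topic", 3, (short) 1)))
                 .all().get();
            System.out.println(admin.listTopics().names().get());
        }
    }
}
```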
2 changes: 1 addition & 1 deletion docs/configuration/admin-configs.md
@@ -26,4 +26,4 @@ type: docs
-->


Below is the Kafka Admin client library configuration. {{< include-html file="/static/43/generated/admin_client_config.html" >}}
Below is the Kafka Admin client library configuration. {{< include-html file="/static/{version}/generated/admin_client_config.html" >}}
2 changes: 1 addition & 1 deletion docs/configuration/broker-configs.md
@@ -34,7 +34,7 @@ The essential configurations are the following:
* `controller.quorum.bootstrap.servers`
* `controller.listener.names`

Broker configurations and defaults are discussed in more detail below. {{< include-html file="/static/43/generated/kafka_config.html" >}}
Broker configurations and defaults are discussed in more detail below. {{< include-html file="/static/{version}/generated/kafka_config.html" >}}

More details about broker configuration can be found in the scala class `kafka.server.KafkaConfig`.

8 changes: 4 additions & 4 deletions docs/configuration/configuration-providers.md
@@ -30,11 +30,11 @@ Use configuration providers to load configuration data from external sources. Th

You have the following options:

* Use a custom provider by creating a class implementing the [`ConfigProvider`](/43/javadoc/org/apache/kafka/common/config/provider/ConfigProvider.html) interface and packaging it into a JAR file.
* Use a custom provider by creating a class implementing the [`ConfigProvider`](/{version}/javadoc/org/apache/kafka/common/config/provider/ConfigProvider.html) interface and packaging it into a JAR file.
* Use a built-in provider:
* [`DirectoryConfigProvider`](/43/javadoc/org/apache/kafka/common/config/provider/DirectoryConfigProvider.html)
* [`EnvVarConfigProvider`](/43/javadoc/org/apache/kafka/common/config/provider/EnvVarConfigProvider.html)
* [`FileConfigProvider`](/43/javadoc/org/apache/kafka/common/config/provider/FileConfigProvider.html)
* [`DirectoryConfigProvider`](/{version}/javadoc/org/apache/kafka/common/config/provider/DirectoryConfigProvider.html)
* [`EnvVarConfigProvider`](/{version}/javadoc/org/apache/kafka/common/config/provider/EnvVarConfigProvider.html)
* [`FileConfigProvider`](/{version}/javadoc/org/apache/kafka/common/config/provider/FileConfigProvider.html)
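For reference (not part of this change), a custom provider sketch implementing the `ConfigProvider` interface; the class name and the in-memory secret store are purely illustrative:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
import org.apache.kafka.common.config.ConfigData;
import org.apache.kafka.common.config.provider.ConfigProvider;

public class InMemoryConfigProvider implements ConfigProvider {
    // illustrative stand-in for an external secret store
    private static final Map<String, String> SECRETS = Map.of("db.password", "changeme");

    @Override
    public void configure(Map<String, ?> configs) {
        // read provider-level settings here if needed
    }

    @Override
    public ConfigData get(String path) {
        return new ConfigData(SECRETS); // return everything at this path
    }

    @Override
    public ConfigData get(String path, Set<String> keys) {
        Map<String, String> filtered = new HashMap<>();
        for (String key : keys) {
            if (SECRETS.containsKey(key)) {
                filtered.put(key, SECRETS.get(key));
            }
        }
        return new ConfigData(filtered);
    }

    @Override
    public void close() {}
}
```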



2 changes: 1 addition & 1 deletion docs/configuration/consumer-configs.md
@@ -26,4 +26,4 @@ type: docs
-->


Below is the consumer and share consumer configuration: {{< include-html file="/static/43/generated/consumer_config.html" >}}
Below is the consumer and share consumer configuration: {{< include-html file="/static/{version}/generated/consumer_config.html" >}}
2 changes: 1 addition & 1 deletion docs/configuration/group-configs.md
@@ -26,4 +26,4 @@ type: docs
-->


Below is the group configuration: {{< include-html file="/static/43/generated/group_config.html" >}}
Below is the group configuration: {{< include-html file="/static/{version}/generated/group_config.html" >}}
6 changes: 3 additions & 3 deletions docs/configuration/kafka-connect-configs.md
@@ -26,12 +26,12 @@ type: docs
-->


Below is the Kafka Connect framework configuration. {{< include-html file="/static/43/generated/connect_config.html" >}}
Below is the Kafka Connect framework configuration. {{< include-html file="/static/{version}/generated/connect_config.html" >}}

## Source Connector Configs

Below is the source connector configuration. {{< include-html file="/static/43/generated/source_connector_config.html" >}}
Below is the source connector configuration. {{< include-html file="/static/{version}/generated/source_connector_config.html" >}}

## Sink Connector Configs

Below is the sink connector configuration. {{< include-html file="/static/43/generated/sink_connector_config.html" >}}
Below is the sink connector configuration. {{< include-html file="/static/{version}/generated/sink_connector_config.html" >}}
2 changes: 1 addition & 1 deletion docs/configuration/kafka-streams-configs.md
@@ -26,4 +26,4 @@ type: docs
-->


Below is the Kafka Streams client library configuration. {{< include-html file="/static/43/generated/streams_config.html" >}}
Below is the Kafka Streams client library configuration. {{< include-html file="/static/{version}/generated/streams_config.html" >}}
8 changes: 4 additions & 4 deletions docs/configuration/mirrormaker-configs.md
@@ -30,16 +30,16 @@ Below is the configuration of the connectors that make up MirrorMaker 2.

## MirrorMaker Common Configs

Below is the common configuration that applies to all three connectors. {{< include-html file="/static/43/generated/mirror_connector_config.html" >}}
Below is the common configuration that applies to all three connectors. {{< include-html file="/static/{version}/generated/mirror_connector_config.html" >}}

## MirrorMaker Source Configs

Below is the configuration of MirrorMaker 2 source connector for replicating topics. {{< include-html file="/static/43/generated/mirror_source_config.html" >}}
Below is the configuration of MirrorMaker 2 source connector for replicating topics. {{< include-html file="/static/{version}/generated/mirror_source_config.html" >}}

## MirrorMaker Checkpoint Configs

Below is the configuration of MirrorMaker 2 checkpoint connector for emitting consumer offset checkpoints. {{< include-html file="/static/43/generated/mirror_checkpoint_config.html" >}}
Below is the configuration of MirrorMaker 2 checkpoint connector for emitting consumer offset checkpoints. {{< include-html file="/static/{version}/generated/mirror_checkpoint_config.html" >}}

## MirrorMaker HeartBeat Configs

Below is the configuration of MirrorMaker 2 heartbeat connector for checking connectivity between connectors and clusters. {{< include-html file="/static/43/generated/mirror_heartbeat_config.html" >}}
Below is the configuration of MirrorMaker 2 heartbeat connector for checking connectivity between connectors and clusters. {{< include-html file="/static/{version}/generated/mirror_heartbeat_config.html" >}}
2 changes: 1 addition & 1 deletion docs/configuration/producer-configs.md
@@ -26,4 +26,4 @@ type: docs
-->


Below is the producer configuration: {{< include-html file="/static/43/generated/producer_config.html" >}}
Below is the producer configuration: {{< include-html file="/static/{version}/generated/producer_config.html" >}}
4 changes: 2 additions & 2 deletions docs/configuration/tiered-storage-configs.md
@@ -26,15 +26,15 @@ type: docs
-->


Below is the Tiered Storage configuration. {{< include-html file="/static/43/generated/remote_log_manager_config.html" >}}
Below is the Tiered Storage configuration. {{< include-html file="/static/{version}/generated/remote_log_manager_config.html" >}}

## RLMM Configs

Below is the configuration for `TopicBasedRemoteLogMetadataManager`, which is the default implementation of `RemoteLogMetadataManager`.

All configurations here should start with the prefix defined by `remote.log.metadata.manager.impl.prefix`, for example, `rlmm.config.remote.log.metadata.consume.wait.ms`.
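For illustration (not part of this change), a broker configuration fragment using that prefix; the wait value shown is purely illustrative:

```
# assumes the prefix is configured as "rlmm.config."
remote.log.metadata.manager.impl.prefix=rlmm.config.
rlmm.config.remote.log.metadata.consume.wait.ms=120000
```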

{{< include-html file="/static/43/generated/remote_log_metadata_manager_config.html" >}}
{{< include-html file="/static/{version}/generated/remote_log_metadata_manager_config.html" >}}

The implementation of `TopicBasedRemoteLogMetadataManager` needs to create admin, producer, and consumer clients for the internal topic `__remote_log_metadata`.

2 changes: 1 addition & 1 deletion docs/configuration/topic-configs.md
@@ -49,4 +49,4 @@ To remove an override you can do
$ bin/kafka-configs.sh --bootstrap-server localhost:9092 --entity-type topics --entity-name my-topic
--alter --delete-config max.message.bytes

Below is the topic configuration. The server's default configuration for this property is given under the Server Default Property heading. A given server default config value only applies to a topic if it does not have an explicit topic config override. {{< include-html file="/static/43/generated/topic_config.html" >}}
Below is the topic configuration. The server's default configuration for this property is given under the Server Default Property heading. A given server default config value only applies to a topic if it does not have an explicit topic config override. {{< include-html file="/static/{version}/generated/topic_config.html" >}}
8 changes: 4 additions & 4 deletions docs/design/protocol.md
@@ -136,7 +136,7 @@ For interoperability with 0.9.0.x clients, the first packet received by the serv

The protocol is built out of the following primitive types.

{{< include-html file="/static/43/generated/protocol_types.html" >}}
{{< include-html file="/static/{version}/generated/protocol_types.html" >}}

### Notes on reading the request format grammars

@@ -184,13 +184,13 @@ A description of the record batch format can be found [here](/documentation/#rec

We use numeric codes to indicate what problem occurred on the server. These can be translated by the client into exceptions or whatever error-handling mechanism is appropriate in the client language. Here is a table of the error codes currently in use:

{{< include-html file="/static/43/generated/protocol_errors.html" >}}
{{< include-html file="/static/{version}/generated/protocol_errors.html" >}}

### Api Keys

The following are the numeric codes that the stable ApiKey in the request can take for each of the request types below.

{{< include-html file="/static/43/generated/protocol_api_keys.html" >}}
{{< include-html file="/static/{version}/generated/protocol_api_keys.html" >}}

## The Messages

@@ -204,7 +204,7 @@ The message consists of the header and body:

`RequestOrResponseHeader` is the versioned request or response header. `Body` is the message-specific body.

{{< include-html file="/static/43/generated/protocol_messages.html" >}}
{{< include-html file="/static/{version}/generated/protocol_messages.html" >}}

## Some Common Philosophical Questions

4 changes: 2 additions & 2 deletions docs/getting-started/upgrade.md
@@ -196,14 +196,14 @@ Note: Apache Kafka 4.0 only supports KRaft mode - ZooKeeper mode has been remove
* The `topics.blacklist` configuration was removed from `org.apache.kafka.connect.mirror.MirrorSourceConfig`. Please use `topics.exclude` instead.
* The `groups.blacklist` configuration was removed from `org.apache.kafka.connect.mirror.MirrorSourceConfig`. Please use `groups.exclude` instead.
* **Tools**
* The `kafka.common.MessageReader` class was removed. Please use the [`org.apache.kafka.tools.api.RecordReader`](/43/javadoc/org/apache/kafka/tools/api/RecordReader.html) interface to build custom readers for the `kafka-console-producer` tool.
* The `kafka.common.MessageReader` class was removed. Please use the [`org.apache.kafka.tools.api.RecordReader`](/{version}/javadoc/org/apache/kafka/tools/api/RecordReader.html) interface to build custom readers for the `kafka-console-producer` tool.
* The `kafka.tools.DefaultMessageFormatter` class was removed. Please use the `org.apache.kafka.tools.consumer.DefaultMessageFormatter` class instead.
* The `kafka.tools.LoggingMessageFormatter` class was removed. Please use the `org.apache.kafka.tools.consumer.LoggingMessageFormatter` class instead.
* The `kafka.tools.NoOpMessageFormatter` class was removed. Please use the `org.apache.kafka.tools.consumer.NoOpMessageFormatter` class instead.
* The `--whitelist` option was removed from the `kafka-console-consumer` command line tool. Please use `--include` instead.
* Redirections from the old tools packages have been removed: `kafka.admin.FeatureCommand`, `kafka.tools.ClusterTool`, `kafka.tools.EndToEndLatency`, `kafka.tools.StateChangeLogMerger`, `kafka.tools.StreamsResetter`, `kafka.tools.JmxTool`.
* The `--authorizer`, `--authorizer-properties`, and `--zk-tls-config-file` options were removed from the `kafka-acls` command line tool. Please use `--bootstrap-server` or `--bootstrap-controller` instead.
* The `kafka.serializer.Decoder` trait was removed. Please use the [`org.apache.kafka.tools.api.Decoder`](/43/javadoc/org/apache/kafka/tools/api/Decoder.html) interface to build custom decoders for the `kafka-dump-log` tool.
* The `kafka.serializer.Decoder` trait was removed. Please use the [`org.apache.kafka.tools.api.Decoder`](/{version}/javadoc/org/apache/kafka/tools/api/Decoder.html) interface to build custom decoders for the `kafka-dump-log` tool.
* The `kafka.coordinator.group.OffsetsMessageFormatter` class was removed. Please use the `org.apache.kafka.tools.consumer.OffsetsMessageFormatter` class instead.
* The `kafka.coordinator.group.GroupMetadataMessageFormatter` class was removed. Please use the `org.apache.kafka.tools.consumer.GroupMetadataMessageFormatter` class instead.
* The `kafka.coordinator.transaction.TransactionLogMessageFormatter` class was removed. Please use the `org.apache.kafka.tools.consumer.TransactionLogMessageFormatter` class instead.
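For reference (not part of this change), a `RecordReader` sketch for `kafka-console-producer` that treats each input line as a record value; the class name and the `topic` property key are illustrative assumptions:

```java
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.Iterator;
import java.util.Map;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.tools.api.RecordReader;

public class LineRecordReader implements RecordReader {
    private String topic;

    @Override
    public void configure(Map<String, ?> configs) {
        this.topic = (String) configs.get("topic"); // assumed property name
    }

    @Override
    public Iterator<ProducerRecord<byte[], byte[]>> readRecords(InputStream in) {
        // one producer record per line of input, value only (null key)
        return new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8))
                .lines()
                .map(line -> new ProducerRecord<byte[], byte[]>(
                        topic, line.getBytes(StandardCharsets.UTF_8)))
                .iterator();
    }
}
```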
4 changes: 2 additions & 2 deletions docs/kafka-connect/user-guide.md
@@ -186,7 +186,7 @@ Several widely-applicable data and routing transformations are included with Kaf

Details on how to configure each transformation are listed below:

{{< include-html file="/static/43/generated/connect_transforms.html" >}}
{{< include-html file="/static/{version}/generated/connect_transforms.html" >}}

### Predicates

@@ -249,7 +249,7 @@ Kafka Connect includes the following predicates:

Details on how to configure each predicate are listed below:

{{< include-html file="/static/43/generated/connect_predicates.html" >}}
{{< include-html file="/static/{version}/generated/connect_predicates.html" >}}

## REST API

6 changes: 3 additions & 3 deletions docs/operations/monitoring.md
@@ -3312,7 +3312,7 @@ kafka.producer:type=producer-metrics,client-id=([-.\w]+)

### Producer Sender Metrics

{{< include-html file="/static/43/generated/producer_metrics.html" >}}
{{< include-html file="/static/{version}/generated/producer_metrics.html" >}}

## Consumer monitoring

@@ -3832,11 +3832,11 @@ kafka.consumer:type=consumer-coordinator-metrics,client-id=([-.\w]+)

### Consumer Fetch Metrics

{{< include-html file="/static/43/generated/consumer_metrics.html" >}}
{{< include-html file="/static/{version}/generated/consumer_metrics.html" >}}

## Connect Monitoring

A Connect worker process contains all the producer and consumer metrics as well as metrics specific to Connect. The worker process itself has a number of metrics, while each connector and task has additional metrics. {{< include-html file="/static/{version}/generated/connect_metrics.html" >}}
A Connect worker process contains all the producer and consumer metrics as well as metrics specific to Connect. The worker process itself has a number of metrics, while each connector and task have additional metrics. {{< include-html file="/static/{version}/generated/connect_metrics.html" >}}

## Streams Monitoring

2 changes: 1 addition & 1 deletion docs/streams/developer-guide/app-reset-tool.md
@@ -132,7 +132,7 @@ All the other parameters can be combined as needed. For example, if you want to
For a complete application reset, you must delete the application's local state directory on any machines where the application instance was run. You must do this before restarting an application instance on the same machine. You can use either of these methods:

* The API method `KafkaStreams#cleanUp()` in your application code.
* Manually delete the corresponding local state directory (default location: `/${java.io.tmpdir}/kafka-streams/<application.id>`). For more information, see [Streams](/43/javadoc/org/apache/kafka/streams/StreamsConfig.html#STATE_DIR_CONFIG) javadocs.
* Manually delete the corresponding local state directory (default location: `/${java.io.tmpdir}/kafka-streams/<application.id>`). For more information, see [Streams](/{version}/javadoc/org/apache/kafka/streams/StreamsConfig.html#STATE_DIR_CONFIG) javadocs.
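For reference (not part of this change), the `cleanUp()` call in application code; `topology` and `props` are assumed to come from the surrounding application:

```java
import org.apache.kafka.streams.KafkaStreams;

// assumes `topology` and `props` are built elsewhere in the application
KafkaStreams streams = new KafkaStreams(topology, props);
streams.cleanUp(); // valid only before start() or after close(), never while running
streams.start();
```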


