Merged
2 changes: 1 addition & 1 deletion docs/docset.yml
@@ -1,11 +1,11 @@
project: 'Logstash'

Check notice on line 1 in docs/docset.yml (GitHub Actions / docs-preview / build): Substitution key 'esf' is not used in any file
Check notice on line 1 in docs/docset.yml (GitHub Actions / docs-preview / build): Substitution key 'ilm-cap' is not used in any file
cross_links:
- beats
- docs-content
- ecs
- elasticsearch
- integration-docs
- logstash-docs
- logstash-docs-md
- search-ui
toc:
- toc: reference
2 changes: 1 addition & 1 deletion docs/extend/codec-new-plugin.md
@@ -402,7 +402,7 @@ With these both defined, the install process will search for the required jar fi

## Document your plugin [_document_your_plugin_2]

-Documentation is an important part of your plugin. All plugin documentation is rendered and placed in the [Logstash Reference](/reference/index.md) and the [Versioned plugin docs](logstash-docs://reference/integration-plugins.md).
+Documentation is an important part of your plugin. All plugin documentation is rendered and placed in the [Logstash Reference](/reference/index.md) and the [Versioned plugin docs](logstash-docs-md://vpr/integration-plugins.md).

See [Document your plugin](/extend/plugin-doc.md) for tips and guidelines.

2 changes: 1 addition & 1 deletion docs/extend/filter-new-plugin.md
@@ -403,7 +403,7 @@ With these both defined, the install process will search for the required jar fi

## Document your plugin [_document_your_plugin_3]

-Documentation is an important part of your plugin. All plugin documentation is rendered and placed in the [Logstash Reference](/reference/index.md) and the [Versioned plugin docs](logstash-docs://reference/integration-plugins.md).
+Documentation is an important part of your plugin. All plugin documentation is rendered and placed in the [Logstash Reference](/reference/index.md) and the [Versioned plugin docs](logstash-docs-md://vpr/integration-plugins.md).

See [Document your plugin](/extend/plugin-doc.md) for tips and guidelines.

2 changes: 1 addition & 1 deletion docs/extend/input-new-plugin.md
@@ -443,7 +443,7 @@ With these both defined, the install process will search for the required jar fi

## Document your plugin [_document_your_plugin]

-Documentation is an important part of your plugin. All plugin documentation is rendered and placed in the [Logstash Reference](/reference/index.md) and the [Versioned plugin docs](logstash-docs://reference/integration-plugins.md).
+Documentation is an important part of your plugin. All plugin documentation is rendered and placed in the [Logstash Reference](/reference/index.md) and the [Versioned plugin docs](logstash-docs-md://vpr/integration-plugins.md).

See [Document your plugin](/extend/plugin-doc.md) for tips and guidelines.

2 changes: 1 addition & 1 deletion docs/extend/java-filter-plugin.md
@@ -172,7 +172,7 @@ Finally, we come to the `filter` method that is invoked by the Logstash executio

In the example above, the value of the `source` field is retrieved from each event and reversed if it is a string value. Because each event is mutated in place, the incoming `events` collection can be returned.

-The `matchListener` is the mechanism by which filters indicate which events "match". The common actions for filters such as `add_field` and `add_tag` are applied only to events that are designated as "matching". Some filters such as the [grok filter](/reference/plugins-filters-grok.md) have a clear definition for what constitutes a matching event and will notify the listener only for matching events. Other filters such as the [UUID filter](/reference/plugins-filters-uuid.md) have no specific match criteria and should notify the listener for every event filtered. In this example, the filter notifies the match listener for any event that had a `String` value in its `source` field and was therefore able to be reversed.
+The `matchListener` is the mechanism by which filters indicate which events "match". The common actions for filters such as `add_field` and `add_tag` are applied only to events that are designated as "matching". Some filters such as the [grok filter](logstash-docs-md://lsr/plugins-filters-grok.md) have a clear definition for what constitutes a matching event and will notify the listener only for matching events. Other filters such as the [UUID filter](logstash-docs-md://lsr/plugins-filters-uuid.md) have no specific match criteria and should notify the listener for every event filtered. In this example, the filter notifies the match listener for any event that had a `String` value in its `source` field and was therefore able to be reversed.
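The reverse-and-notify pattern described above can be sketched in plain Java. Note that `Event` and `FilterMatchListener` below are simplified stand-ins written for this sketch, not the actual `co.elastic.logstash.api` types; only the shape of the `filter` method mirrors the text.

```java
import java.util.Collection;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Simplified stand-ins for the Logstash Java plugin API types.
class Event {
    private final Map<String, Object> data = new HashMap<>();
    Object getField(String name) { return data.get(name); }
    void setField(String name, Object value) { data.put(name, value); }
}

interface FilterMatchListener {
    void filterMatched(Event e);
}

class ReverseFilter {
    private final String sourceField = "source";

    // Mutate events in place and notify the listener only for events
    // that "match" (here: events whose source field holds a String).
    Collection<Event> filter(Collection<Event> events, FilterMatchListener matchListener) {
        for (Event e : events) {
            Object value = e.getField(sourceField);
            if (value instanceof String) {
                e.setField(sourceField, new StringBuilder((String) value).reverse().toString());
                matchListener.filterMatched(e); // only reversed events count as a match
            }
        }
        return events; // mutated in place, so the same collection is returned
    }
}

public class Demo {
    public static void main(String[] args) {
        Event ev = new Event();
        ev.setField("source", "abc");
        new ReverseFilter().filter(List.of(ev), e -> System.out.println("matched"));
        System.out.println(ev.getField("source")); // prints "cba"
    }
}
```

Events with a non-`String` value in `source` pass through unchanged and the listener is never notified for them, matching the behavior the text describes.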


### getId method [_getid_method_3]
2 changes: 1 addition & 1 deletion docs/extend/output-new-plugin.md
@@ -360,7 +360,7 @@ With these both defined, the install process will search for the required jar fi

## Document your plugin [_document_your_plugin_4]

-Documentation is an important part of your plugin. All plugin documentation is rendered and placed in the [Logstash Reference](/reference/index.md) and the [Versioned plugin docs](logstash-docs://reference/integration-plugins.md).
+Documentation is an important part of your plugin. All plugin documentation is rendered and placed in the [Logstash Reference](/reference/index.md) and the [Versioned plugin docs](logstash-docs-md://vpr/integration-plugins.md).

See [Document your plugin](/extend/plugin-doc.md) for tips and guidelines.

10 changes: 5 additions & 5 deletions docs/extend/plugin-doc.md
@@ -7,7 +7,7 @@ mapped_pages:

Documentation is a required component of your plugin. Quality documentation with good examples contributes to the adoption of your plugin.

-The documentation that you write for your plugin will be generated and published in the [Logstash Reference](/reference/index.md) and the [Logstash Versioned Plugin Reference](logstash-docs://reference/integration-plugins.md).
+The documentation that you write for your plugin will be generated and published in the [Logstash Reference](/reference/index.md) and the [Logstash Versioned Plugin Reference](logstash-docs-md://vpr/integration-plugins.md).

::::{admonition} Plugin listing in {{ls}} Reference
:class: note
@@ -26,7 +26,7 @@ Documentation belongs in a single file called *docs/index.asciidoc*. It belongs

## Heading IDs [heading-ids]

-Format heading anchors with variables that can support generated IDs. This approach creates unique IDs when the [Logstash Versioned Plugin Reference](logstash-docs://reference/integration-plugins.md) is built. Unique heading IDs are required to avoid duplication over multiple versions of a plugin.
+Format heading anchors with variables that can support generated IDs. This approach creates unique IDs when the [Logstash Versioned Plugin Reference](logstash-docs-md://vpr/integration-plugins.md) is built. Unique heading IDs are required to avoid duplication over multiple versions of a plugin.

**Example**

@@ -39,7 +39,7 @@ Instead, use variables to define it:
==== Configuration models
```

-If you hardcode an ID, the [Logstash Versioned Plugin Reference](logstash-docs://reference/integration-plugins.md) builds correctly the first time. The second time the doc build runs, the ID is flagged as a duplicate, and the build fails.
+If you hardcode an ID, the [Logstash Versioned Plugin Reference](logstash-docs-md://vpr/integration-plugins.md) builds correctly the first time. The second time the doc build runs, the ID is flagged as a duplicate, and the build fails.


## Link formats [link-format]
@@ -136,7 +136,7 @@ match => {

## Where’s my doc? [_wheres_my_doc]

-Plugin documentation goes through several steps before it gets published in the [Logstash Versioned Plugin Reference](logstash-docs://reference/integration-plugins.md) and the [Logstash Reference](/reference/index.md).
+Plugin documentation goes through several steps before it gets published in the [Logstash Versioned Plugin Reference](logstash-docs-md://vpr/integration-plugins.md) and the [Logstash Reference](/reference/index.md).

Here’s an overview of the workflow:

Expand All @@ -145,7 +145,7 @@ Here’s an overview of the workflow:
* Wait for the continuous integration build to complete successfully.
* Publish the plugin to [https://rubygems.org](https://rubygems.org).
* A script detects the new or changed version, and picks up the `index.asciidoc` file for inclusion in the doc build.
-* The documentation for your new plugin is published in the [Logstash Versioned Plugin Reference](logstash-docs://reference/integration-plugins.md).
+* The documentation for your new plugin is published in the [Logstash Versioned Plugin Reference](logstash-docs-md://vpr/integration-plugins.md).

We’re not done yet.

8 changes: 4 additions & 4 deletions docs/reference/advanced-pipeline.md
@@ -13,14 +13,14 @@ To get started, go [here](https://download.elastic.co/demos/logstash/gettingstar

## Configuring Filebeat to Send Log Lines to Logstash [configuring-filebeat]

-Before you create the Logstash pipeline, you’ll configure Filebeat to send log lines to Logstash. The [Filebeat](https://github.com/elastic/beats/tree/main/filebeat) client is a lightweight, resource-friendly tool that collects logs from files on the server and forwards these logs to your Logstash instance for processing. Filebeat is designed for reliability and low latency. Filebeat has a light resource footprint on the host machine, and the [`Beats input`](/reference/plugins-inputs-beats.md) plugin minimizes the resource demands on the Logstash instance.
+Before you create the Logstash pipeline, you’ll configure Filebeat to send log lines to Logstash. The [Filebeat](https://github.com/elastic/beats/tree/main/filebeat) client is a lightweight, resource-friendly tool that collects logs from files on the server and forwards these logs to your Logstash instance for processing. Filebeat is designed for reliability and low latency. Filebeat has a light resource footprint on the host machine, and the [`Beats input`](logstash-docs-md://lsr/plugins-inputs-beats.md) plugin minimizes the resource demands on the Logstash instance.

::::{note}
In a typical use case, Filebeat runs on a separate machine from the machine running your Logstash instance. For the purposes of this tutorial, Logstash and Filebeat are running on the same machine.
::::


-The default Logstash installation includes the [`Beats input`](/reference/plugins-inputs-beats.md) plugin. The Beats input plugin enables Logstash to receive events from the Elastic Beats framework, which means that any Beat written to work with the Beats framework, such as Packetbeat and Metricbeat, can also send event data to Logstash.
+The default Logstash installation includes the [`Beats input`](logstash-docs-md://lsr/plugins-inputs-beats.md) plugin. The Beats input plugin enables Logstash to receive events from the Elastic Beats framework, which means that any Beat written to work with the Beats framework, such as Packetbeat and Metricbeat, can also send event data to Logstash.

To install Filebeat on your data source machine, download the appropriate package from the Filebeat [product page](https://www.elastic.co/downloads/beats/filebeat). You can also refer to [Filebeat quick start](beats://reference/filebeat/filebeat-installation-configuration.md) for additional installation instructions.

@@ -163,7 +163,7 @@ If your pipeline is working correctly, you should see a series of events like th

Now you have a working pipeline that reads log lines from Filebeat. However you’ll notice that the format of the log messages is not ideal. You want to parse the log messages to create specific, named fields from the logs. To do this, you’ll use the `grok` filter plugin.

-The [`grok`](/reference/plugins-filters-grok.md) filter plugin is one of several plugins that are available by default in Logstash. For details on how to manage Logstash plugins, see the [reference documentation](/reference/working-with-plugins.md) for the plugin manager.
+The [`grok`](logstash-docs-md://lsr/plugins-filters-grok.md) filter plugin is one of several plugins that are available by default in Logstash. For details on how to manage Logstash plugins, see the [reference documentation](/reference/working-with-plugins.md) for the plugin manager.

The `grok` filter plugin enables you to parse the unstructured log data into something structured and queryable.

@@ -305,7 +305,7 @@ Notice that the event includes the original message, but the log message is also

### Enhancing Your Data with the Geoip Filter Plugin [configuring-geoip-plugin]

-In addition to parsing log data for better searches, filter plugins can derive supplementary information from existing data. As an example, the [`geoip`](/reference/plugins-filters-geoip.md) plugin looks up IP addresses, derives geographic location information from the addresses, and adds that location information to the logs.
+In addition to parsing log data for better searches, filter plugins can derive supplementary information from existing data. As an example, the [`geoip`](logstash-docs-md://lsr/plugins-filters-geoip.md) plugin looks up IP addresses, derives geographic location information from the addresses, and adds that location information to the logs.

Configure your Logstash instance to use the `geoip` filter plugin by adding the following lines to the `filter` section of the `first-pipeline.conf` file:

67 changes: 0 additions & 67 deletions docs/reference/codec-plugins.md

This file was deleted.

4 changes: 2 additions & 2 deletions docs/reference/configuration-file-structure.md
@@ -50,7 +50,7 @@ input {

In this example, two settings are configured for each of the file inputs: *port* and *tags*.

-The settings you can configure vary according to the plugin type. For information about each plugin, see [Input Plugins](/reference/input-plugins.md), [Output Plugins](/reference/output-plugins.md), [Filter Plugins](/reference/filter-plugins.md), and [Codec Plugins](/reference/codec-plugins.md).
+The settings you can configure vary according to the plugin type. For information about each plugin, see [Input Plugins](logstash-docs-md://lsr/input-plugins.md), [Output Plugins](logstash-docs-md://lsr/output-plugins.md), [Filter Plugins](logstash-docs-md://lsr/filter-plugins.md), and [Codec Plugins](logstash-docs-md://lsr/codec-plugins.md).


## Value types [plugin-value-types]
@@ -113,7 +113,7 @@ A codec is the name of Logstash codec used to represent the data. Codecs can be

Input codecs provide a convenient way to decode your data before it enters the input. Output codecs provide a convenient way to encode your data before it leaves the output. Using an input or output codec eliminates the need for a separate filter in your Logstash pipeline.

-A list of available codecs can be found at the [Codec Plugins](/reference/codec-plugins.md) page.
+A list of available codecs can be found at the [Codec Plugins](logstash-docs-md://lsr/codec-plugins.md) page.

Example:
