[O11y][Apache Spark] Add dimension mapping for driver datastream #8111
Conversation
🌐 Coverage report
LGTM!
@harnish-elastic, please complete the peer review.
LGTM
…into apache_spark-driver-add-dimension
Conflicts: packages/apache_spark/changelog.yml, packages/apache_spark/manifest.yml
Package apache_spark - 0.7.2 containing this change is available at https://epr.elastic.co/search?package=apache_spark
What does this PR do?
This PR adds dimension fields for the driver data stream to support TSDB enablement.
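For context, a minimal sketch of how dimension fields are typically declared in an Elastic integration data stream. The field names and file path below are illustrative assumptions, not the literal diff of this PR; the general mechanism is flagging the fields that identify a unique time series with dimension: true in the data stream's field definitions.

```yaml
# Illustrative sketch only; field names and the exact file path are assumptions,
# e.g. packages/apache_spark/data_stream/driver/fields/fields.yml.
# Fields marked as dimensions become part of the time series key once TSDB is enabled.
- name: apache_spark.driver
  type: group
  fields:
    - name: application_id
      type: keyword
      dimension: true   # identifies the time series this metric belongs to
      description: Application ID of the Spark driver.
    - name: application_name
      type: keyword
      dimension: true   # identifies the time series this metric belongs to
      description: Name of the Spark application.
```

With such dimensions declared, the time series index mode itself is typically switched on separately in the data stream's manifest.yml (elasticsearch index_mode set to time_series); see the linked TSDB enablement issue for the overall effort.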
Checklist
I have reviewed tips for building integrations and this pull request is aligned with them.
I have verified that all data streams collect metrics or logs.
I have added an entry to my package's changelog.yml file.
I have verified that Kibana version constraints are current according to guidelines.
Relates to Apache Spark TSDB Enablement #7786