Commit 1af3c60

Merge branch 'main' into version

Signed-off-by: gaobinlong <[email protected]>
2 parents: d140d53 + d9a9274

File tree: 47 files changed (+702, -404 lines)


.github/workflows/changelog_verifier.yml

Lines changed: 0 additions & 37 deletions
@@ -14,42 +14,5 @@ jobs:
           token: ${{ secrets.GITHUB_TOKEN }}
           ref: ${{ github.event.pull_request.head.sha }}
       - uses: dangoslen/changelog-enforcer@v3
-        id: verify-changelog-3x
         with:
           skipLabels: "autocut, skip-changelog"
-          changeLogPath: 'CHANGELOG-3.0.md'
-        continue-on-error: true
-      - uses: dangoslen/changelog-enforcer@v3
-        id: verify-changelog
-        with:
-          skipLabels: "autocut, skip-changelog"
-          changeLogPath: 'CHANGELOG.md'
-        continue-on-error: true
-      - run: |
-          # The check was possibly skipped leading to success for both the jobs
-          has_backport_label=${{ contains(join(github.event.pull_request.labels.*.name, ', '), 'backport')}}
-          has_breaking_label=${{ contains(join(github.event.pull_request.labels.*.name, ', '), '>breaking')}}
-          if [[ $has_breaking_label == true && $has_backport_label == true ]]; then
-            echo "error: Please make sure that the PR does not have a backport label associated with it when making breaking changes"
-            exit 1
-          fi
-
-          if [[ ${{ steps.verify-changelog-3x.outcome }} == 'success' && ${{ steps.verify-changelog.outcome }} == 'success' ]]; then
-            exit 0
-          fi
-
-          if [[ ${{ steps.verify-changelog-3x.outcome }} == 'failure' && ${{ steps.verify-changelog.outcome }} == 'failure' ]]; then
-            echo "error: Please ensure a changelog entry exists in CHANGELOG.md or CHANGELOG-3.0.md"
-            exit 1
-          fi
-
-          # Concatenates the labels and checks if the string contains "backport"
-          if [[ ${{ steps.verify-changelog.outcome }} == 'success' && $has_backport_label == false ]]; then
-            echo "error: Please make sure that the PR has a backport label associated with it when making an entry to the CHANGELOG.md file"
-            exit 1
-          fi
-
-          if [[ ${{ steps.verify-changelog-3x.outcome }} == 'success' && $has_backport_label == true ]]; then
-            echo "error: Please make sure that the PR does not have a backport label associated with it when making an entry to the CHANGELOG-3.0.md file"
-            exit 1
-          fi

CHANGELOG-3.0.md

Lines changed: 0 additions & 21 deletions
This file was deleted.

CHANGELOG.md

Lines changed: 7 additions & 2 deletions
@@ -3,11 +3,16 @@ All notable changes to this project are documented in this file.
 
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html). See the [CONTRIBUTING guide](./CONTRIBUTING.md#Changelog) for instructions on how to add changelog entries.
 
-## [Unreleased 2.x]
+## [Unreleased 3.x]
 ### Added
+- Change priority for scheduling reroute during timeout([#16445](https://github.com/opensearch-project/OpenSearch/pull/16445))
+- Renaming the node role search to warm ([#17573](https://github.com/opensearch-project/OpenSearch/pull/17573))
+- Introduce a new search node role to hold search only shards ([#17620](https://github.com/opensearch-project/OpenSearch/pull/17620))
 
 ### Dependencies
 - Bump `com.google.api:api-common` from 1.8.1 to 2.46.1 ([#17604](https://github.com/opensearch-project/OpenSearch/pull/17604))
+- Bump `ch.qos.logback:logback-core` from 1.5.16 to 1.5.17 ([#17609](https://github.com/opensearch-project/OpenSearch/pull/17609))
+- Bump `org.jruby.joni:joni` from 2.2.3 to 2.2.5 ([#17608](https://github.com/opensearch-project/OpenSearch/pull/17608))
 
 ### Changed

@@ -19,4 +24,4 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 
 ### Security
 
-[Unreleased 2.x]: https://github.com/opensearch-project/OpenSearch/compare/2.19...2.x
+[Unreleased 3.x]: https://github.com/opensearch-project/OpenSearch/compare/f58d846f...main

CONTRIBUTING.md

Lines changed: 0 additions & 9 deletions
@@ -146,15 +146,6 @@ Adding in the change is two step process:
 1. Add your changes to the corresponding section within the CHANGELOG file with dummy pull request information, publish the PR
 2. Update the entry for your change in [`CHANGELOG.md`](CHANGELOG.md) and make sure that you reference the pull request there.
 
-### Where should I put my CHANGELOG entry?
-Please review the [branching strategy](https://github.com/opensearch-project/.github/blob/main/RELEASING.md#opensearch-branching) document. The changelog on the `main` branch will contain **two files**: `CHANGELOG.md` which corresponds to unreleased changes intended for the _next minor_ release and `CHANGELOG-3.0.md` which correspond to unreleased changes intended for the _next major_ release. Your entry should go into file corresponding to the version it is intended to be released in. In practice, most changes to `main` will be backported to the next minor release so most entries will be in the `CHANGELOG.md` file.
-
-The following examples assume the _next major_ release on main is 3.0, then _next minor_ release is 2.5, and the _current_ release is 2.4.
-
-- **Add a new feature to release in next minor:** Add a changelog entry to `[Unreleased 2.x]` in CHANGELOG.md on main, then backport to 2.x (including the changelog entry).
-- **Introduce a breaking API change to release in next major:** Add a changelog entry to `[Unreleased 3.0]` to CHANGELOG-3.0.md on main, do not backport.
-- **Upgrade a dependency to fix a CVE:** Add a changelog entry to `[Unreleased 2.x]` on main, then backport to 2.x (including the changelog entry), then backport to 2.4 and ensure the changelog entry is added to `[Unreleased 2.4.1]`.
-
 ## Review Process
 
 We deeply appreciate everyone who takes the time to make a contribution. We will review all contributions as quickly as possible. As a reminder, [opening an issue](https://github.com/opensearch-project/OpenSearch/issues/new/choose) discussing your change before you make it is the best way to smooth the PR process. This will prevent a rejection because someone else is already working on the problem, or because the solution is incompatible with the architectural direction.

libs/grok/build.gradle

Lines changed: 1 addition & 1 deletion
@@ -29,7 +29,7 @@
  */
 
 dependencies {
-    api 'org.jruby.joni:joni:2.2.3'
+    api 'org.jruby.joni:joni:2.2.5'
     // joni dependencies:
     api 'org.jruby.jcodings:jcodings:1.0.63'

libs/grok/licenses/joni-2.2.3.jar.sha1

Lines changed: 0 additions & 1 deletion
This file was deleted.
Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
+4ebafe67efa7395678a34d07e7585bed5ef0cc72

plugins/ingestion-kafka/src/internalClusterTest/java/org/opensearch/plugin/kafka/KafkaIngestionBaseIT.java

Lines changed: 4 additions & 1 deletion
@@ -34,7 +34,7 @@
 import org.testcontainers.utility.DockerImageName;
 
 /**
- * Base test class for Kafka ingestion tests
+ * Base test class for Kafka ingestion tests.
  */
 @ThreadLeakFilters(filters = TestContainerThreadLeakFilter.class)
 public class KafkaIngestionBaseIT extends OpenSearchIntegTestCase {

@@ -135,6 +135,9 @@ protected void createIndexWithDefaultSettings(int numShards, int numReplicas) {
             .put("ingestion_source.param.topic", topicName)
             .put("ingestion_source.param.bootstrap_servers", kafka.getBootstrapServers())
             .put("index.replication.type", "SEGMENT")
+            // set custom kafka consumer properties
+            .put("ingestion_source.param.fetch.min.bytes", 30000)
+            .put("ingestion_source.param.enable.auto.commit", false)
             .build(),
         "{\"properties\":{\"name\":{\"type\": \"text\"},\"age\":{\"type\": \"integer\"}}}}"
     );
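The three `ingestion_source.param.*` settings added above are the user-facing side of a consumer-configuration passthrough: keys under that prefix are handed to the Kafka consumer. A hypothetical sketch of that prefix handling (the class, method, and constant below are illustrative assumptions, not the plugin's actual code):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: collect "ingestion_source.param.*" index settings into a
// plain Kafka consumer-config map (e.g. "fetch.min.bytes", "enable.auto.commit").
public class IngestionParamSketch {
    static final String PREFIX = "ingestion_source.param.";

    static Map<String, Object> consumerConfigs(Map<String, Object> indexSettings) {
        Map<String, Object> configs = new HashMap<>();
        for (Map.Entry<String, Object> e : indexSettings.entrySet()) {
            if (e.getKey().startsWith(PREFIX)) {
                // strip the prefix so the key matches what the Kafka client expects
                configs.put(e.getKey().substring(PREFIX.length()), e.getValue());
            }
        }
        return configs;
    }
}
```

Only keys under the prefix would be forwarded; anything else (e.g. `index.replication.type`) remains an ordinary index setting.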

plugins/ingestion-kafka/src/internalClusterTest/java/org/opensearch/plugin/kafka/RemoteStoreKafkaIT.java

Lines changed: 0 additions & 1 deletion
@@ -48,7 +48,6 @@ public void testSegmentReplicationWithRemoteStore() throws Exception {
         internalCluster().startClusterManagerOnlyNode();
         final String nodeA = internalCluster().startDataOnlyNode();
         createIndexWithDefaultSettings(1, 1);
-
         ensureYellowAndNoInitializingShards(indexName);
         final String nodeB = internalCluster().startDataOnlyNode();
         ensureGreen(indexName);

plugins/ingestion-kafka/src/main/java/org/opensearch/plugin/kafka/KafkaPartitionConsumer.java

Lines changed: 4 additions & 4 deletions
@@ -9,7 +9,6 @@
 package org.opensearch.plugin.kafka;
 
 import org.apache.kafka.clients.consumer.Consumer;
-import org.apache.kafka.clients.consumer.ConsumerConfig;
 import org.apache.kafka.clients.consumer.ConsumerRecord;
 import org.apache.kafka.clients.consumer.ConsumerRecords;
 import org.apache.kafka.clients.consumer.KafkaConsumer;

@@ -99,9 +98,10 @@ protected static Consumer<byte[], byte[]> createConsumer(String clientId, KafkaS
         Properties consumerProp = new Properties();
         consumerProp.put("bootstrap.servers", config.getBootstrapServers());
         consumerProp.put("client.id", clientId);
-        if (config.getAutoOffsetResetConfig() != null && !config.getAutoOffsetResetConfig().isEmpty()) {
-            consumerProp.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, config.getAutoOffsetResetConfig());
-        }
+
+        logger.info("Kafka consumer properties for topic {}: {}", config.getTopic(), config.getConsumerConfigurations());
+        consumerProp.putAll(config.getConsumerConfigurations());
+
         // TODO: why Class org.apache.kafka.common.serialization.StringDeserializer could not be found if set the deserializer as prop?
         // consumerProp.put("key.deserializer",
         //     "org.apache.kafka.common.serialization.StringDeserializer");
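The change above drops the single `auto.offset.reset` special case in favor of copying every user-supplied consumer configuration via `putAll`. A minimal, self-contained sketch of that pattern (class and method names here are illustrative, not the plugin's API):

```java
import java.util.Map;
import java.util.Properties;

// Illustrative sketch of the new property-building pattern: fixed connection
// settings first, then all user-supplied consumer configurations copied in,
// so options like auto.offset.reset need no per-key special casing.
public class ConsumerPropsSketch {
    static Properties buildConsumerProps(String bootstrapServers, String clientId, Map<String, Object> userConfigs) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("client.id", clientId);
        props.putAll(userConfigs); // e.g. auto.offset.reset, fetch.min.bytes
        return props;
    }
}
```

Note the ordering: because `putAll` runs after the fixed entries, a user-supplied value for the same key would override them, which is worth keeping in mind for keys like `bootstrap.servers`.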
