From e951192166b3159956e299f73dd27394a49cace0 Mon Sep 17 00:00:00 2001 From: Yury-Fridlyand Date: Tue, 28 Mar 2023 14:43:26 -0700 Subject: [PATCH 01/17] Support pagination in V2 engine, phase 1 (#226) * Fixing integration tests broken during POC Signed-off-by: MaxKsyunz * Comment to clarify an exception. Signed-off-by: MaxKsyunz * Add support for paginated scroll request, first page. Implement PaginatedPlanCache.convertToPlan for second page to work. Signed-off-by: MaxKsyunz * Progress on paginated scroll request, subsequent page. Signed-off-by: MaxKsyunz * Move `ExpressionSerializer` from `opensearch` to `core`. Signed-off-by: Yury-Fridlyand * Rename `Cursor` `asString` to `toString`. Signed-off-by: Yury-Fridlyand * Disable scroll cleaning. Signed-off-by: Yury-Fridlyand * Add full cursor serialization and deserialization. Signed-off-by: Yury-Fridlyand * Misc fixes. Signed-off-by: Yury-Fridlyand * Further work on pagination. * Added push down page size from `LogicalPaginate` to `LogicalRelation`. * Improved cursor encoding and decoding. * Added cursor compression. * Fixed issuing `SearchScrollRequest`. * Fixed returning last empty page. * Minor code grooming/commenting. Signed-off-by: Yury-Fridlyand * Pagination fix for empty indices. Signed-off-by: Yury-Fridlyand * Fix error reporting on wrong cursor. Signed-off-by: Yury-Fridlyand * Minor comments and error reporting improvement. Signed-off-by: Yury-Fridlyand * Add an end-to-end integration test. Signed-off-by: Yury-Fridlyand * Add `explain` request handlers. Signed-off-by: Yury-Fridlyand * Add IT for explain. Signed-off-by: Yury-Fridlyand * Address issues flagged by checkstyle build step (#229) Signed-off-by: MaxKsyunz * Pagination, phase 1: Add unit tests for `:core` module with coverage. (#230) * Add unit tests for `:core` module with coverage. Uncovered: `toCursor`, because it will be changed soon. Signed-off-by: Yury-Fridlyand * Pagination, phase 1: Add unit tests for SQL module with coverage. 
(#239) * Add unit tests for SQL module with coverage. Signed-off-by: Yury-Fridlyand * Update sql/src/main/java/org/opensearch/sql/sql/domain/SQLQueryRequest.java Signed-off-by: Yury-Fridlyand Co-authored-by: GabeFernandez310 --------- Signed-off-by: Yury-Fridlyand Co-authored-by: GabeFernandez310 * Pagination, phase 1: Add unit tests for `:opensearch` module with coverage. (#233) * Add UT for `:opensearch` module with full coverage, except `toCursor`. Signed-off-by: Yury-Fridlyand * Fix checkstyle. Signed-off-by: Yury-Fridlyand --------- Signed-off-by: Yury-Fridlyand * Fix the merges. Signed-off-by: Yury-Fridlyand * Fix explain. Signed-off-by: Yury-Fridlyand * Fix scroll cleaning. Signed-off-by: Yury-Fridlyand * Store `TotalHits` and use it to report `total` in response. Signed-off-by: Yury-Fridlyand * Add missing UT for `:protocol` module. Signed-off-by: Yury-Fridlyand * Fix PPL UTs damaged in f4ea4ad8c. Signed-off-by: Yury-Fridlyand * Minor checkstyle fixes. Signed-off-by: Yury-Fridlyand * Fallback to v1 engine for pagination (#245) * Pagination fallback integration tests. Signed-off-by: MaxKsyunz * Add UT with coverage for `toCursor` serialization. Signed-off-by: Yury-Fridlyand * Fix broken tests in `legacy`. Signed-off-by: Yury-Fridlyand * Fix getting `total` from non-paged requests and from queries without `FROM` clause. Signed-off-by: Yury-Fridlyand * Fix scroll cleaning. Signed-off-by: Yury-Fridlyand * Fix cursor request processing. Signed-off-by: Yury-Fridlyand * Update ITs. Signed-off-by: Yury-Fridlyand * Fix (again) TotalHits feature. Signed-off-by: Yury-Fridlyand * Fix typo in prometheus config. Signed-off-by: Yury-Fridlyand * Recover commented logging. Signed-off-by: Yury-Fridlyand * Move `test_pagination_blackbox` to a separate class and add logging. Signed-off-by: Yury-Fridlyand * Address some PR feedback: rename some classes and revert unnecessary whitespace changes. Signed-off-by: Yury-Fridlyand * Minor commenting. 
Signed-off-by: Yury-Fridlyand * Address PR comments. * Add javadocs * Renames * Cleaning up some comments * Remove unused code * Speed up IT Signed-off-by: Yury-Fridlyand * Minor missing changes. Signed-off-by: Yury-Fridlyand * Integration tests for fetch_size, max_result_window, and query.size_limit (#248) Signed-off-by: MaxKsyunz * Remove `PaginatedQueryService`, extend `QueryService` to hold two planners and use them. Signed-off-by: Yury-Fridlyand * Move push down functions from request builders to a new interface. Signed-off-by: Yury-Fridlyand * Some file moves. Signed-off-by: Yury-Fridlyand * Minor clean-up according to PR review. Signed-off-by: Yury-Fridlyand --------- Signed-off-by: MaxKsyunz Signed-off-by: Yury-Fridlyand Co-authored-by: MaxKsyunz Co-authored-by: GabeFernandez310 Co-authored-by: Max Ksyunz --- core/build.gradle | 1 + .../org/opensearch/sql/analysis/Analyzer.java | 8 + .../sql/ast/AbstractNodeVisitor.java | 5 + .../opensearch/sql/ast/statement/Query.java | 1 + .../org/opensearch/sql/ast/tree/Paginate.java | 48 ++ .../datasource/model/DataSourceMetadata.java | 3 - .../UnsupportedCursorRequestException.java | 12 + .../sql/executor/ExecutionEngine.java | 3 + .../org/opensearch/sql/executor/QueryId.java | 1 + .../opensearch/sql/executor/QueryService.java | 20 +- .../execution/ContinuePaginatedPlan.java | 59 +++ .../sql/executor/execution/PaginatedPlan.java | 53 ++ .../executor/execution/QueryPlanFactory.java | 46 +- .../pagination/CanPaginateVisitor.java | 65 +++ .../sql/executor/pagination/Cursor.java | 29 ++ .../pagination/PaginatedPlanCache.java | 161 ++++++ .../DefaultExpressionSerializer.java | 2 +- .../serialization/ExpressionSerializer.java | 2 +- .../sql/planner/DefaultImplementor.java | 8 +- .../sql/planner/logical/LogicalPaginate.java | 31 ++ .../logical/LogicalPlanNodeVisitor.java | 4 + .../sql/planner/logical/LogicalRelation.java | 6 + .../optimizer/LogicalPlanOptimizer.java | 26 + .../planner/optimizer/pattern/Patterns.java | 11 + 
.../rule/CreatePagingTableScanBuilder.java | 49 ++ .../planner/optimizer/rule/PushPageSize.java | 60 +++ .../sql/planner/physical/FilterOperator.java | 16 +- .../planner/physical/PaginateOperator.java | 84 ++++ .../sql/planner/physical/PhysicalPlan.java | 31 +- .../physical/PhysicalPlanNodeVisitor.java | 4 + .../sql/planner/physical/ProjectOperator.java | 15 + .../sql/planner/physical/ValuesOperator.java | 10 +- .../opensearch/sql/storage/StorageEngine.java | 6 + .../org/opensearch/sql/storage/Table.java | 5 + .../opensearch/sql/analysis/AnalyzerTest.java | 9 + .../sql/executor/QueryServiceTest.java | 12 +- .../execution/ContinuePaginatedPlanTest.java | 96 ++++ .../executor/execution/PaginatedPlanTest.java | 100 ++++ .../execution/QueryPlanFactoryTest.java | 60 ++- .../pagination/CanPaginateVisitorTest.java | 132 +++++ .../sql/executor/pagination/CursorTest.java | 27 ++ .../pagination/PaginatedPlanCacheTest.java | 459 ++++++++++++++++++ .../MicroBatchStreamingExecutionTest.java | 4 +- .../DefaultExpressionSerializerTest.java | 2 + .../sql/planner/DefaultImplementorTest.java | 31 +- .../logical/LogicalPlanNodeVisitorTest.java | 160 ++---- .../optimizer/LogicalPlanOptimizerTest.java | 60 ++- .../optimizer/pattern/PatternsTest.java | 40 +- .../planner/physical/FilterOperatorTest.java | 32 +- .../physical/PaginateOperatorTest.java | 104 ++++ .../physical/PhysicalPlanNodeVisitorTest.java | 8 + .../planner/physical/PhysicalPlanTest.java | 45 +- .../planner/physical/ProjectOperatorTest.java | 22 +- .../planner/physical/RemoveOperatorTest.java | 5 +- .../planner/physical/ValuesOperatorTest.java | 2 + .../sql/storage/StorageEngineTest.java | 7 +- .../org/opensearch/sql/storage/TableTest.java | 25 + .../sql/executor/DefaultExecutionEngine.java | 4 +- integ-test/build.gradle | 15 +- .../org/opensearch/sql/legacy/CursorIT.java | 29 +- .../sql/legacy/SQLIntegTestCase.java | 11 + .../org/opensearch/sql/ppl/StandaloneIT.java | 62 +-- .../sql/sql/HighlightFunctionIT.java | 2 +- 
.../sql/sql/PaginationBlackboxIT.java | 117 +++++ .../sql/sql/PaginationFallbackIT.java | 131 +++++ .../org/opensearch/sql/sql/PaginationIT.java | 48 ++ .../sql/sql/PaginationWindowIT.java | 98 ++++ .../sql/sql/StandalonePaginationIT.java | 168 +++++++ .../sql/util/InternalRestHighLevelClient.java | 19 + .../opensearch/sql/util/StandaloneModule.java | 123 +++++ .../org/opensearch/sql/util/TestUtils.java | 31 +- .../sql/legacy/plugin/RestSQLQueryAction.java | 8 +- .../sql/legacy/plugin/RestSqlAction.java | 18 +- .../RestSQLQueryActionCursorFallbackTest.java | 127 +++++ .../legacy/plugin/RestSQLQueryActionTest.java | 8 +- .../client/OpenSearchNodeClient.java | 11 +- .../client/OpenSearchRestClient.java | 1 - .../executor/OpenSearchExecutionEngine.java | 5 +- .../OpenSearchExecutionProtector.java | 7 + .../protector/ResourceMonitorPlan.java | 10 + .../request/ContinuePageRequest.java | 80 +++ .../request/ContinuePageRequestBuilder.java | 28 ++ .../request/InitialPageRequestBuilder.java | 67 +++ .../request/OpenSearchQueryRequest.java | 8 +- .../opensearch/request/OpenSearchRequest.java | 6 +- .../request/OpenSearchRequestBuilder.java | 34 +- .../request/OpenSearchScrollRequest.java | 32 +- .../request/PagedRequestBuilder.java | 12 + .../request/PushDownRequestBuilder.java | 63 +++ .../response/OpenSearchResponse.java | 10 +- .../opensearch/storage/OpenSearchIndex.java | 19 +- .../storage/OpenSearchStorageEngine.java | 16 + .../{ => scan}/OpenSearchIndexScan.java | 28 +- ...OpenSearchIndexScanAggregationBuilder.java | 3 +- .../scan/OpenSearchIndexScanBuilder.java | 1 - .../scan/OpenSearchIndexScanQueryBuilder.java | 5 +- .../scan/OpenSearchPagedIndexScan.java | 84 ++++ .../scan/OpenSearchPagedIndexScanBuilder.java | 29 ++ .../script/ExpressionScriptEngine.java | 2 +- .../aggregation/AggregationQueryBuilder.java | 4 +- .../dsl/AggregationBuilderHelper.java | 2 +- .../dsl/BucketAggregationBuilder.java | 2 +- .../dsl/MetricAggregationBuilder.java | 2 +- 
.../script/filter/FilterQueryBuilder.java | 2 +- .../system/OpenSearchSystemIndexScan.java | 11 +- .../client/OpenSearchNodeClientTest.java | 54 ++- .../client/OpenSearchRestClientTest.java | 53 +- .../OpenSearchExecutionEngineTest.java | 114 ++++- .../executor/ResourceMonitorPlanTest.java | 12 + .../OpenSearchExecutionProtectorTest.java | 27 +- .../ContinuePageRequestBuilderTest.java | 48 ++ .../request/ContinuePageRequestTest.java | 124 +++++ .../InitialPageRequestBuilderTest.java | 109 +++++ .../request/OpenSearchQueryRequestTest.java | 3 +- .../request/OpenSearchRequestBuilderTest.java | 27 +- .../request/OpenSearchRequestTest.java | 23 + .../request/OpenSearchScrollRequestTest.java | 75 ++- .../request/PushDownRequestBuilderTest.java | 44 ++ .../response/OpenSearchResponseTest.java | 23 +- .../storage/OpenSearchIndexTest.java | 35 +- .../storage/OpenSearchStorageEngineTest.java | 37 +- .../OpenSearchIndexScanOptimizationTest.java | 3 +- .../{ => scan}/OpenSearchIndexScanTest.java | 176 ++++--- .../scan/OpenSearchPagedIndexScanTest.java | 164 +++++++ .../script/ExpressionScriptEngineTest.java | 2 +- .../AggregationQueryBuilderTest.java | 2 +- .../dsl/BucketAggregationBuilderTest.java | 2 +- .../dsl/MetricAggregationBuilderTest.java | 2 +- .../script/filter/FilterQueryBuilderTest.java | 2 +- .../system/OpenSearchSystemIndexScanTest.java | 1 + plugin/build.gradle | 1 + .../org/opensearch/sql/plugin/SQLPlugin.java | 2 +- .../plugin/config/OpenSearchPluginModule.java | 21 +- .../transport/TransportPPLQueryAction.java | 3 +- .../org/opensearch/sql/ppl/PPLService.java | 3 +- .../sql/ppl/parser/AstStatementBuilder.java | 3 +- .../opensearch/sql/ppl/PPLServiceTest.java | 14 +- .../ppl/parser/AstStatementBuilderTest.java | 5 +- .../sql/protocol/response/QueryResult.java | 11 + .../format/JdbcResponseFormatter.java | 8 +- .../protocol/response/QueryResultTest.java | 12 +- .../format/JdbcResponseFormatterTest.java | 32 ++ .../org/opensearch/sql/sql/SQLService.java | 28 +- 
.../sql/sql/domain/SQLQueryRequest.java | 59 ++- .../sql/sql/parser/AstStatementBuilder.java | 3 +- .../opensearch/sql/sql/SQLServiceTest.java | 70 ++- .../sql/sql/domain/SQLQueryRequestTest.java | 186 +++++-- 147 files changed, 4975 insertions(+), 578 deletions(-) create mode 100644 core/src/main/java/org/opensearch/sql/ast/tree/Paginate.java create mode 100644 core/src/main/java/org/opensearch/sql/exception/UnsupportedCursorRequestException.java create mode 100644 core/src/main/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlan.java create mode 100644 core/src/main/java/org/opensearch/sql/executor/execution/PaginatedPlan.java create mode 100644 core/src/main/java/org/opensearch/sql/executor/pagination/CanPaginateVisitor.java create mode 100644 core/src/main/java/org/opensearch/sql/executor/pagination/Cursor.java create mode 100644 core/src/main/java/org/opensearch/sql/executor/pagination/PaginatedPlanCache.java rename {opensearch/src/main/java/org/opensearch/sql/opensearch/storage => core/src/main/java/org/opensearch/sql/expression}/serialization/DefaultExpressionSerializer.java (95%) rename {opensearch/src/main/java/org/opensearch/sql/opensearch/storage => core/src/main/java/org/opensearch/sql/expression}/serialization/ExpressionSerializer.java (90%) create mode 100644 core/src/main/java/org/opensearch/sql/planner/logical/LogicalPaginate.java create mode 100644 core/src/main/java/org/opensearch/sql/planner/optimizer/rule/CreatePagingTableScanBuilder.java create mode 100644 core/src/main/java/org/opensearch/sql/planner/optimizer/rule/PushPageSize.java create mode 100644 core/src/main/java/org/opensearch/sql/planner/physical/PaginateOperator.java create mode 100644 core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java create mode 100644 core/src/test/java/org/opensearch/sql/executor/execution/PaginatedPlanTest.java create mode 100644 
core/src/test/java/org/opensearch/sql/executor/pagination/CanPaginateVisitorTest.java create mode 100644 core/src/test/java/org/opensearch/sql/executor/pagination/CursorTest.java create mode 100644 core/src/test/java/org/opensearch/sql/executor/pagination/PaginatedPlanCacheTest.java rename {opensearch/src/test/java/org/opensearch/sql/opensearch/storage => core/src/test/java/org/opensearch/sql/expression}/serialization/DefaultExpressionSerializerTest.java (94%) create mode 100644 core/src/test/java/org/opensearch/sql/planner/physical/PaginateOperatorTest.java create mode 100644 core/src/test/java/org/opensearch/sql/storage/TableTest.java create mode 100644 integ-test/src/test/java/org/opensearch/sql/sql/PaginationBlackboxIT.java create mode 100644 integ-test/src/test/java/org/opensearch/sql/sql/PaginationFallbackIT.java create mode 100644 integ-test/src/test/java/org/opensearch/sql/sql/PaginationIT.java create mode 100644 integ-test/src/test/java/org/opensearch/sql/sql/PaginationWindowIT.java create mode 100644 integ-test/src/test/java/org/opensearch/sql/sql/StandalonePaginationIT.java create mode 100644 integ-test/src/test/java/org/opensearch/sql/util/InternalRestHighLevelClient.java create mode 100644 integ-test/src/test/java/org/opensearch/sql/util/StandaloneModule.java create mode 100644 legacy/src/test/java/org/opensearch/sql/legacy/plugin/RestSQLQueryActionCursorFallbackTest.java create mode 100644 opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequest.java create mode 100644 opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java create mode 100644 opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java create mode 100644 opensearch/src/main/java/org/opensearch/sql/opensearch/request/PagedRequestBuilder.java create mode 100644 opensearch/src/main/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilder.java rename 
opensearch/src/main/java/org/opensearch/sql/opensearch/storage/{ => scan}/OpenSearchIndexScan.java (75%) create mode 100644 opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScan.java create mode 100644 opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScanBuilder.java create mode 100644 opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilderTest.java create mode 100644 opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestTest.java create mode 100644 opensearch/src/test/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilderTest.java create mode 100644 opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchRequestTest.java create mode 100644 opensearch/src/test/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilderTest.java rename opensearch/src/test/java/org/opensearch/sql/opensearch/storage/{ => scan}/OpenSearchIndexScanTest.java (60%) create mode 100644 opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScanTest.java diff --git a/core/build.gradle b/core/build.gradle index a338b8f3682..624c10fd6b4 100644 --- a/core/build.gradle +++ b/core/build.gradle @@ -57,6 +57,7 @@ dependencies { testImplementation('org.junit.jupiter:junit-jupiter:5.6.2') testImplementation group: 'org.hamcrest', name: 'hamcrest-library', version: '2.1' testImplementation group: 'org.mockito', name: 'mockito-core', version: '3.12.4' + testImplementation group: 'org.mockito', name: 'mockito-inline', version: '3.12.4' testImplementation group: 'org.mockito', name: 'mockito-junit-jupiter', version: '3.12.4' } diff --git a/core/src/main/java/org/opensearch/sql/analysis/Analyzer.java b/core/src/main/java/org/opensearch/sql/analysis/Analyzer.java index ba400207821..0c1be4319bd 100644 --- a/core/src/main/java/org/opensearch/sql/analysis/Analyzer.java +++ 
b/core/src/main/java/org/opensearch/sql/analysis/Analyzer.java @@ -49,6 +49,7 @@ import org.opensearch.sql.ast.tree.Kmeans; import org.opensearch.sql.ast.tree.Limit; import org.opensearch.sql.ast.tree.ML; +import org.opensearch.sql.ast.tree.Paginate; import org.opensearch.sql.ast.tree.Parse; import org.opensearch.sql.ast.tree.Project; import org.opensearch.sql.ast.tree.RareTopN; @@ -83,6 +84,7 @@ import org.opensearch.sql.planner.logical.LogicalLimit; import org.opensearch.sql.planner.logical.LogicalML; import org.opensearch.sql.planner.logical.LogicalMLCommons; +import org.opensearch.sql.planner.logical.LogicalPaginate; import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.logical.LogicalProject; import org.opensearch.sql.planner.logical.LogicalRareTopN; @@ -520,6 +522,12 @@ public LogicalPlan visitML(ML node, AnalysisContext context) { return new LogicalML(child, node.getArguments()); } + @Override + public LogicalPlan visitPaginate(Paginate paginate, AnalysisContext context) { + LogicalPlan child = paginate.getChild().get(0).accept(this, context); + return new LogicalPaginate(paginate.getPageSize(), List.of(child)); + } + /** * The first argument is always "asc", others are optional. * Given nullFirst argument, use its value. Otherwise just use DEFAULT_ASC/DESC. 
diff --git a/core/src/main/java/org/opensearch/sql/ast/AbstractNodeVisitor.java b/core/src/main/java/org/opensearch/sql/ast/AbstractNodeVisitor.java index 393de051649..adcde61d426 100644 --- a/core/src/main/java/org/opensearch/sql/ast/AbstractNodeVisitor.java +++ b/core/src/main/java/org/opensearch/sql/ast/AbstractNodeVisitor.java @@ -47,6 +47,7 @@ import org.opensearch.sql.ast.tree.Kmeans; import org.opensearch.sql.ast.tree.Limit; import org.opensearch.sql.ast.tree.ML; +import org.opensearch.sql.ast.tree.Paginate; import org.opensearch.sql.ast.tree.Parse; import org.opensearch.sql.ast.tree.Project; import org.opensearch.sql.ast.tree.RareTopN; @@ -289,4 +290,8 @@ public T visitQuery(Query node, C context) { public T visitExplain(Explain node, C context) { return visitStatement(node, context); } + + public T visitPaginate(Paginate paginate, C context) { + return visitChildren(paginate, context); + } } diff --git a/core/src/main/java/org/opensearch/sql/ast/statement/Query.java b/core/src/main/java/org/opensearch/sql/ast/statement/Query.java index 17682cd47b9..82efdde4ddc 100644 --- a/core/src/main/java/org/opensearch/sql/ast/statement/Query.java +++ b/core/src/main/java/org/opensearch/sql/ast/statement/Query.java @@ -27,6 +27,7 @@ public class Query extends Statement { protected final UnresolvedPlan plan; + protected final int fetchSize; @Override public R accept(AbstractNodeVisitor visitor, C context) { diff --git a/core/src/main/java/org/opensearch/sql/ast/tree/Paginate.java b/core/src/main/java/org/opensearch/sql/ast/tree/Paginate.java new file mode 100644 index 00000000000..55e0e8c7a6c --- /dev/null +++ b/core/src/main/java/org/opensearch/sql/ast/tree/Paginate.java @@ -0,0 +1,48 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.ast.tree; + +import java.util.List; +import lombok.EqualsAndHashCode; +import lombok.Getter; +import lombok.RequiredArgsConstructor; +import lombok.ToString; +import 
org.opensearch.sql.ast.AbstractNodeVisitor; +import org.opensearch.sql.ast.Node; + +/** + * AST node to represent a pagination operation. + * Actually a wrapper around the AST. + */ +@RequiredArgsConstructor +@EqualsAndHashCode(callSuper = false) +@ToString +public class Paginate extends UnresolvedPlan { + @Getter + private final int pageSize; + private UnresolvedPlan child; + + public Paginate(int pageSize, UnresolvedPlan child) { + this.pageSize = pageSize; + this.child = child; + } + + @Override + public List getChild() { + return List.of(child); + } + + @Override + public T accept(AbstractNodeVisitor nodeVisitor, C context) { + return nodeVisitor.visitPaginate(this, context); + } + + @Override + public UnresolvedPlan attach(UnresolvedPlan child) { + this.child = child; + return this; + } +} diff --git a/core/src/main/java/org/opensearch/sql/datasource/model/DataSourceMetadata.java b/core/src/main/java/org/opensearch/sql/datasource/model/DataSourceMetadata.java index 27d06d81518..7945f8aec3a 100644 --- a/core/src/main/java/org/opensearch/sql/datasource/model/DataSourceMetadata.java +++ b/core/src/main/java/org/opensearch/sql/datasource/model/DataSourceMetadata.java @@ -12,8 +12,6 @@ import com.fasterxml.jackson.annotation.JsonIgnoreProperties; import com.fasterxml.jackson.annotation.JsonProperty; import com.google.common.collect.ImmutableMap; -import com.google.gson.Gson; -import java.io.IOException; import java.util.Collections; import java.util.List; import java.util.Map; @@ -21,7 +19,6 @@ import lombok.EqualsAndHashCode; import lombok.Getter; import lombok.NoArgsConstructor; -import lombok.RequiredArgsConstructor; import lombok.Setter; import org.opensearch.sql.datasource.DataSourceService; diff --git a/core/src/main/java/org/opensearch/sql/exception/UnsupportedCursorRequestException.java b/core/src/main/java/org/opensearch/sql/exception/UnsupportedCursorRequestException.java new file mode 100644 index 00000000000..6ed8e02e5fc --- /dev/null +++ 
b/core/src/main/java/org/opensearch/sql/exception/UnsupportedCursorRequestException.java @@ -0,0 +1,12 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.exception; + +/** + * This should be thrown by the V2 engine to support the fallback scenario. + */ +public class UnsupportedCursorRequestException extends RuntimeException { +} diff --git a/core/src/main/java/org/opensearch/sql/executor/ExecutionEngine.java b/core/src/main/java/org/opensearch/sql/executor/ExecutionEngine.java index 1936a0f5178..8d87bd9b146 100644 --- a/core/src/main/java/org/opensearch/sql/executor/ExecutionEngine.java +++ b/core/src/main/java/org/opensearch/sql/executor/ExecutionEngine.java @@ -14,6 +14,7 @@ import org.opensearch.sql.common.response.ResponseListener; import org.opensearch.sql.data.model.ExprValue; import org.opensearch.sql.data.type.ExprType; +import org.opensearch.sql.executor.pagination.Cursor; import org.opensearch.sql.planner.physical.PhysicalPlan; /** @@ -53,6 +54,8 @@ void execute(PhysicalPlan plan, ExecutionContext context, class QueryResponse { private final Schema schema; private final List results; + private final long total; + private final Cursor cursor; } @Data diff --git a/core/src/main/java/org/opensearch/sql/executor/QueryId.java b/core/src/main/java/org/opensearch/sql/executor/QueryId.java index 933cb5d82dc..43d6aed85eb 100644 --- a/core/src/main/java/org/opensearch/sql/executor/QueryId.java +++ b/core/src/main/java/org/opensearch/sql/executor/QueryId.java @@ -16,6 +16,7 @@ * Query id of {@link AbstractPlan}. */ public class QueryId { + public static final QueryId None = new QueryId(""); /** * Query id. 
*/ diff --git a/core/src/main/java/org/opensearch/sql/executor/QueryService.java b/core/src/main/java/org/opensearch/sql/executor/QueryService.java index 94e70819204..7870b147558 100644 --- a/core/src/main/java/org/opensearch/sql/executor/QueryService.java +++ b/core/src/main/java/org/opensearch/sql/executor/QueryService.java @@ -15,7 +15,9 @@ import org.opensearch.sql.common.response.ResponseListener; import org.opensearch.sql.planner.PlanContext; import org.opensearch.sql.planner.Planner; +import org.opensearch.sql.planner.logical.LogicalPaginate; import org.opensearch.sql.planner.logical.LogicalPlan; +import org.opensearch.sql.planner.optimizer.LogicalPlanOptimizer; import org.opensearch.sql.planner.physical.PhysicalPlan; /** @@ -28,7 +30,15 @@ public class QueryService { private final ExecutionEngine executionEngine; + /** + * There are two planners: one to handle pagination requests (cursor/scroll) only, and + * another one for everything else. + * @see OpenSearchPluginModule#queryPlanFactory (:plugin module) + * @see LogicalPlanOptimizer#paginationCreate + * @see QueryService + */ private final Planner planner; + private final Planner paginationPlanner; /** * Execute the {@link UnresolvedPlan}, using {@link ResponseListener} to get response. @@ -46,6 +56,14 @@ public void execute(UnresolvedPlan plan, } } + /** + * Execute a physical plan without analyzing or planning anything. + */ + public void executePlan(PhysicalPlan plan, + ResponseListener listener) { + executionEngine.execute(plan, ExecutionContext.emptyExecutionContext(), listener); + } + /** * Execute the {@link UnresolvedPlan}, with {@link PlanContext} and using {@link ResponseListener} * to get response. @@ -97,6 +115,6 @@ public LogicalPlan analyze(UnresolvedPlan plan) { * Translate {@link LogicalPlan} to {@link PhysicalPlan}. */ public PhysicalPlan plan(LogicalPlan plan) { - return planner.plan(plan); + return plan instanceof LogicalPaginate ? 
paginationPlanner.plan(plan) : planner.plan(plan); } } diff --git a/core/src/main/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlan.java b/core/src/main/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlan.java new file mode 100644 index 00000000000..03309359a1a --- /dev/null +++ b/core/src/main/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlan.java @@ -0,0 +1,59 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.executor.execution; + +import org.opensearch.sql.common.response.ResponseListener; +import org.opensearch.sql.executor.ExecutionEngine; +import org.opensearch.sql.executor.QueryId; +import org.opensearch.sql.executor.QueryService; +import org.opensearch.sql.executor.pagination.PaginatedPlanCache; +import org.opensearch.sql.planner.physical.PhysicalPlan; + +/** + * ContinuePaginatedPlan represents a cursor request. + * It returns subsequent pages to the user (the 2nd page and all following). + * {@link PaginatedPlan} + */ +public class ContinuePaginatedPlan extends AbstractPlan { + + private final String cursor; + private final QueryService queryService; + private final PaginatedPlanCache paginatedPlanCache; + + private final ResponseListener queryResponseListener; + + + /** + * Create an abstract plan that can continue paginating a given cursor. 
+ */ + public ContinuePaginatedPlan(QueryId queryId, String cursor, QueryService queryService, + PaginatedPlanCache planCache, + ResponseListener + queryResponseListener) { + super(queryId); + this.cursor = cursor; + this.paginatedPlanCache = planCache; + this.queryService = queryService; + this.queryResponseListener = queryResponseListener; + } + + @Override + public void execute() { + try { + PhysicalPlan plan = paginatedPlanCache.convertToPlan(cursor); + queryService.executePlan(plan, queryResponseListener); + } catch (Exception e) { + queryResponseListener.onFailure(e); + } + } + + @Override + public void explain(ResponseListener listener) { + listener.onFailure(new UnsupportedOperationException( + "Explain of a paged query continuation is not supported. " + + "Use `explain` for the initial query request.")); + } +} diff --git a/core/src/main/java/org/opensearch/sql/executor/execution/PaginatedPlan.java b/core/src/main/java/org/opensearch/sql/executor/execution/PaginatedPlan.java new file mode 100644 index 00000000000..5e217f13200 --- /dev/null +++ b/core/src/main/java/org/opensearch/sql/executor/execution/PaginatedPlan.java @@ -0,0 +1,53 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.executor.execution; + +import org.apache.commons.lang3.NotImplementedException; +import org.opensearch.sql.ast.tree.Paginate; +import org.opensearch.sql.ast.tree.UnresolvedPlan; +import org.opensearch.sql.common.response.ResponseListener; +import org.opensearch.sql.executor.ExecutionEngine; +import org.opensearch.sql.executor.QueryId; +import org.opensearch.sql.executor.QueryService; + +/** + * PaginatedPlan represents a page request. Unlike a regular QueryPlan, + * it returns a paged response to the user and a cursor, which allows querying + * the next page. 
+ * {@link ContinuePaginatedPlan} + */ +public class PaginatedPlan extends AbstractPlan { + final UnresolvedPlan plan; + final int fetchSize; + final QueryService queryService; + final ResponseListener + queryResponseResponseListener; + + /** + * Create an abstract plan that can start paging a query. + */ + public PaginatedPlan(QueryId queryId, UnresolvedPlan plan, int fetchSize, + QueryService queryService, + ResponseListener + queryResponseResponseListener) { + super(queryId); + this.plan = plan; + this.fetchSize = fetchSize; + this.queryService = queryService; + this.queryResponseResponseListener = queryResponseResponseListener; + } + + @Override + public void execute() { + queryService.execute(new Paginate(fetchSize, plan), queryResponseResponseListener); + } + + @Override + public void explain(ResponseListener listener) { + listener.onFailure(new NotImplementedException( + "`explain` feature for paginated requests is not implemented yet.")); + } +} diff --git a/core/src/main/java/org/opensearch/sql/executor/execution/QueryPlanFactory.java b/core/src/main/java/org/opensearch/sql/executor/execution/QueryPlanFactory.java index 851381cc7a4..cabbfbff8ea 100644 --- a/core/src/main/java/org/opensearch/sql/executor/execution/QueryPlanFactory.java +++ b/core/src/main/java/org/opensearch/sql/executor/execution/QueryPlanFactory.java @@ -18,9 +18,11 @@ import org.opensearch.sql.ast.statement.Query; import org.opensearch.sql.ast.statement.Statement; import org.opensearch.sql.common.response.ResponseListener; +import org.opensearch.sql.exception.UnsupportedCursorRequestException; import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.executor.QueryId; import org.opensearch.sql.executor.QueryService; +import org.opensearch.sql.executor.pagination.PaginatedPlanCache; /** * QueryExecution Factory. @@ -37,9 +39,10 @@ public class QueryPlanFactory * Query Service. 
*/ private final QueryService queryService; + private final PaginatedPlanCache paginatedPlanCache; /** - * NO_CONSUMER_RESPONSE_LISTENER should never been called. It is only used as constructor + * NO_CONSUMER_RESPONSE_LISTENER should never be called. It is only used as constructor * parameter of {@link QueryPlan}. */ @VisibleForTesting @@ -62,39 +65,62 @@ public void onFailure(Exception e) { /** * Create QueryExecution from Statement. */ - public AbstractPlan create( + public AbstractPlan createContinuePaginatedPlan( Statement statement, Optional> queryListener, Optional> explainListener) { return statement.accept(this, Pair.of(queryListener, explainListener)); } + /** + * Creates a ContinuePaginatedPlan from a cursor. + */ + public AbstractPlan createContinuePaginatedPlan(String cursor, boolean isExplain, + ResponseListener queryResponseListener, + ResponseListener explainListener) { + QueryId queryId = QueryId.queryId(); + var plan = new ContinuePaginatedPlan(queryId, cursor, queryService, + paginatedPlanCache, queryResponseListener); + return isExplain ? new ExplainPlan(queryId, plan, explainListener) : plan; + } + @Override public AbstractPlan visitQuery( Query node, - Pair< - Optional>, - Optional>> + Pair>, + Optional>> context) { Preconditions.checkArgument( context.getLeft().isPresent(), "[BUG] query listener must be not null"); - return new QueryPlan(QueryId.queryId(), node.getPlan(), queryService, context.getLeft().get()); + if (node.getFetchSize() > 0) { + if (paginatedPlanCache.canConvertToCursor(node.getPlan())) { + return new PaginatedPlan(QueryId.queryId(), node.getPlan(), node.getFetchSize(), + queryService, + context.getLeft().get()); + } else { + // This should be picked up by the legacy engine. 
+ throw new UnsupportedCursorRequestException(); + } + } else { + return new QueryPlan(QueryId.queryId(), node.getPlan(), queryService, + context.getLeft().get()); + } } @Override public AbstractPlan visitExplain( Explain node, - Pair< - Optional<ResponseListener<ExecutionEngine.QueryResponse>>, - Optional<ResponseListener<ExecutionEngine.ExplainResponse>>> + Pair<Optional<ResponseListener<ExecutionEngine.QueryResponse>>, + Optional<ResponseListener<ExecutionEngine.ExplainResponse>>> context) { Preconditions.checkArgument( context.getRight().isPresent(), "[BUG] explain listener must be not null"); return new ExplainPlan( QueryId.queryId(), - create(node.getStatement(), Optional.of(NO_CONSUMER_RESPONSE_LISTENER), Optional.empty()), + createContinuePaginatedPlan(node.getStatement(), + Optional.of(NO_CONSUMER_RESPONSE_LISTENER), Optional.empty()), context.getRight().get()); } } diff --git a/core/src/main/java/org/opensearch/sql/executor/pagination/CanPaginateVisitor.java b/core/src/main/java/org/opensearch/sql/executor/pagination/CanPaginateVisitor.java new file mode 100644 index 00000000000..3164794abb6 --- /dev/null +++ b/core/src/main/java/org/opensearch/sql/executor/pagination/CanPaginateVisitor.java @@ -0,0 +1,65 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.executor.pagination; + +import org.opensearch.sql.ast.AbstractNodeVisitor; +import org.opensearch.sql.ast.Node; +import org.opensearch.sql.ast.expression.AllFields; +import org.opensearch.sql.ast.tree.Project; +import org.opensearch.sql.ast.tree.Relation; + +/** + * Use this unresolved plan visitor to check if a plan can be serialized by PaginatedPlanCache. + * If plan.accept(new CanPaginateVisitor(...)) returns true, + * then PaginatedPlanCache.convertToCursor will succeed. Otherwise, it will fail. + * The purpose of this visitor is to activate the legacy engine fallback mechanism. + * Currently, the conditions are: + * - only projection of a relation is supported; + * - the projection only has * (a.k.a. allFields); + * - the relation only scans one table; + * - the table is an OpenSearch index.
+ * So it accepts only queries like `select * from $index`. + * See PaginatedPlanCache.canConvertToCursor for usage. + */ +public class CanPaginateVisitor extends AbstractNodeVisitor<Boolean, Object> { + + @Override + public Boolean visitRelation(Relation node, Object context) { + if (!node.getChild().isEmpty()) { + // Relation instance should never have a child, but check just in case. + return Boolean.FALSE; + } + + return Boolean.TRUE; + } + + @Override + public Boolean visitChildren(Node node, Object context) { + return Boolean.FALSE; + } + + @Override + public Boolean visitProject(Project node, Object context) { + // Allow queries with 'SELECT *' only. This restriction could be removed, but consider + // the in-memory aggregation performed by window functions (see WindowOperator). + // SELECT max(age) OVER (PARTITION BY city) ... + var projections = node.getProjectList(); + if (projections.size() != 1) { + return Boolean.FALSE; + } + + if (!(projections.get(0) instanceof AllFields)) { + return Boolean.FALSE; + } + + var children = node.getChild(); + if (children.size() != 1) { + return Boolean.FALSE; + } + + return children.get(0).accept(this, context); + } +} diff --git a/core/src/main/java/org/opensearch/sql/executor/pagination/Cursor.java b/core/src/main/java/org/opensearch/sql/executor/pagination/Cursor.java new file mode 100644 index 00000000000..0339bec9cad --- /dev/null +++ b/core/src/main/java/org/opensearch/sql/executor/pagination/Cursor.java @@ -0,0 +1,29 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.executor.pagination; + +import lombok.EqualsAndHashCode; +import lombok.Getter; + +@EqualsAndHashCode +public class Cursor { + public static final Cursor None = new Cursor(); + + @Getter + private final byte[] raw; + + private Cursor() { + raw = new byte[] {}; + } + + public Cursor(byte[] raw) { + this.raw = raw; + } + + public String toString() { + return new String(raw); + } +} diff --git
a/core/src/main/java/org/opensearch/sql/executor/pagination/PaginatedPlanCache.java b/core/src/main/java/org/opensearch/sql/executor/pagination/PaginatedPlanCache.java new file mode 100644 index 00000000000..89c008aa662 --- /dev/null +++ b/core/src/main/java/org/opensearch/sql/executor/pagination/PaginatedPlanCache.java @@ -0,0 +1,161 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.executor.pagination; + +import com.google.common.hash.HashCode; +import java.io.ByteArrayInputStream; +import java.io.ByteArrayOutputStream; +import java.io.IOException; +import java.util.ArrayList; +import java.util.List; +import java.util.zip.GZIPInputStream; +import java.util.zip.GZIPOutputStream; +import lombok.RequiredArgsConstructor; +import org.opensearch.sql.ast.tree.UnresolvedPlan; +import org.opensearch.sql.expression.NamedExpression; +import org.opensearch.sql.expression.serialization.DefaultExpressionSerializer; +import org.opensearch.sql.planner.physical.PaginateOperator; +import org.opensearch.sql.planner.physical.PhysicalPlan; +import org.opensearch.sql.planner.physical.ProjectOperator; +import org.opensearch.sql.storage.StorageEngine; +import org.opensearch.sql.storage.TableScanOperator; + +/** + * This class is the entry point for paged requests. It is responsible for cursor serialization + * and deserialization. + */ +@RequiredArgsConstructor +public class PaginatedPlanCache { + public static final String CURSOR_PREFIX = "n:"; + private final StorageEngine storageEngine; + + public boolean canConvertToCursor(UnresolvedPlan plan) { + return plan.accept(new CanPaginateVisitor(), null); + } + + /** + * Converts a physical plan tree to a cursor. May cache plan-related data somewhere.
+ */ + public Cursor convertToCursor(PhysicalPlan plan) throws IOException { + if (plan instanceof PaginateOperator) { + var cursor = plan.toCursor(); + if (cursor == null) { + return Cursor.None; + } + var raw = CURSOR_PREFIX + compress(cursor); + return new Cursor(raw.getBytes()); + } + return Cursor.None; + } + + /** + * Compress a serialized query plan. + * @param str string representing a query plan + * @return str compressed with gzip and encoded as a hex string. + */ + String compress(String str) throws IOException { + if (str == null || str.length() == 0) { + return ""; + } + ByteArrayOutputStream out = new ByteArrayOutputStream(); + GZIPOutputStream gzip = new GZIPOutputStream(out); + gzip.write(str.getBytes()); + gzip.close(); + return HashCode.fromBytes(out.toByteArray()).toString(); + } + + /** + * Decompresses a query plan that was compressed with {@link PaginatedPlanCache#compress}. + * @param input compressed query plan + * @return decompressed string + */ + String decompress(String input) throws IOException { + if (input == null || input.length() == 0) { + return ""; + } + GZIPInputStream gzip = new GZIPInputStream(new ByteArrayInputStream( + HashCode.fromString(input).asBytes())); + return new String(gzip.readAllBytes()); + } + + /** + * Parse `NamedExpression`s from cursor. + * @param listToFill List to fill with data. + * @param cursor Cursor to parse. + * @return Remaining part of the cursor. + */ + private String parseNamedExpressions(List<NamedExpression> listToFill, String cursor) { + var serializer = new DefaultExpressionSerializer(); + if (cursor.startsWith(")")) { // empty list + return cursor.substring(cursor.indexOf(',') + 1); + } + while (!cursor.startsWith("(")) { + listToFill.add((NamedExpression) + serializer.deserialize(cursor.substring(0, + Math.min(cursor.indexOf(','), cursor.indexOf(')'))))); + cursor = cursor.substring(cursor.indexOf(',') + 1); + } + return cursor; + } + + /** + * Converts a cursor to a physical plan tree.
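The `compress`/`decompress` pair above gzips the serialized plan and then hex-encodes the bytes with Guava's `HashCode`. A dependency-free sketch of the same round trip (the hex step is done by hand here, and the class and method names are illustrative, not part of the patch):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class CursorCodecSketch {

    // Gzip the serialized plan, then hex-encode the bytes so the cursor is printable.
    // PaginatedPlanCache uses HashCode.fromBytes(...).toString() for the hex step.
    public static String compressToHex(String plan) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gzip = new GZIPOutputStream(out)) {
            gzip.write(plan.getBytes());
        }
        StringBuilder hex = new StringBuilder();
        for (byte b : out.toByteArray()) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    // Reverse of compressToHex: hex-decode, then gunzip back to the plan string.
    public static String decompressFromHex(String hex) throws IOException {
        byte[] bytes = new byte[hex.length() / 2];
        for (int i = 0; i < bytes.length; i++) {
            bytes[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        try (GZIPInputStream gzip = new GZIPInputStream(new ByteArrayInputStream(bytes))) {
            return new String(gzip.readAllBytes());
        }
    }

    public static void main(String[] args) throws IOException {
        String plan = "(Paginate,1,5,(Project,...))";
        // The "n:" prefix mirrors PaginatedPlanCache.CURSOR_PREFIX.
        String cursor = "n:" + compressToHex(plan);
        if (!decompressFromHex(cursor.substring(2)).equals(plan)) {
            throw new AssertionError("round trip failed");
        }
        System.out.println("round trip ok");
    }
}
```

Note that `HashCode.fromString` requires an even-length lowercase hex string, which is why the real `compress` never emits anything else.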
+ */ + public PhysicalPlan convertToPlan(String cursor) { + if (!cursor.startsWith(CURSOR_PREFIX)) { + throw new UnsupportedOperationException("Unsupported cursor"); + } + try { + cursor = cursor.substring(CURSOR_PREFIX.length()); + cursor = decompress(cursor); + + // TODO Parse with ANTLR or serialize as JSON/XML + if (!cursor.startsWith("(Paginate,")) { + throw new UnsupportedOperationException("Unsupported cursor"); + } + // TODO add checks for > 0 + cursor = cursor.substring(cursor.indexOf(',') + 1); + final int currentPageIndex = Integer.parseInt(cursor, 0, cursor.indexOf(','), 10); + + cursor = cursor.substring(cursor.indexOf(',') + 1); + final int pageSize = Integer.parseInt(cursor, 0, cursor.indexOf(','), 10); + + cursor = cursor.substring(cursor.indexOf(',') + 1); + if (!cursor.startsWith("(Project,")) { + throw new UnsupportedOperationException("Unsupported cursor"); + } + cursor = cursor.substring(cursor.indexOf(',') + 1); + if (!cursor.startsWith("(namedParseExpressions,")) { + throw new UnsupportedOperationException("Unsupported cursor"); + } + + cursor = cursor.substring(cursor.indexOf(',') + 1); + List namedParseExpressions = new ArrayList<>(); + cursor = parseNamedExpressions(namedParseExpressions, cursor); + + List projectList = new ArrayList<>(); + if (!cursor.startsWith("(projectList,")) { + throw new UnsupportedOperationException("Unsupported cursor"); + } + cursor = cursor.substring(cursor.indexOf(',') + 1); + cursor = parseNamedExpressions(projectList, cursor); + + if (!cursor.startsWith("(OpenSearchPagedIndexScan,")) { + throw new UnsupportedOperationException("Unsupported cursor"); + } + cursor = cursor.substring(cursor.indexOf(',') + 1); + var indexName = cursor.substring(0, cursor.indexOf(',')); + cursor = cursor.substring(cursor.indexOf(',') + 1); + var scrollId = cursor.substring(0, cursor.indexOf(')')); + TableScanOperator scan = storageEngine.getTableScan(indexName, scrollId); + + return new PaginateOperator(new ProjectOperator(scan, 
projectList, namedParseExpressions), + pageSize, currentPageIndex); + } catch (Exception e) { + throw new UnsupportedOperationException("Unsupported cursor", e); + } + } +} diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/serialization/DefaultExpressionSerializer.java b/core/src/main/java/org/opensearch/sql/expression/serialization/DefaultExpressionSerializer.java similarity index 95% rename from opensearch/src/main/java/org/opensearch/sql/opensearch/storage/serialization/DefaultExpressionSerializer.java rename to core/src/main/java/org/opensearch/sql/expression/serialization/DefaultExpressionSerializer.java index dc67da9de5d..33c22b2ea5d 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/serialization/DefaultExpressionSerializer.java +++ b/core/src/main/java/org/opensearch/sql/expression/serialization/DefaultExpressionSerializer.java @@ -4,7 +4,7 @@ */ -package org.opensearch.sql.opensearch.storage.serialization; +package org.opensearch.sql.expression.serialization; import java.io.ByteArrayInputStream; import java.io.ByteArrayOutputStream; diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/serialization/ExpressionSerializer.java b/core/src/main/java/org/opensearch/sql/expression/serialization/ExpressionSerializer.java similarity index 90% rename from opensearch/src/main/java/org/opensearch/sql/opensearch/storage/serialization/ExpressionSerializer.java rename to core/src/main/java/org/opensearch/sql/expression/serialization/ExpressionSerializer.java index b7caeb30f81..f96921e29c3 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/serialization/ExpressionSerializer.java +++ b/core/src/main/java/org/opensearch/sql/expression/serialization/ExpressionSerializer.java @@ -4,7 +4,7 @@ */ -package org.opensearch.sql.opensearch.storage.serialization; +package org.opensearch.sql.expression.serialization; import org.opensearch.sql.expression.Expression; diff --git 
a/core/src/main/java/org/opensearch/sql/planner/DefaultImplementor.java b/core/src/main/java/org/opensearch/sql/planner/DefaultImplementor.java index 4a6d4d82222..43422d87336 100644 --- a/core/src/main/java/org/opensearch/sql/planner/DefaultImplementor.java +++ b/core/src/main/java/org/opensearch/sql/planner/DefaultImplementor.java @@ -11,6 +11,7 @@ import org.opensearch.sql.planner.logical.LogicalEval; import org.opensearch.sql.planner.logical.LogicalFilter; import org.opensearch.sql.planner.logical.LogicalLimit; +import org.opensearch.sql.planner.logical.LogicalPaginate; import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.logical.LogicalPlanNodeVisitor; import org.opensearch.sql.planner.logical.LogicalProject; @@ -26,6 +27,7 @@ import org.opensearch.sql.planner.physical.EvalOperator; import org.opensearch.sql.planner.physical.FilterOperator; import org.opensearch.sql.planner.physical.LimitOperator; +import org.opensearch.sql.planner.physical.PaginateOperator; import org.opensearch.sql.planner.physical.PhysicalPlan; import org.opensearch.sql.planner.physical.ProjectOperator; import org.opensearch.sql.planner.physical.RareTopNOperator; @@ -125,6 +127,11 @@ public PhysicalPlan visitLimit(LogicalLimit node, C context) { return new LimitOperator(visitChild(node, context), node.getLimit(), node.getOffset()); } + @Override + public PhysicalPlan visitPaginate(LogicalPaginate plan, C context) { + return new PaginateOperator(visitChild(plan, context), plan.getPageSize()); + } + @Override public PhysicalPlan visitTableScanBuilder(TableScanBuilder plan, C context) { return plan.build(); @@ -145,5 +152,4 @@ protected PhysicalPlan visitChild(LogicalPlan node, C context) { // Logical operators visited here must have a single child return node.getChild().get(0).accept(this, context); } - } diff --git a/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPaginate.java 
b/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPaginate.java new file mode 100644 index 00000000000..372f9dcf0b6 --- /dev/null +++ b/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPaginate.java @@ -0,0 +1,31 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.planner.logical; + +import java.util.List; +import lombok.EqualsAndHashCode; +import lombok.Getter; +import lombok.ToString; + +/** + * LogicalPaginate represents the pagination operation for the underlying plan. + */ +@ToString +@EqualsAndHashCode(callSuper = false) +public class LogicalPaginate extends LogicalPlan { + @Getter + private final int pageSize; + + public LogicalPaginate(int pageSize, List<LogicalPlan> childPlans) { + super(childPlans); + this.pageSize = pageSize; + } + + @Override + public <R, C> R accept(LogicalPlanNodeVisitor<R, C> visitor, C context) { + return visitor.visitPaginate(this, context); + } +} diff --git a/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPlanNodeVisitor.java b/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPlanNodeVisitor.java index 9a41072fe7b..28cf6bcd792 100644 --- a/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPlanNodeVisitor.java +++ b/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPlanNodeVisitor.java @@ -100,4 +100,8 @@ public R visitML(LogicalML plan, C context) { public R visitAD(LogicalAD plan, C context) { return visitNode(plan, context); } + + public R visitPaginate(LogicalPaginate plan, C context) { + return visitNode(plan, context); + } } diff --git a/core/src/main/java/org/opensearch/sql/planner/logical/LogicalRelation.java b/core/src/main/java/org/opensearch/sql/planner/logical/LogicalRelation.java index a49c3d5cbe3..0ece74690e7 100644 --- a/core/src/main/java/org/opensearch/sql/planner/logical/LogicalRelation.java +++ b/core/src/main/java/org/opensearch/sql/planner/logical/LogicalRelation.java @@ -9,6 +9,7 @@ import
com.google.common.collect.ImmutableList; import lombok.EqualsAndHashCode; import lombok.Getter; +import lombok.Setter; import lombok.ToString; import org.opensearch.sql.storage.Table; @@ -25,6 +26,10 @@ public class LogicalRelation extends LogicalPlan { @Getter private final Table table; + @Getter + @Setter + private Integer pageSize; + /** * Constructor of LogicalRelation. */ @@ -32,6 +37,7 @@ public LogicalRelation(String relationName, Table table) { super(ImmutableList.of()); this.relationName = relationName; this.table = table; + this.pageSize = null; } @Override diff --git a/core/src/main/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizer.java b/core/src/main/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizer.java index 70847b869b5..13bcfabe74d 100644 --- a/core/src/main/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizer.java +++ b/core/src/main/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizer.java @@ -13,8 +13,10 @@ import java.util.List; import java.util.stream.Collectors; import org.opensearch.sql.planner.logical.LogicalPlan; +import org.opensearch.sql.planner.optimizer.rule.CreatePagingTableScanBuilder; import org.opensearch.sql.planner.optimizer.rule.MergeFilterAndFilter; import org.opensearch.sql.planner.optimizer.rule.PushFilterUnderSort; +import org.opensearch.sql.planner.optimizer.rule.PushPageSize; import org.opensearch.sql.planner.optimizer.rule.read.CreateTableScanBuilder; import org.opensearch.sql.planner.optimizer.rule.read.TableScanPushDown; import org.opensearch.sql.planner.optimizer.rule.write.CreateTableWriteBuilder; @@ -60,6 +62,30 @@ public static LogicalPlanOptimizer create() { new CreateTableWriteBuilder())); } + /** + * Create {@link LogicalPlanOptimizer} with pre-defined rules for paginated requests.
+ */ + public static LogicalPlanOptimizer paginationCreate() { + return new LogicalPlanOptimizer(Arrays.asList( + /* + * Phase 1: Transformations that rely on relational algebra equivalence + */ + new MergeFilterAndFilter(), + new PushFilterUnderSort(), + /* + * Phase 2: Transformations that rely on data source push down capability + */ + new PushPageSize(), + new CreatePagingTableScanBuilder(), + TableScanPushDown.PUSH_DOWN_FILTER, + TableScanPushDown.PUSH_DOWN_AGGREGATION, + TableScanPushDown.PUSH_DOWN_SORT, + TableScanPushDown.PUSH_DOWN_LIMIT, + TableScanPushDown.PUSH_DOWN_HIGHLIGHT, + TableScanPushDown.PUSH_DOWN_PROJECT, + new CreateTableWriteBuilder())); + } + /** * Optimize {@link LogicalPlan}. */ diff --git a/core/src/main/java/org/opensearch/sql/planner/optimizer/pattern/Patterns.java b/core/src/main/java/org/opensearch/sql/planner/optimizer/pattern/Patterns.java index 856d8df7ead..6e548975063 100644 --- a/core/src/main/java/org/opensearch/sql/planner/optimizer/pattern/Patterns.java +++ b/core/src/main/java/org/opensearch/sql/planner/optimizer/pattern/Patterns.java @@ -16,6 +16,7 @@ import org.opensearch.sql.planner.logical.LogicalFilter; import org.opensearch.sql.planner.logical.LogicalHighlight; import org.opensearch.sql.planner.logical.LogicalLimit; +import org.opensearch.sql.planner.logical.LogicalPaginate; import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.logical.LogicalProject; import org.opensearch.sql.planner.logical.LogicalRelation; @@ -112,6 +113,16 @@ public static Property table() { : Optional.empty()); } + /** + * Logical pagination with page size. + */ + public static Property pagination() { + return Property.optionalProperty("pagination", + plan -> plan instanceof LogicalPaginate + ? Optional.of(((LogicalPaginate) plan).getPageSize()) + : Optional.empty()); + } + /** * Logical write with table field. 
*/ diff --git a/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/CreatePagingTableScanBuilder.java b/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/CreatePagingTableScanBuilder.java new file mode 100644 index 00000000000..22079ed9cac --- /dev/null +++ b/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/CreatePagingTableScanBuilder.java @@ -0,0 +1,49 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.planner.optimizer.rule; + +import static org.opensearch.sql.planner.optimizer.pattern.Patterns.table; + +import com.facebook.presto.matching.Capture; +import com.facebook.presto.matching.Captures; +import com.facebook.presto.matching.Pattern; +import lombok.Getter; +import lombok.experimental.Accessors; +import org.opensearch.sql.planner.logical.LogicalPlan; +import org.opensearch.sql.planner.logical.LogicalRelation; +import org.opensearch.sql.planner.optimizer.Rule; +import org.opensearch.sql.storage.Table; +import org.opensearch.sql.storage.read.TableScanBuilder; + +/** + * Rule to create a paged TableScanBuilder in a pagination request. + */ +public class CreatePagingTableScanBuilder implements Rule<LogicalRelation> { + /** Capture the table inside the matched logical relation operator. */ + private final Capture<Table> capture; + + /** Pattern that matches the logical relation operator. */ + @Accessors(fluent = true) + @Getter + private final Pattern<LogicalRelation> pattern; + + /** + * Constructor. + */ + public CreatePagingTableScanBuilder() { + this.capture = Capture.newCapture(); + this.pattern = Pattern.typeOf(LogicalRelation.class) + .with(table().capturedAs(capture)); + } + + @Override + public LogicalPlan apply(LogicalRelation plan, Captures captures) { + TableScanBuilder scanBuilder = captures.get(capture) + .createPagedScanBuilder(plan.getPageSize()); + // TODO: Remove this after Prometheus is refactored to the new table scan builder too + return (scanBuilder == null) ?
plan : scanBuilder; + } +} diff --git a/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/PushPageSize.java b/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/PushPageSize.java new file mode 100644 index 00000000000..95cd23d6ca0 --- /dev/null +++ b/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/PushPageSize.java @@ -0,0 +1,60 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.planner.optimizer.rule; + +import static org.opensearch.sql.planner.optimizer.pattern.Patterns.pagination; + +import com.facebook.presto.matching.Capture; +import com.facebook.presto.matching.Captures; +import com.facebook.presto.matching.Pattern; +import lombok.Getter; +import lombok.experimental.Accessors; +import org.opensearch.sql.planner.logical.LogicalPaginate; +import org.opensearch.sql.planner.logical.LogicalPlan; +import org.opensearch.sql.planner.logical.LogicalRelation; +import org.opensearch.sql.planner.optimizer.Rule; + +public class PushPageSize + implements Rule<LogicalPaginate> { + /** Capture the page size inside the matched logical paginate operator. */ + private final Capture<Integer> capture; + + /** Pattern that matches the logical paginate operator. */ + @Accessors(fluent = true) + @Getter + private final Pattern<LogicalPaginate> pattern; + + /** + * Constructor. + */ + public PushPageSize() { + this.capture = Capture.newCapture(); + this.pattern = Pattern.typeOf(LogicalPaginate.class) + .with(pagination().capturedAs(capture)); + } + + private LogicalRelation findLogicalRelation(LogicalPlan plan) { //TODO TBD multiple relations?
+ for (var subplan : plan.getChild()) { + if (subplan instanceof LogicalRelation) { + return (LogicalRelation) subplan; + } + var found = findLogicalRelation(subplan); + if (found != null) { + return found; + } + } + return null; + } + + @Override + public LogicalPlan apply(LogicalPaginate plan, Captures captures) { + var relation = findLogicalRelation(plan); + if (relation != null) { + relation.setPageSize(captures.get(capture)); + } + return plan; + } +} diff --git a/core/src/main/java/org/opensearch/sql/planner/physical/FilterOperator.java b/core/src/main/java/org/opensearch/sql/planner/physical/FilterOperator.java index 86cd411a2da..a9c7597c3e5 100644 --- a/core/src/main/java/org/opensearch/sql/planner/physical/FilterOperator.java +++ b/core/src/main/java/org/opensearch/sql/planner/physical/FilterOperator.java @@ -17,8 +17,9 @@ import org.opensearch.sql.storage.bindingtuple.BindingTuple; /** - * The Filter operator use the conditions to evaluate the input {@link BindingTuple}. - * The Filter operator only return the results that evaluated to true. + * The Filter operator represents the WHERE clause and + * uses the conditions to evaluate the input {@link BindingTuple}. + * The Filter operator only returns the results that evaluate to true. * The NULL and MISSING are handled by the logic defined in {@link BinaryPredicateOperator}.
*/ @EqualsAndHashCode(callSuper = false) @@ -29,7 +30,9 @@ public class FilterOperator extends PhysicalPlan { private final PhysicalPlan input; @Getter private final Expression conditions; - @ToString.Exclude private ExprValue next = null; + @ToString.Exclude + private ExprValue next = null; + private long totalHits = 0; @Override public R accept(PhysicalPlanNodeVisitor visitor, C context) { @@ -48,6 +51,7 @@ public boolean hasNext() { ExprValue exprValue = conditions.valueOf(inputValue.bindingTuples()); if (!(exprValue.isNull() || exprValue.isMissing()) && (exprValue.booleanValue())) { next = inputValue; + totalHits++; return true; } } @@ -58,4 +62,10 @@ public boolean hasNext() { public ExprValue next() { return next; } + + @Override + public long getTotalHits() { + // ignore `input.getTotalHits()`, because it returns wrong (unfiltered) value + return totalHits; + } } diff --git a/core/src/main/java/org/opensearch/sql/planner/physical/PaginateOperator.java b/core/src/main/java/org/opensearch/sql/planner/physical/PaginateOperator.java new file mode 100644 index 00000000000..97901def0fe --- /dev/null +++ b/core/src/main/java/org/opensearch/sql/planner/physical/PaginateOperator.java @@ -0,0 +1,84 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.planner.physical; + +import java.util.List; +import lombok.EqualsAndHashCode; +import lombok.Getter; +import lombok.RequiredArgsConstructor; +import org.opensearch.sql.data.model.ExprValue; +import org.opensearch.sql.executor.ExecutionEngine; +import org.opensearch.sql.planner.physical.PhysicalPlan; +import org.opensearch.sql.planner.physical.PhysicalPlanNodeVisitor; +import org.opensearch.sql.planner.physical.ProjectOperator; + +@RequiredArgsConstructor +@EqualsAndHashCode(callSuper = false) +public class PaginateOperator extends PhysicalPlan { + @Getter + private final PhysicalPlan input; + + @Getter + private final int pageSize; + + /** + * Which page 
is this? + * May not be necessary in the end. Currently used to increment the "cursor counter" -- + * See usage. + */ + @Getter + private final int pageIndex; + + int numReturned = 0; + + /** + * Page given physical plan, with pageSize elements per page, starting with the first page. + */ + public PaginateOperator(PhysicalPlan input, int pageSize) { + this.pageSize = pageSize; + this.input = input; + this.pageIndex = 0; + } + + @Override + public R accept(PhysicalPlanNodeVisitor visitor, C context) { + return visitor.visitPaginate(this, context); + } + + @Override + public boolean hasNext() { + return numReturned < pageSize && input.hasNext(); + } + + @Override + public ExprValue next() { + numReturned += 1; + return input.next(); + } + + public List getChild() { + return List.of(input); + } + + @Override + public ExecutionEngine.Schema schema() { + return input.schema(); + } + + @Override + public String toCursor() { + // Save cursor to read the next page. + // Could process node.getChild() here with another visitor -- one that saves the + // parameters for other physical operators -- ProjectOperator, etc. + // cursor format: n:|" + String child = getChild().get(0).toCursor(); + + var nextPage = getPageIndex() + 1; + return child == null || child.isEmpty() + ? 
null : createSection("Paginate", Integer.toString(nextPage), + Integer.toString(getPageSize()), child); + } +} diff --git a/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlan.java b/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlan.java index b476b015577..312e4bfff9a 100644 --- a/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlan.java +++ b/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlan.java @@ -7,6 +7,7 @@ package org.opensearch.sql.planner.physical; import java.util.Iterator; +import java.util.List; import org.opensearch.sql.data.model.ExprValue; import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.planner.PlanNode; @@ -43,6 +44,34 @@ public void add(Split split) { public ExecutionEngine.Schema schema() { throw new IllegalStateException(String.format("[BUG] schema can been only applied to " - + "ProjectOperator, instead of %s", toString())); + + "ProjectOperator, instead of %s", this.getClass().getSimpleName())); + } + + /** + * Returns the total number of hits that matched the search criteria. + * Note: the query may return fewer rows if limited (see Settings#QUERY_SIZE_LIMIT). + * Any plan which adds or removes rows from the response should override it + * to provide valid values. + * + * @return Total hits that matched the search criteria. + */ + public long getTotalHits() { + return getChild().stream().mapToLong(PhysicalPlan::getTotalHits).max().orElse(0); + } + + public String toCursor() { + throw new IllegalStateException(String.format("%s is not compatible with cursor feature", + this.getClass().getSimpleName())); + } + + /** + * Creates an S-expression that represents a plan node. + * @param plan Label for the plan. + * @param params List of serialized parameters, including the child plans. + * @return A string that represents the plan called with those parameters. + */ + protected String createSection(String plan, String...
params) { + return "(" + plan + "," + + String.join(",", params) + + ")"; } } diff --git a/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitor.java b/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitor.java index d4bc4a1ea9f..f8b6f2243e0 100644 --- a/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitor.java +++ b/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitor.java @@ -88,4 +88,8 @@ public R visitAD(PhysicalPlan node, C context) { public R visitML(PhysicalPlan node, C context) { return visitNode(node, context); } + + public R visitPaginate(PaginateOperator node, C context) { + return visitNode(node, context); + } } diff --git a/core/src/main/java/org/opensearch/sql/planner/physical/ProjectOperator.java b/core/src/main/java/org/opensearch/sql/planner/physical/ProjectOperator.java index 496e4e6ddb1..c61b35e0cb6 100644 --- a/core/src/main/java/org/opensearch/sql/planner/physical/ProjectOperator.java +++ b/core/src/main/java/org/opensearch/sql/planner/physical/ProjectOperator.java @@ -22,6 +22,7 @@ import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.expression.NamedExpression; import org.opensearch.sql.expression.parse.ParseExpression; +import org.opensearch.sql.expression.serialization.DefaultExpressionSerializer; /** * Project the fields specified in {@link ProjectOperator#projectList} from input. 
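The `toCursor` overrides in this patch compose nested sections through `PhysicalPlan.createSection` shown above: a label plus comma-joined parameters in parentheses. A standalone sketch of how the sections nest (the index name, scroll id, and expression strings are made up for illustration; the real expression strings come from DefaultExpressionSerializer):

```java
public class SectionSketch {
    // Same shape as PhysicalPlan.createSection: "(" + label + "," + joined params + ")".
    static String createSection(String plan, String... params) {
        return "(" + plan + "," + String.join(",", params) + ")";
    }

    public static void main(String[] args) {
        // Leaf first: the paged index scan carries the index name and scroll id.
        String scan = createSection("OpenSearchPagedIndexScan", "my-index", "scroll-id-1");
        // Project wraps the scan and carries the serialized expression lists.
        String project = createSection("Project",
            createSection("namedParseExpressions"), // empty list serializes as "(namedParseExpressions,)"
            createSection("projectList", "expr1"),
            scan);
        // Paginate is the root: next page index, page size, then the child section.
        String cursor = createSection("Paginate", "1", "5", project);
        System.out.println(cursor);
    }
}
```

`PaginatedPlanCache.convertToPlan` consumes exactly this format by stripping one labeled prefix at a time; the TODO in that method proposes replacing the hand-rolled format with ANTLR, JSON, or XML.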
@@ -94,4 +95,18 @@ public ExecutionEngine.Schema schema() { .map(expr -> new ExecutionEngine.Schema.Column(expr.getName(), expr.getAlias(), expr.type())).collect(Collectors.toList())); } + + @Override + public String toCursor() { + String child = getChild().get(0).toCursor(); + if (child == null || child.isEmpty()) { + return null; + } + var serializer = new DefaultExpressionSerializer(); + String projects = createSection("projectList", + projectList.stream().map(serializer::serialize).toArray(String[]::new)); + String namedExpressions = createSection("namedParseExpressions", + namedParseExpressions.stream().map(serializer::serialize).toArray(String[]::new)); + return createSection("Project", namedExpressions, projects, child); + } } diff --git a/core/src/main/java/org/opensearch/sql/planner/physical/ValuesOperator.java b/core/src/main/java/org/opensearch/sql/planner/physical/ValuesOperator.java index 51d2850df72..45884830e10 100644 --- a/core/src/main/java/org/opensearch/sql/planner/physical/ValuesOperator.java +++ b/core/src/main/java/org/opensearch/sql/planner/physical/ValuesOperator.java @@ -15,6 +15,7 @@ import lombok.ToString; import org.opensearch.sql.data.model.ExprCollectionValue; import org.opensearch.sql.data.model.ExprValue; +import org.opensearch.sql.expression.Expression; import org.opensearch.sql.expression.LiteralExpression; /** @@ -55,10 +56,17 @@ public boolean hasNext() { return valuesIterator.hasNext(); } + @Override + public long getTotalHits() { + // ValuesOperator is used for queries without a `FROM` clause, e.g. `select 1`. + // Such queries always return 1 row.
+ return 1; + } + @Override public ExprValue next() { List values = valuesIterator.next().stream() - .map(expr -> expr.valueOf()) + .map(Expression::valueOf) .collect(Collectors.toList()); return new ExprCollectionValue(values); } diff --git a/core/src/main/java/org/opensearch/sql/storage/StorageEngine.java b/core/src/main/java/org/opensearch/sql/storage/StorageEngine.java index 246a50ea093..18e9e92886c 100644 --- a/core/src/main/java/org/opensearch/sql/storage/StorageEngine.java +++ b/core/src/main/java/org/opensearch/sql/storage/StorageEngine.java @@ -8,7 +8,9 @@ import java.util.Collection; import java.util.Collections; +import java.util.List; import org.opensearch.sql.DataSourceSchemaName; +import org.opensearch.sql.data.type.ExprType; import org.opensearch.sql.expression.function.FunctionResolver; /** @@ -30,4 +32,8 @@ default Collection getFunctions() { return Collections.emptyList(); } + default TableScanOperator getTableScan(String indexName, String scrollId) { + String error = String.format("%s.getTableScan needs to be implemented", getClass()); + throw new UnsupportedOperationException(error); + } } diff --git a/core/src/main/java/org/opensearch/sql/storage/Table.java b/core/src/main/java/org/opensearch/sql/storage/Table.java index 496281fa8d7..a7f2b606ca9 100644 --- a/core/src/main/java/org/opensearch/sql/storage/Table.java +++ b/core/src/main/java/org/opensearch/sql/storage/Table.java @@ -92,4 +92,9 @@ default TableWriteBuilder createWriteBuilder(LogicalWrite plan) { default StreamingSource asStreamingSource() { throw new UnsupportedOperationException(); } + + default TableScanBuilder createPagedScanBuilder(int pageSize) { + var error = String.format("'%s' does not support pagination", getClass().toString()); + throw new UnsupportedOperationException(error); + } } diff --git a/core/src/test/java/org/opensearch/sql/analysis/AnalyzerTest.java b/core/src/test/java/org/opensearch/sql/analysis/AnalyzerTest.java index 1db29a6a42c..01e2091da93 100644 --- 
a/core/src/test/java/org/opensearch/sql/analysis/AnalyzerTest.java +++ b/core/src/test/java/org/opensearch/sql/analysis/AnalyzerTest.java @@ -75,6 +75,7 @@ import org.opensearch.sql.ast.tree.AD; import org.opensearch.sql.ast.tree.Kmeans; import org.opensearch.sql.ast.tree.ML; +import org.opensearch.sql.ast.tree.Paginate; import org.opensearch.sql.ast.tree.RareTopN.CommandType; import org.opensearch.sql.exception.ExpressionEvaluationException; import org.opensearch.sql.exception.SemanticCheckException; @@ -83,6 +84,7 @@ import org.opensearch.sql.expression.window.WindowDefinition; import org.opensearch.sql.planner.logical.LogicalAD; import org.opensearch.sql.planner.logical.LogicalMLCommons; +import org.opensearch.sql.planner.logical.LogicalPaginate; import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.logical.LogicalPlanDSL; import org.opensearch.sql.planner.logical.LogicalProject; @@ -1189,4 +1191,11 @@ public void ml_relation_predict_rcf_without_time_field() { assertTrue(((LogicalProject) actual).getProjectList() .contains(DSL.named(RCF_ANOMALOUS, DSL.ref(RCF_ANOMALOUS, BOOLEAN)))); } + + @Test + public void visit_paginate() { + LogicalPlan actual = analyze(new Paginate(10, AstDSL.relation("dummy"))); + assertTrue(actual instanceof LogicalPaginate); + assertEquals(10, ((LogicalPaginate) actual).getPageSize()); + } } diff --git a/core/src/test/java/org/opensearch/sql/executor/QueryServiceTest.java b/core/src/test/java/org/opensearch/sql/executor/QueryServiceTest.java index 4df38027f4f..e3e744d8ec4 100644 --- a/core/src/test/java/org/opensearch/sql/executor/QueryServiceTest.java +++ b/core/src/test/java/org/opensearch/sql/executor/QueryServiceTest.java @@ -15,11 +15,9 @@ import static org.mockito.Mockito.doAnswer; import static org.mockito.Mockito.doThrow; import static org.mockito.Mockito.lenient; -import static org.mockito.Mockito.when; import java.util.Collections; import java.util.Optional; -import 
org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; @@ -27,6 +25,7 @@ import org.opensearch.sql.analysis.Analyzer; import org.opensearch.sql.ast.tree.UnresolvedPlan; import org.opensearch.sql.common.response.ResponseListener; +import org.opensearch.sql.executor.pagination.Cursor; import org.opensearch.sql.planner.PlanContext; import org.opensearch.sql.planner.Planner; import org.opensearch.sql.planner.logical.LogicalPlan; @@ -47,6 +46,9 @@ class QueryServiceTest { @Mock private Planner planner; + @Mock + private Planner paginationPlanner; + @Mock private UnresolvedPlan ast; @@ -118,8 +120,9 @@ class Helper { public Helper() { lenient().when(analyzer.analyze(any(), any())).thenReturn(logicalPlan); lenient().when(planner.plan(any())).thenReturn(plan); + lenient().when(paginationPlanner.plan(any())).thenReturn(plan); - queryService = new QueryService(analyzer, executionEngine, planner); + queryService = new QueryService(analyzer, executionEngine, planner, paginationPlanner); } Helper executeSuccess() { @@ -134,7 +137,8 @@ Helper executeSuccess(Split split) { invocation -> { ResponseListener listener = invocation.getArgument(2); listener.onResponse( - new ExecutionEngine.QueryResponse(schema, Collections.emptyList())); + new ExecutionEngine.QueryResponse(schema, Collections.emptyList(), 0, + Cursor.None)); return null; }) .when(executionEngine) diff --git a/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java b/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java new file mode 100644 index 00000000000..7ad2390e45d --- /dev/null +++ b/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java @@ -0,0 +1,96 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.executor.execution; + +import static 
org.junit.jupiter.api.Assertions.assertNotNull; +import static org.junit.jupiter.api.Assertions.fail; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.ArgumentMatchers.anyString; +import static org.mockito.Mockito.CALLS_REAL_METHODS; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.when; +import static org.mockito.Mockito.withSettings; +import static org.opensearch.sql.executor.pagination.PaginatedPlanCacheTest.buildCursor; + +import java.util.Map; +import org.junit.jupiter.api.BeforeAll; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; +import org.junit.jupiter.api.Test; +import org.opensearch.sql.common.response.ResponseListener; +import org.opensearch.sql.executor.DefaultExecutionEngine; +import org.opensearch.sql.executor.ExecutionEngine; +import org.opensearch.sql.executor.QueryId; +import org.opensearch.sql.executor.QueryService; +import org.opensearch.sql.executor.pagination.PaginatedPlanCache; +import org.opensearch.sql.storage.StorageEngine; +import org.opensearch.sql.storage.TableScanOperator; + +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) +public class ContinuePaginatedPlanTest { + + private static PaginatedPlanCache paginatedPlanCache; + + private static QueryService queryService; + + /** + * Initialize the mocks. 
+ */ + @BeforeAll + public static void setUp() { + var storageEngine = mock(StorageEngine.class); + when(storageEngine.getTableScan(anyString(), anyString())) + .thenReturn(mock(TableScanOperator.class)); + paginatedPlanCache = new PaginatedPlanCache(storageEngine); + queryService = new QueryService(null, new DefaultExecutionEngine(), null, null); + } + + @Test + public void can_execute_plan() { + var listener = new ResponseListener() { + @Override + public void onResponse(ExecutionEngine.QueryResponse response) { + assertNotNull(response); + } + + @Override + public void onFailure(Exception e) { + fail(); + } + }; + var plan = new ContinuePaginatedPlan(QueryId.None, buildCursor(Map.of()), + queryService, paginatedPlanCache, listener); + plan.execute(); + } + + @Test + // Same as previous test, but with malformed cursor + public void can_handle_error_while_executing_plan() { + var listener = new ResponseListener() { + @Override + public void onResponse(ExecutionEngine.QueryResponse response) { + fail(); + } + + @Override + public void onFailure(Exception e) { + assertNotNull(e); + } + }; + var plan = new ContinuePaginatedPlan(QueryId.None, buildCursor(Map.of("pageSize", "abc")), + queryService, paginatedPlanCache, listener); + plan.execute(); + } + + @Test + public void explain_is_not_supported() { + var listener = mock(ResponseListener.class); + mock(ContinuePaginatedPlan.class, withSettings().defaultAnswer(CALLS_REAL_METHODS)) + .explain(listener); + verify(listener).onFailure(any(UnsupportedOperationException.class)); + } +} diff --git a/core/src/test/java/org/opensearch/sql/executor/execution/PaginatedPlanTest.java b/core/src/test/java/org/opensearch/sql/executor/execution/PaginatedPlanTest.java new file mode 100644 index 00000000000..16933b9b791 --- /dev/null +++ b/core/src/test/java/org/opensearch/sql/executor/execution/PaginatedPlanTest.java @@ -0,0 +1,100 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package 
org.opensearch.sql.executor.execution; + +import static org.junit.jupiter.api.Assertions.assertNotNull; +import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.junit.jupiter.api.Assertions.fail; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.when; + +import org.apache.commons.lang3.NotImplementedException; +import org.junit.jupiter.api.BeforeAll; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; +import org.junit.jupiter.api.Test; +import org.opensearch.sql.analysis.Analyzer; +import org.opensearch.sql.ast.tree.UnresolvedPlan; +import org.opensearch.sql.common.response.ResponseListener; +import org.opensearch.sql.executor.DefaultExecutionEngine; +import org.opensearch.sql.executor.ExecutionEngine; +import org.opensearch.sql.executor.QueryId; +import org.opensearch.sql.executor.QueryService; +import org.opensearch.sql.planner.Planner; +import org.opensearch.sql.planner.logical.LogicalPaginate; +import org.opensearch.sql.planner.logical.LogicalPlan; +import org.opensearch.sql.planner.physical.PhysicalPlan; + +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) +public class PaginatedPlanTest { + + private static QueryService queryService; + + /** + * Initialize the mocks. 
+ */ + @BeforeAll + public static void setUp() { + var analyzer = mock(Analyzer.class); + when(analyzer.analyze(any(), any())).thenReturn(mock(LogicalPaginate.class)); + var planner = mock(Planner.class); + when(planner.plan(any())).thenReturn(mock(PhysicalPlan.class)); + queryService = new QueryService(analyzer, new DefaultExecutionEngine(), null, planner); + } + + @Test + public void can_execute_plan() { + var listener = new ResponseListener() { + @Override + public void onResponse(ExecutionEngine.QueryResponse response) { + assertNotNull(response); + } + + @Override + public void onFailure(Exception e) { + fail(); + } + }; + var plan = new PaginatedPlan(QueryId.None, mock(UnresolvedPlan.class), 10, + queryService, listener); + plan.execute(); + } + + @Test + // Same as previous test, but with incomplete PaginatedQueryService + public void can_handle_error_while_executing_plan() { + var listener = new ResponseListener() { + @Override + public void onResponse(ExecutionEngine.QueryResponse response) { + fail(); + } + + @Override + public void onFailure(Exception e) { + assertNotNull(e); + } + }; + var plan = new PaginatedPlan(QueryId.None, mock(UnresolvedPlan.class), 10, + new QueryService(null, new DefaultExecutionEngine(), null, null), listener); + plan.execute(); + } + + @Test + public void explain_is_not_supported() { + new PaginatedPlan(null, null, 0, null, null).explain(new ResponseListener<>() { + @Override + public void onResponse(ExecutionEngine.ExplainResponse response) { + fail(); + } + + @Override + public void onFailure(Exception e) { + assertTrue(e instanceof NotImplementedException); + } + }); + } +} diff --git a/core/src/test/java/org/opensearch/sql/executor/execution/QueryPlanFactoryTest.java b/core/src/test/java/org/opensearch/sql/executor/execution/QueryPlanFactoryTest.java index cc4bf070fbe..c06b1186cd1 100644 --- a/core/src/test/java/org/opensearch/sql/executor/execution/QueryPlanFactoryTest.java +++ 
b/core/src/test/java/org/opensearch/sql/executor/execution/QueryPlanFactoryTest.java @@ -8,9 +8,11 @@ package org.opensearch.sql.executor.execution; +import static org.junit.jupiter.api.Assertions.assertAll; import static org.junit.jupiter.api.Assertions.assertEquals; import static org.junit.jupiter.api.Assertions.assertThrows; import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.mockito.Mockito.when; import static org.opensearch.sql.executor.execution.QueryPlanFactory.NO_CONSUMER_RESPONSE_LISTENER; import java.util.Optional; @@ -24,8 +26,10 @@ import org.opensearch.sql.ast.statement.Statement; import org.opensearch.sql.ast.tree.UnresolvedPlan; import org.opensearch.sql.common.response.ResponseListener; +import org.opensearch.sql.exception.UnsupportedCursorRequestException; import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.executor.QueryService; +import org.opensearch.sql.executor.pagination.PaginatedPlanCache; @ExtendWith(MockitoExtension.class) class QueryPlanFactoryTest { @@ -45,46 +49,60 @@ class QueryPlanFactoryTest { @Mock private ExecutionEngine.QueryResponse queryResponse; + @Mock + private PaginatedPlanCache paginatedPlanCache; private QueryPlanFactory factory; @BeforeEach void init() { - factory = new QueryPlanFactory(queryService); + factory = new QueryPlanFactory(queryService, paginatedPlanCache); } @Test public void createFromQueryShouldSuccess() { - Statement query = new Query(plan); + Statement query = new Query(plan, 0); AbstractPlan queryExecution = - factory.create(query, Optional.of(queryListener), Optional.empty()); + factory.createContinuePaginatedPlan(query, Optional.of(queryListener), Optional.empty()); assertTrue(queryExecution instanceof QueryPlan); } @Test public void createFromExplainShouldSuccess() { - Statement query = new Explain(new Query(plan)); + Statement query = new Explain(new Query(plan, 0)); AbstractPlan queryExecution = - factory.create(query, Optional.empty(), 
Optional.of(explainListener)); + factory.createContinuePaginatedPlan(query, Optional.empty(), Optional.of(explainListener)); assertTrue(queryExecution instanceof ExplainPlan); } + @Test + public void createFromCursorShouldSuccess() { + AbstractPlan queryExecution = factory.createContinuePaginatedPlan("", false, + queryListener, explainListener); + AbstractPlan explainExecution = factory.createContinuePaginatedPlan("", true, + queryListener, explainListener); + assertAll( + () -> assertTrue(queryExecution instanceof ContinuePaginatedPlan), + () -> assertTrue(explainExecution instanceof ExplainPlan) + ); + } + @Test public void createFromQueryWithoutQueryListenerShouldThrowException() { - Statement query = new Query(plan); + Statement query = new Query(plan, 0); IllegalArgumentException exception = - assertThrows(IllegalArgumentException.class, () -> factory.create(query, - Optional.empty(), Optional.empty())); + assertThrows(IllegalArgumentException.class, () -> factory.createContinuePaginatedPlan( + query, Optional.empty(), Optional.empty())); assertEquals("[BUG] query listener must be not null", exception.getMessage()); } @Test public void createFromExplainWithoutExplainListenerShouldThrowException() { - Statement query = new Explain(new Query(plan)); + Statement query = new Explain(new Query(plan, 0)); IllegalArgumentException exception = - assertThrows(IllegalArgumentException.class, () -> factory.create(query, - Optional.empty(), Optional.empty())); + assertThrows(IllegalArgumentException.class, () -> factory.createContinuePaginatedPlan( + query, Optional.empty(), Optional.empty())); assertEquals("[BUG] explain listener must be not null", exception.getMessage()); } @@ -104,4 +122,24 @@ public void noConsumerResponseChannel() { assertEquals( "[BUG] exception response should not sent to unexpected channel", exception.getMessage()); } + + @Test + public void createQueryWithFetchSizeWhichCanBePaged() { + 
when(paginatedPlanCache.canConvertToCursor(plan)).thenReturn(true); + factory = new QueryPlanFactory(queryService, paginatedPlanCache); + Statement query = new Query(plan, 10); + AbstractPlan queryExecution = + factory.createContinuePaginatedPlan(query, Optional.of(queryListener), Optional.empty()); + assertTrue(queryExecution instanceof PaginatedPlan); + } + + @Test + public void createQueryWithFetchSizeWhichCannotBePaged() { + when(paginatedPlanCache.canConvertToCursor(plan)).thenReturn(false); + factory = new QueryPlanFactory(queryService, paginatedPlanCache); + Statement query = new Query(plan, 10); + assertThrows(UnsupportedCursorRequestException.class, + () -> factory.createContinuePaginatedPlan(query, + Optional.of(queryListener), Optional.empty())); + } } diff --git a/core/src/test/java/org/opensearch/sql/executor/pagination/CanPaginateVisitorTest.java b/core/src/test/java/org/opensearch/sql/executor/pagination/CanPaginateVisitorTest.java new file mode 100644 index 00000000000..02a0dbc05e9 --- /dev/null +++ b/core/src/test/java/org/opensearch/sql/executor/pagination/CanPaginateVisitorTest.java @@ -0,0 +1,132 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.executor.pagination; + +import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.when; +import static org.mockito.Mockito.withSettings; + +import java.util.List; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; +import org.junit.jupiter.api.Test; +import org.opensearch.sql.ast.dsl.AstDSL; +import org.opensearch.sql.ast.tree.Project; +import org.opensearch.sql.ast.tree.Relation; +import org.opensearch.sql.executor.pagination.CanPaginateVisitor; + +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) +public class 
CanPaginateVisitorTest { + + static final CanPaginateVisitor visitor = new CanPaginateVisitor(); + + @Test + // select * from y + public void accept_query_with_select_star_and_from() { + var plan = AstDSL.project(AstDSL.relation("dummy"), AstDSL.allFields()); + assertTrue(plan.accept(visitor, null)); + } + + @Test + // select x from y + public void reject_query_with_select_field_and_from() { + var plan = AstDSL.project(AstDSL.relation("dummy"), AstDSL.field("pewpew")); + assertFalse(plan.accept(visitor, null)); + } + + @Test + // select x,z from y + public void reject_query_with_select_fields_and_from() { + var plan = AstDSL.project(AstDSL.relation("dummy"), + AstDSL.field("pewpew"), AstDSL.field("pewpew")); + assertFalse(plan.accept(visitor, null)); + } + + @Test + // select x + public void reject_query_without_from() { + var plan = AstDSL.project(AstDSL.values(List.of(AstDSL.intLiteral(1))), + AstDSL.alias("1", AstDSL.intLiteral(1))); + assertFalse(plan.accept(visitor, null)); + } + + @Test + // select * from y limit z + public void reject_query_with_limit() { + var plan = AstDSL.project(AstDSL.limit(AstDSL.relation("dummy"), 1, 2), AstDSL.allFields()); + assertFalse(plan.accept(visitor, null)); + } + + @Test + // select * from y where z + public void reject_query_with_where() { + var plan = AstDSL.project(AstDSL.filter(AstDSL.relation("dummy"), + AstDSL.booleanLiteral(true)), AstDSL.allFields()); + assertFalse(plan.accept(visitor, null)); + } + + @Test + // select * from y order by z + public void reject_query_with_order_by() { + var plan = AstDSL.project(AstDSL.sort(AstDSL.relation("dummy"), AstDSL.field("1")), + AstDSL.allFields()); + assertFalse(plan.accept(visitor, null)); + } + + @Test + // select * from y group by z + public void reject_query_with_group_by() { + var plan = AstDSL.project(AstDSL.agg( + AstDSL.relation("dummy"), List.of(), List.of(), List.of(AstDSL.field("1")), List.of()), + AstDSL.allFields()); + assertFalse(plan.accept(visitor, null)); + }
+ + @Test + // select agg(x) from y + public void reject_query_with_aggregation_function() { + var plan = AstDSL.project(AstDSL.agg( + AstDSL.relation("dummy"), + List.of(AstDSL.alias("agg", AstDSL.aggregate("func", AstDSL.field("pewpew")))), + List.of(), List.of(), List.of()), + AstDSL.allFields()); + assertFalse(plan.accept(visitor, null)); + } + + @Test + // select window(x) from y + public void reject_query_with_window_function() { + var plan = AstDSL.project(AstDSL.relation("dummy"), + AstDSL.alias("pewpew", + AstDSL.window( + AstDSL.aggregate("func", AstDSL.field("pewpew")), + List.of(AstDSL.qualifiedName("1")), List.of()))); + assertFalse(plan.accept(visitor, null)); + } + + @Test + // select * from y, z + public void reject_query_with_select_from_multiple_indices() { + var plan = mock(Project.class); + when(plan.getChild()).thenReturn(List.of(AstDSL.relation("dummy"), AstDSL.relation("pummy"))); + when(plan.getProjectList()).thenReturn(List.of(AstDSL.allFields())); + assertFalse(visitor.visitProject(plan, null)); + } + + @Test + // unreal case, added for coverage only + public void reject_project_when_relation_has_child() { + var relation = mock(Relation.class, withSettings().useConstructor(AstDSL.qualifiedName("42"))); + when(relation.getChild()).thenReturn(List.of(AstDSL.relation("pewpew"))); + when(relation.accept(visitor, null)).thenCallRealMethod(); + var plan = mock(Project.class); + when(plan.getChild()).thenReturn(List.of(relation)); + when(plan.getProjectList()).thenReturn(List.of(AstDSL.allFields())); + assertFalse(visitor.visitProject((Project) plan, null)); + } +} diff --git a/core/src/test/java/org/opensearch/sql/executor/pagination/CursorTest.java b/core/src/test/java/org/opensearch/sql/executor/pagination/CursorTest.java new file mode 100644 index 00000000000..ff5e0d37a72 --- /dev/null +++ b/core/src/test/java/org/opensearch/sql/executor/pagination/CursorTest.java @@ -0,0 +1,27 @@ +/* + * Copyright OpenSearch Contributors + * 
SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.executor.pagination; + +import org.junit.jupiter.api.Assertions; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; +import org.junit.jupiter.api.Test; +import org.opensearch.sql.executor.pagination.Cursor; + +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) +class CursorTest { + + @Test + void empty_array_is_none() { + Assertions.assertEquals(Cursor.None, new Cursor(new byte[]{})); + } + + @Test + void toString_is_array_value() { + String cursorTxt = "This is a test"; + Assertions.assertEquals(cursorTxt, new Cursor(cursorTxt.getBytes()).toString()); + } +} diff --git a/core/src/test/java/org/opensearch/sql/executor/pagination/PaginatedPlanCacheTest.java b/core/src/test/java/org/opensearch/sql/executor/pagination/PaginatedPlanCacheTest.java new file mode 100644 index 00000000000..c3feb6e606d --- /dev/null +++ b/core/src/test/java/org/opensearch/sql/executor/pagination/PaginatedPlanCacheTest.java @@ -0,0 +1,459 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.executor.pagination; + +import static org.junit.jupiter.api.Assertions.assertAll; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.mockito.ArgumentMatchers.anyString; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.when; + +import java.util.Map; +import java.util.stream.Stream; +import java.util.zip.GZIPOutputStream; +import lombok.SneakyThrows; +import org.apache.commons.lang3.reflect.FieldUtils; +import org.junit.jupiter.api.Assertions; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; +import org.junit.jupiter.api.Test; 
+import org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.Arguments; +import org.junit.jupiter.params.provider.MethodSource; +import org.junit.jupiter.params.provider.ValueSource; +import org.mockito.Mockito; +import org.opensearch.sql.ast.dsl.AstDSL; +import org.opensearch.sql.data.model.ExprValue; +import org.opensearch.sql.planner.physical.PaginateOperator; +import org.opensearch.sql.storage.StorageEngine; +import org.opensearch.sql.storage.TableScanOperator; + +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) +public class PaginatedPlanCacheTest { + + StorageEngine storageEngine; + + PaginatedPlanCache planCache; + + // encoded query 'select * from cacls' o_O + static final String testCursor = "(Paginate,1,2,(Project," + + "(namedParseExpressions,),(projectList,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5" + + "OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVk" + + "dAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZ" + + "y5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH" + + "4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHl" + + "wZS9FeHByVHlwZTt4cHQABWJvb2wzc3IAGmphdmEudXRpbC5BcnJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAFh" + + "dAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwdXIAE1tMamF2YS5sYW5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEAf" + + "gAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC5kYXRhLnR5cGUuRXhwckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2YS" + + "5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAAHQk9PTEVBTnEAfgAI,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZX" + + "hwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAA" + + "JZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgAB" + + "eHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA" + + 
"0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3" + + "FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABGludDBzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYg" + + "G0gIAAVsAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAA" + + "eHAAAAABcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAA" + + "HhyAA5qYXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAdJTlRFR0VScQB+AAg=,rO0ABXNyAC1vcmcub3BlbnNlY" + + "XJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy" + + "9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAA" + + "EbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274" + + "AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZ" + + "W5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABXRpbWUxc3IAGmphdmEudXRpbC5BcnJheXMkQXJyYX" + + "lMaXN02aQ8vs2IBtICAAFbAAFhdAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwdXIAE1tMamF2YS5sYW5nLlN0cmluZzu" + + "t0lbn6R17RwIAAHhwAAAAAXEAfgAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC5kYXRhLnR5cGUuRXhwckNvcmVUeXBl" + + "AAAAAAAAAAASAAB4cgAOamF2YS5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAAJVElNRVNUQU1QcQB+AAg=,rO0ABXNy" + + "AC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzd" + + "AASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0" + + "V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5" + + "jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5" + + "cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABWJvb2wyc3IAGmphdmEudXRpb" + + "C5BcnJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAFhdAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwdXIAE1tMamF2YS" + + "5sYW5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEAfgAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC5kYXRhLnR5cGU" + + 
"uRXhwckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2YS5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAAHQk9PTEVBTnEA" + + "fgAI,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIA" + + "A0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9le" + + "HByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW" + + "9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9" + + "MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABGludDJzcgAa" + + "amF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYgG0gIAAVsAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1c" + + "gATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAAeHAAAAABcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLm" + + "RhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAAHhyAA5qYXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAd" + + "JTlRFR0VScQB+AAg=,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb27" + + "4hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2Vh" + + "cmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxb" + + "C5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTG" + + "phdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQ" + + "ABGludDFzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYgG0gIAAVsAAWF0ABNbTGphdmEvbGFuZy9P" + + "YmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAAeHAAAAABcQB+AAh+cgApb3JnLm9wZW5zZ" + + "WFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAAHhyAA5qYXZhLmxhbmcuRW51bQAAAAAAAA" + + "AAEgAAeHB0AAdJTlRFR0VScQB+AAg=,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZE" + + "V4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9" + + "yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVu" + + 
"c2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwAB" + + "XBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeH" + + "ByVHlwZTt4cHQABHN0cjNzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYgG0gIAAVsAAWF0ABNbTGp" + + "hdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAAeHAAAAABcQB+AAh+cgAp" + + "b3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAAHhyAA5qYXZhLmxhbmcuR" + + "W51bQAAAAAAAAAAEgAAeHB0AAZTVFJJTkdxAH4ACA==,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc" + + "2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZW" + + "dhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3I" + + "AMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0" + + "dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2Rhd" + + "GEvdHlwZS9FeHByVHlwZTt4cHQABGludDNzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYgG0gIAAV" + + "sAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAAeHAAAAA" + + "BcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAAHhyAA5q" + + "YXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAdJTlRFR0VScQB+AAg=,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5z" + + "cWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpb" + + "mc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZX" + + "EAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWv" + + "MkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFy" + + "Y2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABHN0cjFzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZp" + + "Dy+zYgG0gIAAVsAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHX" + + 
"tHAgAAeHAAAAABcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAA" + + "AABIAAHhyAA5qYXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAZTVFJJTkdxAH4ACA==,rO0ABXNyAC1vcmcub3B" + + "lbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEv" + + "bGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb" + + "247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3" + + "Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3J" + + "nL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABHN0cjJzcgAaamF2YS51dGlsLkFycmF5cyRB" + + "cnJheUxpc3TZpDy+zYgG0gIAAVsAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3Rya" + + "W5nO63SVufpHXtHAgAAeHAAAAABcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZV" + + "R5cGUAAAAAAAAAABIAAHhyAA5qYXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAZTVFJJTkdxAH4ACA==,rO0ABX" + + "NyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWF" + + "zdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9u" + + "L0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZ" + + "W5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABH" + + "R5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABXRpbWUwc3IAGmphdmEudXR" + + "pbC5BcnJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAFhdAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwdXIAE1tMamF2" + + "YS5sYW5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEAfgAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC5kYXRhLnR5c" + + "GUuRXhwckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2YS5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAAJVElNRVNUQU" + + "1QcQB+AAg=,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q" + + "2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3N" + + 
"xbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHBy" + + "ZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvd" + + "XRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQACWRhdG" + + "V0aW1lMHNyABpqYXZhLnV0aWwuQXJyYXlzJEFycmF5TGlzdNmkPL7NiAbSAgABWwABYXQAE1tMamF2YS9sYW5nL09" + + "iamVjdDt4cHVyABNbTGphdmEubGFuZy5TdHJpbmc7rdJW5+kde0cCAAB4cAAAAAFxAH4ACH5yAClvcmcub3BlbnNl" + + "YXJjaC5zcWwuZGF0YS50eXBlLkV4cHJDb3JlVHlwZQAAAAAAAAAAEgAAeHIADmphdmEubGFuZy5FbnVtAAAAAAAAA" + + "AASAAB4cHQACVRJTUVTVEFNUHEAfgAI,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZ" + + "EV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG" + + "9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGV" + + "uc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwA" + + "BXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9Fe" + + "HByVHlwZTt4cHQABG51bTFzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYgG0gIAAVsAAWF0ABNbTG" + + "phdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAAeHAAAAABcQB+AAh+cgA" + + "pb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAAHhyAA5qYXZhLmxhbmcu" + + "RW51bQAAAAAAAAAAEgAAeHB0AAZET1VCTEVxAH4ACA==,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVz" + + "c2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZ" + + "WdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3" + + "IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF" + + "0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2Rh" + + "dGEvdHlwZS9FeHByVHlwZTt4cHQABG51bTBzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYgG0gIAA" + + 
"VsAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAAeHAAAA" + + "ABcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAAHhyAA5" + + "qYXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAZET1VCTEVxAH4ACA==,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5" + + "zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJp" + + "bmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZ" + + "XEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxW" + + "vMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWF" + + "yY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQACWRhdGV0aW1lMXNyABpqYXZhLnV0aWwuQXJyYXlzJEFycmF5" + + "TGlzdNmkPL7NiAbSAgABWwABYXQAE1tMamF2YS9sYW5nL09iamVjdDt4cHVyABNbTGphdmEubGFuZy5TdHJpbmc7r" + + "dJW5+kde0cCAAB4cAAAAAFxAH4ACH5yAClvcmcub3BlbnNlYXJjaC5zcWwuZGF0YS50eXBlLkV4cHJDb3JlVHlwZQ" + + "AAAAAAAAAAEgAAeHIADmphdmEubGFuZy5FbnVtAAAAAAAAAAASAAB4cHQACVRJTUVTVEFNUHEAfgAI,rO0ABXNyAC" + + "1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAA" + + "STGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4" + + "cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZ" + + "UV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cG" + + "V0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABG51bTRzcgAaamF2YS51dGlsLkF" + + "ycmF5cyRBcnJheUxpc3TZpDy+zYgG0gIAAVsAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxh" + + "bmcuU3RyaW5nO63SVufpHXtHAgAAeHAAAAABcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5Fe" + + "HByQ29yZVR5cGUAAAAAAAAAABIAAHhyAA5qYXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAZET1VCTEVxAH4ACA" + + "==,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0" + + 
"wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHB" + + "yZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9u" + + "LlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9Ma" + + "XN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABWJvb2wxc3IAGm" + + "phdmEudXRpbC5BcnJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAFhdAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwdXI" + + "AE1tMamF2YS5sYW5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEAfgAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC5k" + + "YXRhLnR5cGUuRXhwckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2YS5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAAHQ" + + "k9PTEVBTnEAfgAI,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274h" + + "hKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2Vhcm" + + "NoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5" + + "leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGph" + + "dmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQAA" + + "2tleXNyABpqYXZhLnV0aWwuQXJyYXlzJEFycmF5TGlzdNmkPL7NiAbSAgABWwABYXQAE1tMamF2YS9sYW5nL09iam" + + "VjdDt4cHVyABNbTGphdmEubGFuZy5TdHJpbmc7rdJW5+kde0cCAAB4cAAAAAFxAH4ACH5yAClvcmcub3BlbnNlYXJ" + + "jaC5zcWwuZGF0YS50eXBlLkV4cHJDb3JlVHlwZQAAAAAAAAAAEgAAeHIADmphdmEubGFuZy5FbnVtAAAAAAAAAAAS" + + "AAB4cHQABlNUUklOR3EAfgAI,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJl" + + "c3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vc" + + "GVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2Vhcm" + + "NoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGh" + + "zdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlw" + + 
"ZTt4cHQABG51bTNzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYgG0gIAAVsAAWF0ABNbTGphdmEvb" + + "GFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAAeHAAAAABcQB+AAh+cgApb3JnLm" + + "9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAAHhyAA5qYXZhLmxhbmcuRW51bQA" + + "AAAAAAAAAEgAAeHB0AAZET1VCTEVxAH4ACA==,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5" + + "OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVk" + + "dAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZ" + + "y5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH" + + "4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHl" + + "wZS9FeHByVHlwZTt4cHQABWJvb2wwc3IAGmphdmEudXRpbC5BcnJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAFh" + + "dAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwdXIAE1tMamF2YS5sYW5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEAf" + + "gAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC5kYXRhLnR5cGUuRXhwckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2YS" + + "5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAAHQk9PTEVBTnEAfgAI,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZX" + + "hwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAA" + + "JZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgAB" + + "eHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA" + + "0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3" + + "FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABG51bTJzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYg" + + "G0gIAAVsAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAA" + + "eHAAAAABcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAA" + + "HhyAA5qYXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAZET1VCTEVxAH4ACA==,rO0ABXNyAC1vcmcub3BlbnNlY" + + 
"XJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy" + + "9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAA" + + "EbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274" + + "AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZ" + + "W5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABHN0cjBzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheU" + + "xpc3TZpDy+zYgG0gIAAVsAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63" + + "SVufpHXtHAgAAeHAAAAABcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUA" + + "AAAAAAAAABIAAHhyAA5qYXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAZTVFJJTkdxAH4ACA==,rO0ABXNyAC1v" + + "cmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAAST" + + "GphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cH" + + "Jlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV" + + "4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0" + + "ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABWRhdGUzc3IAGmphdmEudXRpbC5Bc" + + "nJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAFhdAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwdXIAE1tMamF2YS5sYW" + + "5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEAfgAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC5kYXRhLnR5cGUuRXh" + + "wckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2YS5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAAJVElNRVNUQU1QcQB+" + + "AAg=,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIA" + + "A0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9le" + + "HByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW" + + "9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9" + + 
"MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABWRhdGUyc3IA" + + "GmphdmEudXRpbC5BcnJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAFhdAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwd" + + "XIAE1tMamF2YS5sYW5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEAfgAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC" + + "5kYXRhLnR5cGUuRXhwckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2YS5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAA" + + "JVElNRVNUQU1QcQB+AAg=,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3N" + + "pb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVu" + + "c2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoL" + + "nNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdA" + + "AQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt" + + "4cHQABWRhdGUxc3IAGmphdmEudXRpbC5BcnJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAFhdAATW0xqYXZhL2xh" + + "bmcvT2JqZWN0O3hwdXIAE1tMamF2YS5sYW5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEAfgAIfnIAKW9yZy5vc" + + "GVuc2VhcmNoLnNxbC5kYXRhLnR5cGUuRXhwckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2YS5sYW5nLkVudW0AAA" + + "AAAAAAABIAAHhwdAAJVElNRVNUQU1QcQB+AAg=,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi" + + "5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGV" + + "kdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9y" + + "Zy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxA" + + "H4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdH" + + "lwZS9FeHByVHlwZTt4cHQABWRhdGUwc3IAGmphdmEudXRpbC5BcnJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAF" + + "hdAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwdXIAE1tMamF2YS5sYW5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEA" + + "fgAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC5kYXRhLnR5cGUuRXhwckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2Y" + + 
"S5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAAJVElNRVNUQU1QcQB+AAg=,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zc" + + "WwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbm" + + "c7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXE" + + "AfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvM" + + "kAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY" + + "2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQAA3p6enNyABpqYXZhLnV0aWwuQXJyYXlzJEFycmF5TGlzdNmkPL" + + "7NiAbSAgABWwABYXQAE1tMamF2YS9sYW5nL09iamVjdDt4cHVyABNbTGphdmEubGFuZy5TdHJpbmc7rdJW5+kde0c" + + "CAAB4cAAAAAFxAH4ACH5yAClvcmcub3BlbnNlYXJjaC5zcWwuZGF0YS50eXBlLkV4cHJDb3JlVHlwZQAAAAAAAAAA" + + "EgAAeHIADmphdmEubGFuZy5FbnVtAAAAAAAAAAASAAB4cHQABlNUUklOR3EAfgAI),(OpenSearchPagedIndexSc" + + "an,calcs,FGluY2x1ZGVfY29udGV4dF91dWlkDXF1ZXJ5QW5kRmV0Y2gBFndYQmJZcHpxU3dtc1hUVkhhYU1uLVEA" + + "AAAAAAAADRY4RzRudHZqbFI0dTBFdkJNZEpCaDd3)))"; + + private static final String testIndexName = "dummyIndex"; + private static final String testScroll = "dummyScroll"; + + @BeforeEach + void setUp() { + storageEngine = mock(StorageEngine.class); + when(storageEngine.getTableScan(anyString(), anyString())) + .thenReturn(new MockedTableScanOperator()); + planCache = new PaginatedPlanCache(storageEngine); + } + + @Test + void canConvertToCursor_relation() { + assertTrue(planCache.canConvertToCursor(AstDSL.relation("Table"))); + } + + @Test + void canConvertToCursor_project_allFields_relation() { + var unresolvedPlan = AstDSL.project(AstDSL.relation("table"), AstDSL.allFields()); + assertTrue(planCache.canConvertToCursor(unresolvedPlan)); + } + + @Test + void canConvertToCursor_project_some_fields_relation() { + var unresolvedPlan = AstDSL.project(AstDSL.relation("table"), AstDSL.field("rando")); + Assertions.assertFalse(planCache.canConvertToCursor(unresolvedPlan)); + } + + @ParameterizedTest + @ValueSource(strings = {"pewpew", 
"asdkfhashdfjkgakgfwuigfaijkb", testCursor}) + void compress_decompress(String input) { + var compressed = compress(input); + assertEquals(input, decompress(compressed)); + if (input.length() > 200) { + // Compression of short strings isn't profitable, because encoding into string and gzip + // headers add more bytes than input string has. + assertTrue(compressed.length() < input.length()); + } + } + + @Test + // should never happen actually, at least for compress + void compress_decompress_null_or_empty_string() { + assertAll( + () -> assertTrue(compress(null).isEmpty()), + () -> assertTrue(compress("").isEmpty()), + () -> assertTrue(decompress(null).isEmpty()), + () -> assertTrue(decompress("").isEmpty()) + ); + } + + @Test + // test added for coverage only + void compress_throws() { + var mock = Mockito.mockConstructionWithAnswer(GZIPOutputStream.class, invocation -> null); + assertThrows(Throwable.class, () -> compress("\\_(`v`)_/")); + mock.close(); + } + + @Test + void decompress_throws() { + assertAll( + // from gzip - damaged header + () -> assertThrows(Throwable.class, () -> decompress("00")), + // from HashCode::fromString + () -> assertThrows(Throwable.class, () -> decompress("000")) + ); + } + + @Test + @SneakyThrows + void convert_deconvert_cursor() { + var cursor = buildCursor(Map.of()); + var plan = planCache.convertToPlan(cursor); + // `PaginateOperator::toCursor` shifts cursor to the next page. To have this test consistent + // we have to enforce it staying on the same page. This allows us to get same cursor strings. + var pageNum = (int)FieldUtils.readField(plan, "pageIndex", true); + FieldUtils.writeField(plan, "pageIndex", pageNum - 1, true); + var convertedCursor = planCache.convertToCursor(plan).toString(); + // Then we have to restore page num into the plan, otherwise comparison would fail due to this. 
+ FieldUtils.writeField(plan, "pageIndex", pageNum, true); + var convertedPlan = planCache.convertToPlan(convertedCursor); + assertEquals(cursor, convertedCursor); + // TODO compare plans + } + + @Test + @SneakyThrows + void convertToCursor_cant_convert() { + var plan = mock(MockedTableScanOperator.class); + assertEquals(Cursor.None, planCache.convertToCursor(plan)); + when(plan.toCursor()).thenReturn(""); + assertEquals(Cursor.None, planCache.convertToCursor( + new PaginateOperator(plan, 1, 2))); + } + + @Test + void converted_plan_is_executable() { + // planCache.convertToPlan(buildCursor(Map.of())); + var plan = planCache.convertToPlan("n:" + compress(testCursor)); + // TODO + } + + @ParameterizedTest + @MethodSource("generateIncorrectCursors") + void throws_on_parsing_damaged_cursor(String cursor) { + assertThrows(Throwable.class, () -> planCache.convertToPlan(cursor)); + } + + private static Stream generateIncorrectCursors() { + return Stream.of( + compress(testCursor), // a valid cursor, but without "n:" prefix + "n:" + testCursor, // a valid, but uncompressed cursor + buildCursor(Map.of("prefix", "g:")), // incorrect prefix + buildCursor(Map.of("header: paginate", "ORDER BY")), // incorrect header + buildCursor(Map.of("pageIndex", "")), // incorrect page # + buildCursor(Map.of("pageIndex", "abc")), // incorrect page # + buildCursor(Map.of("pageSize", "abc")), // incorrect page size + buildCursor(Map.of("pageSize", "null")), // incorrect page size + buildCursor(Map.of("pageSize", "10 ")), // incorrect page size + buildCursor(Map.of("header: project", "")), // incorrect header + buildCursor(Map.of("header: namedParseExpressions", "ololo")), // incorrect header + buildCursor(Map.of("namedParseExpressions", "pewpew")), // incorrect (unparsable) npes + buildCursor(Map.of("namedParseExpressions", "rO0ABXA=,")), // incorrect npes (extra comma) + buildCursor(Map.of("header: projectList", "")), // incorrect header + buildCursor(Map.of("projectList", "\0\0\0\0")), // 
incorrect project + buildCursor(Map.of("header: OpenSearchPagedIndexScan", "42")) // incorrect header + ).map(Arguments::of); + } + + + /** + * Builds a cursor string populated with default valid values. + * Any value can be overridden through the given map. + * @param values A map of non-default values to use. + * @return A compressed cursor string. + */ + public static String buildCursor(Map<String, String> values) { + String prefix = values.getOrDefault("prefix", "n:"); + String headerPaginate = values.getOrDefault("header: paginate", "Paginate"); + String pageIndex = values.getOrDefault("pageIndex", "1"); + String pageSize = values.getOrDefault("pageSize", "2"); + String headerProject = values.getOrDefault("header: project", "Project"); + String headerNpes = values.getOrDefault("header: namedParseExpressions", + "namedParseExpressions"); + String namedParseExpressions = values.getOrDefault("namedParseExpressions", ""); + String headerProjectList = values.getOrDefault("header: projectList", "projectList"); + String projectList = values.getOrDefault("projectList", "rO0ABXA="); // serialized `null` + String headerOspis = values.getOrDefault("header: OpenSearchPagedIndexScan", + "OpenSearchPagedIndexScan"); + String indexName = values.getOrDefault("indexName", testIndexName); + String scrollId = values.getOrDefault("scrollId", testScroll); + var cursor = String.format("(%s,%s,%s,(%s,(%s,%s),(%s,%s),(%s,%s,%s)))", headerPaginate, + pageIndex, pageSize, headerProject, headerNpes, namedParseExpressions, headerProjectList, + projectList, headerOspis, indexName, scrollId); + return prefix + compress(cursor); + } + + private static class MockedTableScanOperator extends TableScanOperator { + @Override + public boolean hasNext() { + return false; + } + + @Override + public ExprValue next() { + return null; + } + + @Override + public String explain() { + return null; + } + + @Override + public String toCursor() { + return createSection("OpenSearchPagedIndexScan", testIndexName, testScroll); + } + } + + 
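The `compress`/`decompress` helpers under test delegate to `PaginatedPlanCache`, whose implementation is not shown in this patch. As a hedged, JDK-only sketch of the encoding the tests imply — `decompress("000")` fails inside `HashCode::fromString`, i.e. during hex parsing, and `decompress("00")` fails on the gzip header — the cursor appears to be gzipped and then hex-encoded. The class and method names below are hypothetical, not the plugin's actual API.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.HexFormat;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class CursorCompressionSketch {

  /** Gzips the input and hex-encodes the compressed bytes; empty in, empty out. */
  public static String compress(String input) {
    if (input == null || input.isEmpty()) {
      return "";
    }
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    try (GZIPOutputStream gzip = new GZIPOutputStream(out)) {
      gzip.write(input.getBytes(StandardCharsets.UTF_8));
    } catch (IOException e) {
      throw new UncheckedIOException(e);
    }
    return HexFormat.of().formatHex(out.toByteArray());
  }

  /** Hex-decodes and gunzips; throws on odd-length hex or a damaged gzip header. */
  public static String decompress(String input) {
    if (input == null || input.isEmpty()) {
      return "";
    }
    byte[] bytes = HexFormat.of().parseHex(input);
    try (GZIPInputStream gzip = new GZIPInputStream(new ByteArrayInputStream(bytes))) {
      return new String(gzip.readAllBytes(), StandardCharsets.UTF_8);
    } catch (IOException e) {
      throw new UncheckedIOException(e);
    }
  }

  public static void main(String[] args) {
    // A long, repetitive cursor compresses well; a short one would not.
    String cursor = "(Paginate,1,2,(Project,...))".repeat(20);
    System.out.println(cursor.equals(decompress(compress(cursor))));
    System.out.println(compress(cursor).length() < cursor.length());
  }
}
```

This also explains the `input.length() > 200` guard in `compress_decompress`: the gzip header/trailer plus the 2x expansion of hex encoding outweigh the savings on short inputs, so only long cursors are guaranteed to shrink.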
@SneakyThrows + private static String compress(String input) { + return new PaginatedPlanCache(null).compress(input); + } + + @SneakyThrows + private static String decompress(String input) { + return new PaginatedPlanCache(null).decompress(input); + } +} diff --git a/core/src/test/java/org/opensearch/sql/executor/streaming/MicroBatchStreamingExecutionTest.java b/core/src/test/java/org/opensearch/sql/executor/streaming/MicroBatchStreamingExecutionTest.java index 1a2b6e3f2a4..ceb53b756a5 100644 --- a/core/src/test/java/org/opensearch/sql/executor/streaming/MicroBatchStreamingExecutionTest.java +++ b/core/src/test/java/org/opensearch/sql/executor/streaming/MicroBatchStreamingExecutionTest.java @@ -26,6 +26,7 @@ import org.opensearch.sql.common.response.ResponseListener; import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.executor.QueryService; +import org.opensearch.sql.executor.pagination.Cursor; import org.opensearch.sql.planner.PlanContext; import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.storage.split.Split; @@ -169,7 +170,8 @@ Helper executeSuccess(Long... 
offsets) { ResponseListener listener = invocation.getArgument(2); listener.onResponse( - new ExecutionEngine.QueryResponse(null, Collections.emptyList())); + new ExecutionEngine.QueryResponse(null, Collections.emptyList(), 0, + Cursor.None)); PlanContext planContext = invocation.getArgument(1); assertTrue(planContext.getSplit().isPresent()); diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/serialization/DefaultExpressionSerializerTest.java b/core/src/test/java/org/opensearch/sql/expression/serialization/DefaultExpressionSerializerTest.java similarity index 94% rename from opensearch/src/test/java/org/opensearch/sql/opensearch/storage/serialization/DefaultExpressionSerializerTest.java rename to core/src/test/java/org/opensearch/sql/expression/serialization/DefaultExpressionSerializerTest.java index 72a319dbfe6..53a89d5421a 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/serialization/DefaultExpressionSerializerTest.java +++ b/core/src/test/java/org/opensearch/sql/expression/serialization/DefaultExpressionSerializerTest.java @@ -21,6 +21,8 @@ import org.opensearch.sql.expression.Expression; import org.opensearch.sql.expression.ExpressionNodeVisitor; import org.opensearch.sql.expression.env.Environment; +import org.opensearch.sql.expression.serialization.DefaultExpressionSerializer; +import org.opensearch.sql.expression.serialization.ExpressionSerializer; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) class DefaultExpressionSerializerTest { diff --git a/core/src/test/java/org/opensearch/sql/planner/DefaultImplementorTest.java b/core/src/test/java/org/opensearch/sql/planner/DefaultImplementorTest.java index 017cfb60ea0..da3f5315e46 100644 --- a/core/src/test/java/org/opensearch/sql/planner/DefaultImplementorTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/DefaultImplementorTest.java @@ -33,6 +33,8 @@ import java.util.Map; import org.apache.commons.lang3.tuple.ImmutablePair; 
import org.apache.commons.lang3.tuple.Pair; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; @@ -50,9 +52,11 @@ import org.opensearch.sql.expression.aggregation.NamedAggregator; import org.opensearch.sql.expression.window.WindowDefinition; import org.opensearch.sql.expression.window.ranking.RowNumberFunction; +import org.opensearch.sql.planner.logical.LogicalPaginate; import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.logical.LogicalPlanDSL; import org.opensearch.sql.planner.logical.LogicalRelation; +import org.opensearch.sql.planner.physical.PaginateOperator; import org.opensearch.sql.planner.physical.PhysicalPlan; import org.opensearch.sql.planner.physical.PhysicalPlanDSL; import org.opensearch.sql.storage.Table; @@ -62,24 +66,16 @@ import org.opensearch.sql.storage.write.TableWriteOperator; @ExtendWith(MockitoExtension.class) +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) class DefaultImplementorTest { - @Mock - private Expression filter; - - @Mock - private NamedAggregator aggregator; - - @Mock - private NamedExpression groupBy; - @Mock private Table table; private final DefaultImplementor implementor = new DefaultImplementor<>(); @Test - public void visitShouldReturnDefaultPhysicalOperator() { + public void visit_should_return_default_physical_operator() { String indexName = "test"; NamedExpression include = named("age", ref("age", INTEGER)); ReferenceExpression exclude = ref("name", STRING); @@ -157,14 +153,14 @@ public void visitShouldReturnDefaultPhysicalOperator() { } @Test - public void visitRelationShouldThrowException() { + public void visitRelation_should_throw_an_exception() { assertThrows(UnsupportedOperationException.class, () -> new LogicalRelation("test", table).accept(implementor, null)); } @SuppressWarnings({"rawtypes", 
"unchecked"}) @Test - public void visitWindowOperatorShouldReturnPhysicalWindowOperator() { + public void visitWindowOperator_should_return_PhysicalWindowOperator() { NamedExpression windowFunction = named(new RowNumberFunction()); WindowDefinition windowDefinition = new WindowDefinition( Collections.singletonList(ref("state", STRING)), @@ -204,7 +200,7 @@ public void visitWindowOperatorShouldReturnPhysicalWindowOperator() { } @Test - public void visitTableScanBuilderShouldBuildTableScanOperator() { + public void visitTableScanBuilder_should_build_TableScanOperator() { TableScanOperator tableScanOperator = Mockito.mock(TableScanOperator.class); TableScanBuilder tableScanBuilder = new TableScanBuilder() { @Override @@ -216,7 +212,7 @@ public TableScanOperator build() { } @Test - public void visitTableWriteBuilderShouldBuildTableWriteOperator() { + public void visitTableWriteBuilder_should_build_TableWriteOperator() { LogicalPlan child = values(); TableWriteOperator tableWriteOperator = Mockito.mock(TableWriteOperator.class); TableWriteBuilder logicalPlan = new TableWriteBuilder(child) { @@ -227,4 +223,11 @@ public TableWriteOperator build(PhysicalPlan child) { }; assertEquals(tableWriteOperator, logicalPlan.accept(implementor, null)); } + + @Test + public void visitPaginate_should_build_PaginateOperator_and_keep_page_size() { + var paginate = new LogicalPaginate(42, List.of(values())); + var plan = paginate.accept(implementor, null); + assertEquals(paginate.getPageSize(), ((PaginateOperator) plan).getPageSize()); + } } diff --git a/core/src/test/java/org/opensearch/sql/planner/logical/LogicalPlanNodeVisitorTest.java b/core/src/test/java/org/opensearch/sql/planner/logical/LogicalPlanNodeVisitorTest.java index 341bcbc29e4..c9d74fa8712 100644 --- a/core/src/test/java/org/opensearch/sql/planner/logical/LogicalPlanNodeVisitorTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/logical/LogicalPlanNodeVisitorTest.java @@ -8,21 +8,23 @@ import static 
org.junit.jupiter.api.Assertions.assertEquals; import static org.junit.jupiter.api.Assertions.assertNull; +import static org.mockito.Mockito.mock; import static org.opensearch.sql.expression.DSL.named; import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; import java.util.Collections; -import java.util.HashMap; +import java.util.List; import java.util.Map; -import java.util.stream.Collectors; +import java.util.stream.Stream; import org.apache.commons.lang3.tuple.Pair; +import org.junit.jupiter.api.BeforeAll; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; import org.junit.jupiter.api.Test; -import org.junit.jupiter.api.extension.ExtendWith; -import org.mockito.Mock; -import org.mockito.junit.jupiter.MockitoExtension; -import org.opensearch.sql.ast.expression.DataType; -import org.opensearch.sql.ast.expression.Literal; +import org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.Arguments; +import org.junit.jupiter.params.provider.MethodSource; import org.opensearch.sql.ast.tree.RareTopN.CommandType; import org.opensearch.sql.ast.tree.Sort.SortOption; import org.opensearch.sql.data.model.ExprValueUtils; @@ -42,20 +44,24 @@ /** * Todo. Temporary added for UT coverage, Will be removed. 
*/ -@ExtendWith(MockitoExtension.class) +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) class LogicalPlanNodeVisitorTest { - @Mock - Expression expression; - @Mock - ReferenceExpression ref; - @Mock - Aggregator aggregator; - @Mock - Table table; + static Expression expression; + static ReferenceExpression ref; + static Aggregator aggregator; + static Table table; + + @BeforeAll + private static void initMocks() { + expression = mock(Expression.class); + ref = mock(ReferenceExpression.class); + aggregator = mock(Aggregator.class); + table = mock(Table.class); + } @Test - public void logicalPlanShouldTraversable() { + public void logical_plan_should_be_traversable() { LogicalPlan logicalPlan = LogicalPlanDSL.rename( LogicalPlanDSL.aggregation( @@ -72,119 +78,57 @@ public void logicalPlanShouldTraversable() { assertEquals(5, result); } - @Test - public void testAbstractPlanNodeVisitorShouldReturnNull() { + @SuppressWarnings("unchecked") + private static Stream getLogicalPlansForVisitorTest() { LogicalPlan relation = LogicalPlanDSL.relation("schema", table); - assertNull(relation.accept(new LogicalPlanNodeVisitor() { - }, null)); - LogicalPlan tableScanBuilder = new TableScanBuilder() { @Override public TableScanOperator build() { return null; } }; - assertNull(tableScanBuilder.accept(new LogicalPlanNodeVisitor() { - }, null)); - - LogicalPlan write = LogicalPlanDSL.write(null, table, Collections.emptyList()); - assertNull(write.accept(new LogicalPlanNodeVisitor() { - }, null)); - TableWriteBuilder tableWriteBuilder = new TableWriteBuilder(null) { @Override public TableWriteOperator build(PhysicalPlan child) { return null; } }; - assertNull(tableWriteBuilder.accept(new LogicalPlanNodeVisitor() { - }, null)); - + LogicalPlan write = LogicalPlanDSL.write(null, table, Collections.emptyList()); LogicalPlan filter = LogicalPlanDSL.filter(relation, expression); - assertNull(filter.accept(new LogicalPlanNodeVisitor() { - }, null)); - - LogicalPlan 
aggregation = - LogicalPlanDSL.aggregation( - filter, ImmutableList.of(DSL.named("avg", aggregator)), ImmutableList.of(DSL.named( - "group", expression))); - assertNull(aggregation.accept(new LogicalPlanNodeVisitor() { - }, null)); - + LogicalPlan aggregation = LogicalPlanDSL.aggregation( + filter, ImmutableList.of(DSL.named("avg", aggregator)), ImmutableList.of(DSL.named( + "group", expression))); LogicalPlan rename = LogicalPlanDSL.rename(aggregation, ImmutableMap.of(ref, ref)); - assertNull(rename.accept(new LogicalPlanNodeVisitor() { - }, null)); - LogicalPlan project = LogicalPlanDSL.project(relation, named("ref", ref)); - assertNull(project.accept(new LogicalPlanNodeVisitor() { - }, null)); - LogicalPlan remove = LogicalPlanDSL.remove(relation, ref); - assertNull(remove.accept(new LogicalPlanNodeVisitor() { - }, null)); - LogicalPlan eval = LogicalPlanDSL.eval(relation, Pair.of(ref, expression)); - assertNull(eval.accept(new LogicalPlanNodeVisitor() { - }, null)); - - LogicalPlan sort = LogicalPlanDSL.sort(relation, - Pair.of(SortOption.DEFAULT_ASC, expression)); - assertNull(sort.accept(new LogicalPlanNodeVisitor() { - }, null)); - + LogicalPlan sort = LogicalPlanDSL.sort(relation, Pair.of(SortOption.DEFAULT_ASC, expression)); LogicalPlan dedup = LogicalPlanDSL.dedupe(relation, 1, false, false, expression); - assertNull(dedup.accept(new LogicalPlanNodeVisitor() { - }, null)); - LogicalPlan window = LogicalPlanDSL.window(relation, named(expression), new WindowDefinition( ImmutableList.of(ref), ImmutableList.of(Pair.of(SortOption.DEFAULT_ASC, expression)))); - assertNull(window.accept(new LogicalPlanNodeVisitor() { - }, null)); - LogicalPlan rareTopN = LogicalPlanDSL.rareTopN( relation, CommandType.TOP, ImmutableList.of(expression), expression); - assertNull(rareTopN.accept(new LogicalPlanNodeVisitor() { - }, null)); - - Map args = new HashMap<>(); LogicalPlan highlight = new LogicalHighlight(filter, - new 
LiteralExpression(ExprValueUtils.stringValue("fieldA")), args); - assertNull(highlight.accept(new LogicalPlanNodeVisitor() { - }, null)); - - LogicalPlan mlCommons = new LogicalMLCommons(LogicalPlanDSL.relation("schema", table), - "kmeans", - ImmutableMap.builder() - .put("centroids", new Literal(3, DataType.INTEGER)) - .put("iterations", new Literal(3, DataType.DOUBLE)) - .put("distance_type", new Literal(null, DataType.STRING)) - .build()); - assertNull(mlCommons.accept(new LogicalPlanNodeVisitor() { - }, null)); - - LogicalPlan ad = new LogicalAD(LogicalPlanDSL.relation("schema", table), - new HashMap() {{ - put("shingle_size", new Literal(8, DataType.INTEGER)); - put("time_decay", new Literal(0.0001, DataType.DOUBLE)); - put("time_field", new Literal(null, DataType.STRING)); - } - }); - assertNull(ad.accept(new LogicalPlanNodeVisitor() { - }, null)); + new LiteralExpression(ExprValueUtils.stringValue("fieldA")), Map.of()); + LogicalPlan mlCommons = new LogicalMLCommons(relation, "kmeans", Map.of()); + LogicalPlan ad = new LogicalAD(relation, Map.of()); + LogicalPlan ml = new LogicalML(relation, Map.of()); + LogicalPlan paginate = new LogicalPaginate(42, List.of(relation)); + + return Stream.of( + relation, tableScanBuilder, write, tableWriteBuilder, filter, aggregation, rename, project, + remove, eval, sort, dedup, window, rareTopN, highlight, mlCommons, ad, ml, paginate + ).map(Arguments::of); + } - LogicalPlan ml = new LogicalML(LogicalPlanDSL.relation("schema", table), - new HashMap() {{ - put("action", new Literal("train", DataType.STRING)); - put("algorithm", new Literal("rcf", DataType.STRING)); - put("shingle_size", new Literal(8, DataType.INTEGER)); - put("time_decay", new Literal(0.0001, DataType.DOUBLE)); - put("time_field", new Literal(null, DataType.STRING)); - } - }); - assertNull(ml.accept(new LogicalPlanNodeVisitor() { + @ParameterizedTest + @MethodSource("getLogicalPlansForVisitorTest") + public void 
abstract_plan_node_visitor_should_return_null(LogicalPlan plan) { + assertNull(plan.accept(new LogicalPlanNodeVisitor() { }, null)); } + private static class NodesCount extends LogicalPlanNodeVisitor { @Override public Integer visitRelation(LogicalRelation plan, Object context) { @@ -195,32 +139,28 @@ public Integer visitRelation(LogicalRelation plan, Object context) { public Integer visitFilter(LogicalFilter plan, Object context) { return 1 + plan.getChild().stream() - .map(child -> child.accept(this, context)) - .collect(Collectors.summingInt(Integer::intValue)); + .map(child -> child.accept(this, context)).mapToInt(Integer::intValue).sum(); } @Override public Integer visitAggregation(LogicalAggregation plan, Object context) { return 1 + plan.getChild().stream() - .map(child -> child.accept(this, context)) - .collect(Collectors.summingInt(Integer::intValue)); + .map(child -> child.accept(this, context)).mapToInt(Integer::intValue).sum(); } @Override public Integer visitRename(LogicalRename plan, Object context) { return 1 + plan.getChild().stream() - .map(child -> child.accept(this, context)) - .collect(Collectors.summingInt(Integer::intValue)); + .map(child -> child.accept(this, context)).mapToInt(Integer::intValue).sum(); } @Override public Integer visitRareTopN(LogicalRareTopN plan, Object context) { return 1 + plan.getChild().stream() - .map(child -> child.accept(this, context)) - .collect(Collectors.summingInt(Integer::intValue)); + .map(child -> child.accept(this, context)).mapToInt(Integer::intValue).sum(); } } } diff --git a/core/src/test/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizerTest.java b/core/src/test/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizerTest.java index 7516aa18091..1ee9b9aa3b1 100644 --- a/core/src/test/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizerTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizerTest.java @@ -7,8 +7,11 @@ package 
org.opensearch.sql.planner.optimizer; import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertNull; import static org.junit.jupiter.api.Assertions.assertThrows; import static org.mockito.ArgumentMatchers.any; +import static org.mockito.ArgumentMatchers.anyInt; +import static org.mockito.Mockito.lenient; import static org.mockito.Mockito.when; import static org.opensearch.sql.data.model.ExprValueUtils.integerValue; import static org.opensearch.sql.data.model.ExprValueUtils.longValue; @@ -26,9 +29,12 @@ import com.google.common.collect.ImmutableList; import java.util.Collections; +import java.util.List; import java.util.Map; import org.apache.commons.lang3.tuple.Pair; import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; @@ -38,13 +44,16 @@ import org.opensearch.sql.ast.tree.Sort; import org.opensearch.sql.data.type.ExprType; import org.opensearch.sql.expression.DSL; +import org.opensearch.sql.planner.logical.LogicalPaginate; import org.opensearch.sql.planner.logical.LogicalPlan; +import org.opensearch.sql.planner.logical.LogicalRelation; import org.opensearch.sql.planner.physical.PhysicalPlan; import org.opensearch.sql.storage.Table; import org.opensearch.sql.storage.read.TableScanBuilder; import org.opensearch.sql.storage.write.TableWriteBuilder; @ExtendWith(MockitoExtension.class) +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) class LogicalPlanOptimizerTest { @Mock @@ -55,7 +64,7 @@ class LogicalPlanOptimizerTest { @BeforeEach void setUp() { - when(table.createScanBuilder()).thenReturn(tableScanBuilder); + lenient().when(table.createScanBuilder()).thenReturn(tableScanBuilder); } /** @@ -255,7 +264,6 @@ void table_scan_builder_support_highlight_push_down_can_apply_its_rule() { @Test 
void table_not_support_scan_builder_should_not_be_impact() { - Mockito.reset(table, tableScanBuilder); Table table = new Table() { @Override public Map getFieldTypes() { @@ -276,7 +284,6 @@ public PhysicalPlan implement(LogicalPlan plan) { @Test void table_support_write_builder_should_be_replaced() { - Mockito.reset(table, tableScanBuilder); TableWriteBuilder writeBuilder = Mockito.mock(TableWriteBuilder.class); when(table.createWriteBuilder(any())).thenReturn(writeBuilder); @@ -288,7 +295,6 @@ void table_support_write_builder_should_be_replaced() { @Test void table_not_support_write_builder_should_report_error() { - Mockito.reset(table, tableScanBuilder); Table table = new Table() { @Override public Map getFieldTypes() { @@ -305,6 +311,52 @@ public PhysicalPlan implement(LogicalPlan plan) { () -> table.createWriteBuilder(null)); } + @Test + void paged_table_scan_builder_support_project_push_down_can_apply_its_rule() { + when(tableScanBuilder.pushDownProject(any())).thenReturn(true); + when(table.createPagedScanBuilder(anyInt())).thenReturn(tableScanBuilder); + + var relation = new LogicalRelation("schema", table); + relation.setPageSize(5); + + assertEquals( + tableScanBuilder, + LogicalPlanOptimizer.paginationCreate().optimize(project(relation)) + ); + } + + @Test + void push_page_size() { + var relation = new LogicalRelation("schema", table); + var paginate = new LogicalPaginate(42, List.of(project(relation))); + assertNull(relation.getPageSize()); + LogicalPlanOptimizer.paginationCreate().optimize(paginate); + assertEquals(42, relation.getPageSize()); + } + + @Test + void push_page_size_noop_if_no_relation() { + var paginate = new LogicalPaginate(42, List.of(project(values()))); + LogicalPlanOptimizer.paginationCreate().optimize(paginate); + } + + @Test + void push_page_size_noop_if_no_sub_plans() { + var paginate = new LogicalPaginate(42, List.of()); + LogicalPlanOptimizer.paginationCreate().optimize(paginate); + } + + @Test + void 
table_scan_builder_support_offset_push_down_can_apply_its_rule() { + when(table.createPagedScanBuilder(anyInt())).thenReturn(tableScanBuilder); + + var optimized = LogicalPlanOptimizer.paginationCreate() + .optimize(new LogicalPaginate(42, List.of(project(relation("schema", table))))); + // `optimized` structure: LogicalPaginate -> LogicalProject -> TableScanBuilder + // LogicalRelation replaced by a TableScanBuilder instance + assertEquals(tableScanBuilder, optimized.getChild().get(0).getChild().get(0)); + } + private LogicalPlan optimize(LogicalPlan plan) { final LogicalPlanOptimizer optimizer = LogicalPlanOptimizer.create(); final LogicalPlan optimize = optimizer.optimize(plan); diff --git a/core/src/test/java/org/opensearch/sql/planner/optimizer/pattern/PatternsTest.java b/core/src/test/java/org/opensearch/sql/planner/optimizer/pattern/PatternsTest.java index 9f90fd8d055..1fd572e7daf 100644 --- a/core/src/test/java/org/opensearch/sql/planner/optimizer/pattern/PatternsTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/optimizer/pattern/PatternsTest.java @@ -6,35 +6,49 @@ package org.opensearch.sql.planner.optimizer.pattern; +import static org.junit.jupiter.api.Assertions.assertAll; import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.mockito.Mockito.mock; import static org.mockito.Mockito.when; import java.util.Collections; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; import org.junit.jupiter.api.Test; -import org.junit.jupiter.api.extension.ExtendWith; -import org.mockito.Mock; -import org.mockito.Mockito; -import org.mockito.junit.jupiter.MockitoExtension; import org.opensearch.sql.planner.logical.LogicalFilter; +import org.opensearch.sql.planner.logical.LogicalPaginate; import org.opensearch.sql.planner.logical.LogicalPlan; -@ExtendWith(MockitoExtension.class) 
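
The `push_page_size` tests above exercise a rule that copies the page size from a `LogicalPaginate` onto every `LogicalRelation` beneath it. A minimal sketch of that traversal, using hypothetical `Node`/`Relation`/`Paginate` stand-ins rather than the real `LogicalPlan` classes:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative stand-ins for logical plan nodes; names are not from the project.
class Node {
    final List<Node> children = new ArrayList<>();
    Node(Node... kids) { children.addAll(List.of(kids)); }
}

class Relation extends Node {
    Integer pageSize; // null until the optimizer pushes a value down
}

class Paginate extends Node {
    final int pageSize;
    Paginate(int pageSize, Node child) { super(child); this.pageSize = pageSize; }
}

public class PushPageSizeSketch {
    // Depth-first walk: copy the paginate's page size onto every relation below it.
    static void pushPageSize(Node node, int pageSize) {
        if (node instanceof Relation) {
            ((Relation) node).pageSize = pageSize;
        }
        for (Node child : node.children) {
            pushPageSize(child, pageSize);
        }
    }

    public static void main(String[] args) {
        Relation relation = new Relation();
        // Shape mirrors the test: Paginate -> Project (plain Node here) -> Relation.
        Paginate paginate = new Paginate(42, new Node(relation));
        pushPageSize(paginate, paginate.pageSize);
        System.out.println(relation.pageSize); // prints 42
    }
}
```

If no `Relation` appears under the paginate (the `noop` tests above), the walk simply terminates without mutating anything.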
+@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) class PatternsTest { - @Mock - LogicalPlan plan; - @Test void source_is_empty() { + var plan = mock(LogicalPlan.class); when(plan.getChild()).thenReturn(Collections.emptyList()); - assertFalse(Patterns.source().getFunction().apply(plan).isPresent()); - assertFalse(Patterns.source(null).getProperty().getFunction().apply(plan).isPresent()); + assertAll( + () -> assertFalse(Patterns.source().getFunction().apply(plan).isPresent()), + () -> assertFalse(Patterns.source(null).getProperty().getFunction().apply(plan).isPresent()) + ); } @Test void table_is_empty() { - plan = Mockito.mock(LogicalFilter.class); - assertFalse(Patterns.table().getFunction().apply(plan).isPresent()); - assertFalse(Patterns.writeTable().getFunction().apply(plan).isPresent()); + var plan = mock(LogicalFilter.class); + assertAll( + () -> assertFalse(Patterns.table().getFunction().apply(plan).isPresent()), + () -> assertFalse(Patterns.writeTable().getFunction().apply(plan).isPresent()) + ); + } + + @Test + void pagination() { + assertAll( + () -> assertTrue(Patterns.pagination().getFunction() + .apply(mock(LogicalPaginate.class)).isPresent()), + () -> assertFalse(Patterns.pagination().getFunction() + .apply(mock(LogicalFilter.class)).isPresent()) + ); } } diff --git a/core/src/test/java/org/opensearch/sql/planner/physical/FilterOperatorTest.java b/core/src/test/java/org/opensearch/sql/planner/physical/FilterOperatorTest.java index 288b4bf661f..f541f6a15f1 100644 --- a/core/src/test/java/org/opensearch/sql/planner/physical/FilterOperatorTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/physical/FilterOperatorTest.java @@ -17,22 +17,30 @@ import com.google.common.collect.ImmutableMap; import java.util.LinkedHashMap; import java.util.List; +import java.util.Map; +import java.util.stream.Collectors; +import java.util.stream.Stream; +import org.junit.jupiter.api.DisplayNameGeneration; +import 
org.junit.jupiter.api.DisplayNameGenerator; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.AdditionalAnswers; import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.sql.data.model.ExprIntegerValue; import org.opensearch.sql.data.model.ExprTupleValue; import org.opensearch.sql.data.model.ExprValue; import org.opensearch.sql.data.model.ExprValueUtils; import org.opensearch.sql.expression.DSL; @ExtendWith(MockitoExtension.class) +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) class FilterOperatorTest extends PhysicalPlanTestBase { @Mock private PhysicalPlan inputPlan; @Test - public void filterTest() { + public void filter_test() { FilterOperator plan = new FilterOperator(new TestScan(), DSL.equal(DSL.ref("response", INTEGER), DSL.literal(404))); List result = execute(plan); @@ -41,10 +49,11 @@ public void filterTest() { .tupleValue(ImmutableMap .of("ip", "209.160.24.63", "action", "GET", "response", 404, "referer", "www.amazon.com")))); + assertEquals(1, plan.getTotalHits()); } @Test - public void nullValueShouldBeenIgnored() { + public void null_value_should_been_ignored() { LinkedHashMap value = new LinkedHashMap<>(); value.put("response", LITERAL_NULL); when(inputPlan.hasNext()).thenReturn(true, false); @@ -54,10 +63,11 @@ public void nullValueShouldBeenIgnored() { DSL.equal(DSL.ref("response", INTEGER), DSL.literal(404))); List result = execute(plan); assertEquals(0, result.size()); + assertEquals(0, plan.getTotalHits()); } @Test - public void missingValueShouldBeenIgnored() { + public void missing_value_should_been_ignored() { LinkedHashMap value = new LinkedHashMap<>(); value.put("response", LITERAL_MISSING); when(inputPlan.hasNext()).thenReturn(true, false); @@ -67,5 +77,21 @@ public void missingValueShouldBeenIgnored() { DSL.equal(DSL.ref("response", INTEGER), DSL.literal(404))); List result = execute(plan); assertEquals(0, 
result.size()); + assertEquals(0, plan.getTotalHits()); + } + + @Test + public void totalHits() { + when(inputPlan.hasNext()).thenReturn(true, true, true, true, true, false); + var answers = Stream.of(200, 240, 300, 403, 404).map(c -> + new ExprTupleValue(new LinkedHashMap<>(Map.of("response", new ExprIntegerValue(c))))) + .collect(Collectors.toList()); + when(inputPlan.next()).thenAnswer(AdditionalAnswers.returnsElementsOf(answers)); + + FilterOperator plan = new FilterOperator(inputPlan, + DSL.less(DSL.ref("response", INTEGER), DSL.literal(400))); + List result = execute(plan); + assertEquals(3, result.size()); + assertEquals(3, plan.getTotalHits()); } } diff --git a/core/src/test/java/org/opensearch/sql/planner/physical/PaginateOperatorTest.java b/core/src/test/java/org/opensearch/sql/planner/physical/PaginateOperatorTest.java new file mode 100644 index 00000000000..3e0efc3b50b --- /dev/null +++ b/core/src/test/java/org/opensearch/sql/planner/physical/PaginateOperatorTest.java @@ -0,0 +1,104 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + + +package org.opensearch.sql.planner.physical; + +import static org.junit.jupiter.api.Assertions.assertAll; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertNull; +import static org.junit.jupiter.api.Assertions.assertSame; +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.mockito.Mockito.CALLS_REAL_METHODS; +import static org.mockito.Mockito.doNothing; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.times; +import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.when; +import static org.mockito.Mockito.withSettings; +import static org.opensearch.sql.data.type.ExprCoreType.INTEGER; +import static 
org.opensearch.sql.data.type.ExprCoreType.STRING; +import static org.opensearch.sql.planner.physical.PhysicalPlanDSL.project; + +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; +import org.junit.jupiter.api.Test; +import org.opensearch.sql.data.model.ExprIntegerValue; +import org.opensearch.sql.expression.DSL; + +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) +public class PaginateOperatorTest extends PhysicalPlanTestBase { + + @Test + public void accept() { + var visitor = new PhysicalPlanNodeVisitor() {}; + assertNull(new PaginateOperator(null, 42).accept(visitor, null)); + } + + @Test + public void hasNext_a_page() { + var plan = mock(PhysicalPlan.class); + when(plan.hasNext()).thenReturn(true); + when(plan.next()).thenReturn(new ExprIntegerValue(42)).thenReturn(null); + var paginate = new PaginateOperator(plan, 1, 1); + assertTrue(paginate.hasNext()); + assertEquals(42, paginate.next().integerValue()); + paginate.next(); + assertFalse(paginate.hasNext()); + assertNull(paginate.next()); + } + + @Test + public void hasNext_no_more_entries() { + var plan = mock(PhysicalPlan.class); + when(plan.hasNext()).thenReturn(false); + var paginate = new PaginateOperator(plan, 1, 1); + assertFalse(paginate.hasNext()); + } + + @Test + public void getChild() { + var plan = mock(PhysicalPlan.class); + var paginate = new PaginateOperator(plan, 1); + assertSame(plan, paginate.getChild().get(0)); + } + + @Test + public void open() { + var plan = mock(PhysicalPlan.class); + doNothing().when(plan).open(); + new PaginateOperator(plan, 1).open(); + verify(plan, times(1)).open(); + } + + @Test + public void schema() { + PhysicalPlan project = project(null, + DSL.named("response", DSL.ref("response", INTEGER)), + DSL.named("action", DSL.ref("action", STRING), "act")); + assertEquals(project.schema(), new PaginateOperator(project, 42).schema()); + } + + @Test + public void schema_assert() { + var plan = 
mock(PhysicalPlan.class, withSettings().defaultAnswer(CALLS_REAL_METHODS)); + assertThrows(Throwable.class, () -> new PaginateOperator(plan, 42).schema()); + } + + @Test + public void toCursor() { + var plan = mock(PhysicalPlan.class); + when(plan.toCursor()).thenReturn("Great plan, Walter, reliable as a swiss watch!", "", null); + var po = new PaginateOperator(plan, 2); + assertAll( + () -> assertEquals("(Paginate,1,2,Great plan, Walter, reliable as a swiss watch!)", + po.toCursor()), + () -> assertNull(po.toCursor()), + () -> assertNull(po.toCursor()) + ); + } +} diff --git a/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitorTest.java b/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitorTest.java index 735b914d3e1..3dfe0b5c0fd 100644 --- a/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitorTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitorTest.java @@ -158,6 +158,14 @@ public void test_visitML() { assertNull(physicalPlanNodeVisitor.visitML(plan, null)); } + @Test + public void test_visitPaginate() { + PhysicalPlanNodeVisitor physicalPlanNodeVisitor = + new PhysicalPlanNodeVisitor() {}; + + assertNull(physicalPlanNodeVisitor.visitPaginate(new PaginateOperator(plan, 42), null)); + } + public static class PhysicalPlanPrinter extends PhysicalPlanNodeVisitor { public String print(PhysicalPlan node) { diff --git a/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanTest.java b/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanTest.java index 0a93c96bbb2..5e70f2b9d01 100644 --- a/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanTest.java @@ -5,9 +5,19 @@ package org.opensearch.sql.planner.physical; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static 
org.junit.jupiter.api.Assertions.assertThrows; +import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.ArgumentMatchers.anyString; +import static org.mockito.Mockito.CALLS_REAL_METHODS; +import static org.mockito.Mockito.mock; import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.when; import java.util.List; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; @@ -16,6 +26,7 @@ import org.opensearch.sql.storage.split.Split; @ExtendWith(MockitoExtension.class) +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) class PhysicalPlanTest { @Mock Split split; @@ -46,8 +57,40 @@ public List getChild() { }; @Test - void addSplitToChildByDefault() { + void add_split_to_child_by_default() { testPlan.add(split); verify(child).add(split); } + + @Test + void get_total_hits_from_child() { + var plan = mock(PhysicalPlan.class); + when(child.getTotalHits()).thenReturn(42L); + when(plan.getChild()).thenReturn(List.of(child)); + when(plan.getTotalHits()).then(CALLS_REAL_METHODS); + assertEquals(42, plan.getTotalHits()); + verify(child).getTotalHits(); + } + + @Test + void get_total_hits_uses_default_value() { + var plan = mock(PhysicalPlan.class); + when(plan.getTotalHits()).then(CALLS_REAL_METHODS); + assertEquals(0, plan.getTotalHits()); + } + + @Test + void toCursor() { + var plan = mock(PhysicalPlan.class); + when(plan.toCursor()).then(CALLS_REAL_METHODS); + assertTrue(assertThrows(IllegalStateException.class, plan::toCursor) + .getMessage().contains("is not compatible with cursor feature")); + } + + @Test + void createSection() { + var plan = mock(PhysicalPlan.class); + when(plan.createSection(anyString(), any())).then(CALLS_REAL_METHODS); + assertEquals("(plan,one,two)", plan.createSection("plan", 
"one", "two")); + } } diff --git a/core/src/test/java/org/opensearch/sql/planner/physical/ProjectOperatorTest.java b/core/src/test/java/org/opensearch/sql/planner/physical/ProjectOperatorTest.java index 24be5eb2b8d..6042eba6dcc 100644 --- a/core/src/test/java/org/opensearch/sql/planner/physical/ProjectOperatorTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/physical/ProjectOperatorTest.java @@ -11,6 +11,9 @@ import static org.hamcrest.Matchers.contains; import static org.hamcrest.Matchers.hasItems; import static org.hamcrest.Matchers.iterableWithSize; +import static org.junit.jupiter.api.Assertions.assertAll; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertNull; import static org.mockito.Mockito.when; import static org.opensearch.sql.data.model.ExprValueUtils.LITERAL_MISSING; import static org.opensearch.sql.data.model.ExprValueUtils.stringValue; @@ -30,11 +33,12 @@ import org.opensearch.sql.data.model.ExprValueUtils; import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.expression.DSL; +import org.opensearch.sql.expression.serialization.DefaultExpressionSerializer; @ExtendWith(MockitoExtension.class) class ProjectOperatorTest extends PhysicalPlanTestBase { - @Mock + @Mock(serializable = true) private PhysicalPlan inputPlan; @Test @@ -206,4 +210,20 @@ public void project_parse_missing_will_fallback() { ExprValueUtils.tupleValue(ImmutableMap.of("action", "GET", "response", "200")), ExprValueUtils.tupleValue(ImmutableMap.of("action", "POST"))))); } + + @Test + public void toCursor() { + when(inputPlan.toCursor()).thenReturn("inputPlan", "", null); + var project = DSL.named("response", DSL.ref("response", INTEGER)); + var npe = DSL.named("action", DSL.ref("action", STRING)); + var po = project(inputPlan, List.of(project), List.of(npe)); + var serializer = new DefaultExpressionSerializer(); + var expected = 
String.format("(Project,(namedParseExpressions,%s),(projectList,%s),%s)", + serializer.serialize(npe), serializer.serialize(project), "inputPlan"); + assertAll( + () -> assertEquals(expected, po.toCursor()), + () -> assertNull(po.toCursor()), + () -> assertNull(po.toCursor()) + ); + } } diff --git a/core/src/test/java/org/opensearch/sql/planner/physical/RemoveOperatorTest.java b/core/src/test/java/org/opensearch/sql/planner/physical/RemoveOperatorTest.java index bf046bf0a6b..ec950e6016b 100644 --- a/core/src/test/java/org/opensearch/sql/planner/physical/RemoveOperatorTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/physical/RemoveOperatorTest.java @@ -113,12 +113,11 @@ public void remove_nothing_with_none_tuple_value() { @Test public void invalid_to_retrieve_schema_from_remove() { - PhysicalPlan plan = remove(inputPlan, DSL.ref("response", STRING), DSL.ref("referer", STRING)); + PhysicalPlan plan = remove(inputPlan); IllegalStateException exception = assertThrows(IllegalStateException.class, () -> plan.schema()); assertEquals( - "[BUG] schema can been only applied to ProjectOperator, " - + "instead of RemoveOperator(input=inputPlan, removeList=[response, referer])", + "[BUG] schema can been only applied to ProjectOperator, instead of RemoveOperator", exception.getMessage()); } } diff --git a/core/src/test/java/org/opensearch/sql/planner/physical/ValuesOperatorTest.java b/core/src/test/java/org/opensearch/sql/planner/physical/ValuesOperatorTest.java index 9acab03d2bd..bf6d28a23c6 100644 --- a/core/src/test/java/org/opensearch/sql/planner/physical/ValuesOperatorTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/physical/ValuesOperatorTest.java @@ -9,6 +9,7 @@ import static org.hamcrest.MatcherAssert.assertThat; import static org.hamcrest.Matchers.contains; import static org.hamcrest.Matchers.empty; +import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.is; import static 
org.opensearch.sql.data.model.ExprValueUtils.collectionValue; import static org.opensearch.sql.expression.DSL.literal; @@ -44,6 +45,7 @@ public void iterateSingleRow() { results, contains(collectionValue(Arrays.asList(1, "abc"))) ); + assertThat(values.getTotalHits(), equalTo(1L)); } } diff --git a/core/src/test/java/org/opensearch/sql/storage/StorageEngineTest.java b/core/src/test/java/org/opensearch/sql/storage/StorageEngineTest.java index 0e969c6dac3..9c96459d061 100644 --- a/core/src/test/java/org/opensearch/sql/storage/StorageEngineTest.java +++ b/core/src/test/java/org/opensearch/sql/storage/StorageEngineTest.java @@ -13,11 +13,16 @@ public class StorageEngineTest { - @Test void testFunctionsMethod() { StorageEngine k = (dataSourceSchemaName, tableName) -> null; Assertions.assertEquals(Collections.emptyList(), k.getFunctions()); } + @Test + void getTableScan() { + StorageEngine k = (dataSourceSchemaName, tableName) -> null; + Assertions.assertThrows(UnsupportedOperationException.class, + () -> k.getTableScan("indexName", "scrollId")); + } } diff --git a/core/src/test/java/org/opensearch/sql/storage/TableTest.java b/core/src/test/java/org/opensearch/sql/storage/TableTest.java new file mode 100644 index 00000000000..2a2b5550145 --- /dev/null +++ b/core/src/test/java/org/opensearch/sql/storage/TableTest.java @@ -0,0 +1,25 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.storage; + +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.withSettings; + +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; +import org.junit.jupiter.api.Test; +import org.mockito.invocation.InvocationOnMock; + +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) +public class TableTest { + + @Test + public void createPagedScanBuilder_throws() { + var table = 
mock(Table.class, withSettings().defaultAnswer(InvocationOnMock::callRealMethod)); + assertThrows(Throwable.class, () -> table.createPagedScanBuilder(0)); + } +} diff --git a/core/src/testFixtures/java/org/opensearch/sql/executor/DefaultExecutionEngine.java b/core/src/testFixtures/java/org/opensearch/sql/executor/DefaultExecutionEngine.java index e4f9a185a30..3849d686a6a 100644 --- a/core/src/testFixtures/java/org/opensearch/sql/executor/DefaultExecutionEngine.java +++ b/core/src/testFixtures/java/org/opensearch/sql/executor/DefaultExecutionEngine.java @@ -9,6 +9,7 @@ import java.util.List; import org.opensearch.sql.common.response.ResponseListener; import org.opensearch.sql.data.model.ExprValue; +import org.opensearch.sql.executor.pagination.Cursor; import org.opensearch.sql.planner.physical.PhysicalPlan; /** @@ -32,7 +33,8 @@ public void execute( while (plan.hasNext()) { result.add(plan.next()); } - QueryResponse response = new QueryResponse(new Schema(new ArrayList<>()), new ArrayList<>()); + QueryResponse response = new QueryResponse(new Schema(new ArrayList<>()), new ArrayList<>(), + 0, Cursor.None); listener.onResponse(response); } catch (Exception e) { listener.onFailure(e); diff --git a/integ-test/build.gradle b/integ-test/build.gradle index 5a707a17b0d..0f1ee8cb1c2 100644 --- a/integ-test/build.gradle +++ b/integ-test/build.gradle @@ -120,6 +120,11 @@ compileTestJava { testClusters.all { testDistribution = 'archive' + + // debug with command, ./gradlew opensearch-sql:run -DdebugJVM. --debug-jvm does not work with keystore. + if (System.getProperty("debugJVM") != null) { + jvmArgs '-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=*:5005' + } } testClusters.integTest { @@ -178,10 +183,16 @@ integTest { // Tell the test JVM if the cluster JVM is running under a debugger so that tests can use longer timeouts for // requests. The 'doFirst' delays reading the debug setting on the cluster till execution time. 
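
The two `-agentlib:jdwp` strings configured in this build script differ only in the `suspend` flag and the port (5005 for the cluster JVM, 5006 for the test JVM, so both can be attached to at once). A small illustrative parser makes the option layout explicit; the helper and its names are not part of the project:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class JdwpOptions {
    // Split "transport=dt_socket,server=y,suspend=n,address=*:5005"
    // into ordered key/value pairs.
    static Map<String, String> parse(String agentArgs) {
        Map<String, String> options = new LinkedHashMap<>();
        for (String pair : agentArgs.split(",")) {
            String[] kv = pair.split("=", 2);
            options.put(kv[0], kv[1]);
        }
        return options;
    }

    public static void main(String[] args) {
        // Cluster JVM: starts immediately (suspend=n) and listens on 5005.
        var cluster = parse("transport=dt_socket,server=y,suspend=n,address=*:5005");
        // Test JVM: waits for a debugger (suspend=y) on a different port, 5006.
        var test = parse("transport=dt_socket,server=y,suspend=y,address=*:5006");
        System.out.println(cluster.get("address") + " " + test.get("suspend")); // prints *:5005 y
    }
}
```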
- doFirst { systemProperty 'cluster.debug', getDebug() } + doFirst { + if (System.getProperty("debug-jvm") != null) { + setDebug(true); + } + systemProperty 'cluster.debug', getDebug() + } + if (System.getProperty("test.debug") != null) { - jvmArgs '-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=*:5005' + jvmArgs '-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=*:5006' } if (System.getProperty("tests.rest.bwcsuite") == null) { diff --git a/integ-test/src/test/java/org/opensearch/sql/legacy/CursorIT.java b/integ-test/src/test/java/org/opensearch/sql/legacy/CursorIT.java index 113a19885a7..5b9a583d04b 100644 --- a/integ-test/src/test/java/org/opensearch/sql/legacy/CursorIT.java +++ b/integ-test/src/test/java/org/opensearch/sql/legacy/CursorIT.java @@ -123,11 +123,16 @@ public void validNumberOfPages() throws IOException { String selectQuery = StringUtils.format("SELECT firstname, state FROM %s", TEST_INDEX_ACCOUNT); JSONObject response = new JSONObject(executeFetchQuery(selectQuery, 50, JDBC)); String cursor = response.getString(CURSOR); + verifyIsV1Cursor(cursor); + int pageCount = 1; while (!cursor.isEmpty()) { //this condition also checks that there is no cursor on last page response = executeCursorQuery(cursor); cursor = response.optString(CURSOR); + if (!cursor.isEmpty()) { + verifyIsV1Cursor(cursor); + } pageCount++; } @@ -136,12 +141,16 @@ public void validNumberOfPages() throws IOException { // using random value here, with fetch size of 28 we should get 36 pages (ceil of 1000/28) response = new JSONObject(executeFetchQuery(selectQuery, 28, JDBC)); cursor = response.getString(CURSOR); + verifyIsV1Cursor(cursor); System.out.println(response); pageCount = 1; while (!cursor.isEmpty()) { response = executeCursorQuery(cursor); cursor = response.optString(CURSOR); + if (!cursor.isEmpty()) { + verifyIsV1Cursor(cursor); + } pageCount++; } assertThat(pageCount, equalTo(36)); @@ -223,6 +232,7 @@ public void testCursorWithPreparedStatement() 
throws IOException { "}", TestsConstants.TEST_INDEX_ACCOUNT)); assertTrue(response.has(CURSOR)); + verifyIsV1Cursor(response.getString(CURSOR)); } @Test @@ -244,11 +254,13 @@ public void testRegressionOnDateFormatChange() throws IOException { StringUtils.format("SELECT login_time FROM %s LIMIT 500", TEST_INDEX_DATE_TIME); JSONObject response = new JSONObject(executeFetchQuery(selectQuery, 1, JDBC)); String cursor = response.getString(CURSOR); + verifyIsV1Cursor(cursor); actualDateList.add(response.getJSONArray(DATAROWS).getJSONArray(0).getString(0)); while (!cursor.isEmpty()) { response = executeCursorQuery(cursor); cursor = response.optString(CURSOR); + verifyIsV1Cursor(cursor); actualDateList.add(response.getJSONArray(DATAROWS).getJSONArray(0).getString(0)); } @@ -274,7 +286,6 @@ public void defaultBehaviorWhenCursorSettingIsDisabled() throws IOException { query = StringUtils.format("SELECT firstname, email, state FROM %s", TEST_INDEX_ACCOUNT); response = new JSONObject(executeFetchQuery(query, 100, JDBC)); assertTrue(response.has(CURSOR)); - wipeAllClusterSettings(); } @@ -305,12 +316,14 @@ public void testDefaultFetchSizeFromClusterSettings() throws IOException { JSONObject response = new JSONObject(executeFetchLessQuery(query, JDBC)); JSONArray datawRows = response.optJSONArray(DATAROWS); assertThat(datawRows.length(), equalTo(1000)); + verifyIsV1Cursor(response.getString(CURSOR)); updateClusterSettings(new ClusterSetting(TRANSIENT, "opensearch.sql.cursor.fetch_size", "786")); response = new JSONObject(executeFetchLessQuery(query, JDBC)); datawRows = response.optJSONArray(DATAROWS); assertThat(datawRows.length(), equalTo(786)); assertTrue(response.has(CURSOR)); + verifyIsV1Cursor(response.getString(CURSOR)); wipeAllClusterSettings(); } @@ -323,11 +336,12 @@ public void testCursorCloseAPI() throws IOException { "SELECT firstname, state FROM %s WHERE balance > 100 and age < 40", TEST_INDEX_ACCOUNT); JSONObject result = new 
JSONObject(executeFetchQuery(selectQuery, 50, JDBC)); String cursor = result.getString(CURSOR); - + verifyIsV1Cursor(cursor); // Retrieving next 10 pages out of remaining 19 pages for (int i = 0; i < 10; i++) { result = executeCursorQuery(cursor); cursor = result.optString(CURSOR); + verifyIsV1Cursor(cursor); } //Closing the cursor JSONObject closeResp = executeCursorCloseQuery(cursor); @@ -386,12 +400,14 @@ public void respectLimitPassedInSelectClause() throws IOException { StringUtils.format("SELECT age, balance FROM %s LIMIT %s", TEST_INDEX_ACCOUNT, limit); JSONObject response = new JSONObject(executeFetchQuery(selectQuery, 50, JDBC)); String cursor = response.getString(CURSOR); + verifyIsV1Cursor(cursor); int actualDataRowCount = response.getJSONArray(DATAROWS).length(); int pageCount = 1; while (!cursor.isEmpty()) { response = executeCursorQuery(cursor); cursor = response.optString(CURSOR); + verifyIsV1Cursor(cursor); actualDataRowCount += response.getJSONArray(DATAROWS).length(); pageCount++; } @@ -432,10 +448,12 @@ public void verifyWithAndWithoutPaginationResponse(String sqlQuery, String curso response.optJSONArray(DATAROWS).forEach(dataRows::put); String cursor = response.getString(CURSOR); + verifyIsV1Cursor(cursor); while (!cursor.isEmpty()) { response = executeCursorQuery(cursor); response.optJSONArray(DATAROWS).forEach(dataRows::put); cursor = response.optString(CURSOR); + verifyIsV1Cursor(cursor); } verifySchema(withoutCursorResponse.optJSONArray(SCHEMA), @@ -465,6 +483,13 @@ public String executeFetchAsStringQuery(String query, String fetchSize, String r return responseString; } + private void verifyIsV1Cursor(String cursor) { + if (cursor.isEmpty()) { + return; + } + assertTrue("The cursor '" + cursor + "' is not from v1 engine.", cursor.startsWith("d:")); + } + private String makeRequest(String query, String fetch_size) { return String.format("{" + " \"fetch_size\": \"%s\"," + diff --git 
a/integ-test/src/test/java/org/opensearch/sql/legacy/SQLIntegTestCase.java b/integ-test/src/test/java/org/opensearch/sql/legacy/SQLIntegTestCase.java index 360497300e3..dbd37835a7a 100644 --- a/integ-test/src/test/java/org/opensearch/sql/legacy/SQLIntegTestCase.java +++ b/integ-test/src/test/java/org/opensearch/sql/legacy/SQLIntegTestCase.java @@ -257,6 +257,16 @@ protected String executeFetchQuery(String query, int fetchSize, String requestTy return responseString; } + protected JSONObject executeQueryTemplate(String queryTemplate, String index, int fetchSize) + throws IOException { + var query = String.format(queryTemplate, index); + return new JSONObject(executeFetchQuery(query, fetchSize, "jdbc")); + } + + protected JSONObject executeQueryTemplate(String queryTemplate, String index) throws IOException { + return executeQueryTemplate(queryTemplate, index, 4); + } + protected String executeFetchLessQuery(String query, String requestType) throws IOException { String endpoint = "/_plugins/_sql?format=" + requestType; diff --git a/integ-test/src/test/java/org/opensearch/sql/ppl/StandaloneIT.java b/integ-test/src/test/java/org/opensearch/sql/ppl/StandaloneIT.java index 0c900ea2340..ee568b7dbd4 100644 --- a/integ-test/src/test/java/org/opensearch/sql/ppl/StandaloneIT.java +++ b/integ-test/src/test/java/org/opensearch/sql/ppl/StandaloneIT.java @@ -29,40 +29,41 @@ import org.opensearch.common.inject.Singleton; import org.opensearch.sql.analysis.Analyzer; import org.opensearch.sql.analysis.ExpressionAnalyzer; -import org.opensearch.sql.common.response.ResponseListener; -import org.opensearch.sql.common.setting.Settings; -import org.opensearch.sql.datasource.DataSourceMetadataStorage; -import org.opensearch.sql.datasource.DataSourceService; -import org.opensearch.sql.datasource.DataSourceServiceImpl; -import org.opensearch.sql.datasource.DataSourceUserAuthorizationHelper; -import 
org.opensearch.sql.datasource.model.DataSourceMetadata; -import org.opensearch.sql.executor.ExecutionEngine; -import org.opensearch.sql.executor.ExecutionEngine.QueryResponse; import org.opensearch.sql.executor.QueryManager; import org.opensearch.sql.executor.QueryService; import org.opensearch.sql.executor.execution.QueryPlanFactory; +import org.opensearch.sql.executor.pagination.PaginatedPlanCache; import org.opensearch.sql.expression.function.BuiltinFunctionRepository; import org.opensearch.sql.monitor.AlwaysHealthyMonitor; import org.opensearch.sql.monitor.ResourceMonitor; -import org.opensearch.sql.opensearch.client.OpenSearchClient; -import org.opensearch.sql.opensearch.client.OpenSearchRestClient; import org.opensearch.sql.opensearch.executor.OpenSearchExecutionEngine; import org.opensearch.sql.opensearch.executor.protector.ExecutionProtector; import org.opensearch.sql.opensearch.executor.protector.OpenSearchExecutionProtector; -import org.opensearch.sql.opensearch.security.SecurityAccess; -import org.opensearch.sql.opensearch.storage.OpenSearchDataSourceFactory; import org.opensearch.sql.opensearch.storage.OpenSearchStorageEngine; import org.opensearch.sql.planner.Planner; import org.opensearch.sql.planner.optimizer.LogicalPlanOptimizer; import org.opensearch.sql.ppl.antlr.PPLSyntaxParser; -import org.opensearch.sql.ppl.domain.PPLQueryRequest; -import org.opensearch.sql.protocol.response.QueryResult; -import org.opensearch.sql.protocol.response.format.SimpleJsonResponseFormatter; import org.opensearch.sql.sql.SQLService; import org.opensearch.sql.sql.antlr.SQLSyntaxParser; -import org.opensearch.sql.storage.DataSourceFactory; import org.opensearch.sql.storage.StorageEngine; import org.opensearch.sql.util.ExecuteOnCallerThreadQueryManager; +import org.opensearch.sql.common.response.ResponseListener; +import org.opensearch.sql.common.setting.Settings; +import org.opensearch.sql.datasource.DataSourceMetadataStorage; +import 
org.opensearch.sql.datasource.DataSourceService; +import org.opensearch.sql.datasource.DataSourceServiceImpl; +import org.opensearch.sql.datasource.DataSourceUserAuthorizationHelper; +import org.opensearch.sql.datasource.model.DataSourceMetadata; +import org.opensearch.sql.executor.ExecutionEngine; +import org.opensearch.sql.executor.ExecutionEngine.QueryResponse; +import org.opensearch.sql.opensearch.client.OpenSearchClient; +import org.opensearch.sql.opensearch.client.OpenSearchRestClient; +import org.opensearch.sql.opensearch.security.SecurityAccess; +import org.opensearch.sql.opensearch.storage.OpenSearchDataSourceFactory; +import org.opensearch.sql.ppl.domain.PPLQueryRequest; +import org.opensearch.sql.protocol.response.QueryResult; +import org.opensearch.sql.protocol.response.format.SimpleJsonResponseFormatter; +import org.opensearch.sql.storage.DataSourceFactory; /** * Run PPL with query engine outside OpenSearch cluster. This IT doesn't require our plugin @@ -71,13 +72,11 @@ */ public class StandaloneIT extends PPLIntegTestCase { - private RestHighLevelClient restClient; - private PPLService pplService; @Override public void init() { - restClient = new InternalRestHighLevelClient(client()); + RestHighLevelClient restClient = new InternalRestHighLevelClient(client()); OpenSearchClient client = new OpenSearchRestClient(restClient); DataSourceService dataSourceService = new DataSourceServiceImpl( new ImmutableSet.Builder() @@ -198,8 +197,9 @@ public StorageEngine storageEngine(OpenSearchClient client) { } @Provides - public ExecutionEngine executionEngine(OpenSearchClient client, ExecutionProtector protector) { - return new OpenSearchExecutionEngine(client, protector); + public ExecutionEngine executionEngine(OpenSearchClient client, ExecutionProtector protector, + PaginatedPlanCache paginatedPlanCache) { + return new OpenSearchExecutionEngine(client, protector, paginatedPlanCache); } @Provides @@ -229,17 +229,24 @@ public SQLService sqlService(QueryManager 
queryManager, QueryPlanFactory queryPl } @Provides - public QueryPlanFactory queryPlanFactory(ExecutionEngine executionEngine) { + public PaginatedPlanCache paginatedPlanCache(StorageEngine storageEngine) { + return new PaginatedPlanCache(storageEngine); + } + + @Provides + public QueryPlanFactory queryPlanFactory(ExecutionEngine executionEngine, + PaginatedPlanCache paginatedPlanCache) { Analyzer analyzer = new Analyzer( new ExpressionAnalyzer(functionRepository), dataSourceService, functionRepository); Planner planner = new Planner(LogicalPlanOptimizer.create()); - return new QueryPlanFactory(new QueryService(analyzer, executionEngine, planner)); + Planner paginationPlanner = new Planner(LogicalPlanOptimizer.paginationCreate()); + QueryService queryService = new QueryService(analyzer, executionEngine, planner, paginationPlanner); + return new QueryPlanFactory(queryService, paginatedPlanCache); } } - - private DataSourceMetadataStorage getDataSourceMetadataStorage() { + public static DataSourceMetadataStorage getDataSourceMetadataStorage() { return new DataSourceMetadataStorage() { @Override public List getDataSourceMetadata() { @@ -268,7 +275,7 @@ public void deleteDataSourceMetadata(String datasourceName) { }; } - private DataSourceUserAuthorizationHelper getDataSourceUserRoleHelper() { + public static DataSourceUserAuthorizationHelper getDataSourceUserRoleHelper() { return new DataSourceUserAuthorizationHelper() { @Override public void authorizeDataSource(DataSourceMetadata dataSourceMetadata) { @@ -276,5 +283,4 @@ public void authorizeDataSource(DataSourceMetadata dataSourceMetadata) { } }; } - } diff --git a/integ-test/src/test/java/org/opensearch/sql/sql/HighlightFunctionIT.java b/integ-test/src/test/java/org/opensearch/sql/sql/HighlightFunctionIT.java index 809e2dc7c51..0ab6d5c70fb 100644 --- a/integ-test/src/test/java/org/opensearch/sql/sql/HighlightFunctionIT.java +++ b/integ-test/src/test/java/org/opensearch/sql/sql/HighlightFunctionIT.java @@ -64,7 
+64,7 @@ public void highlight_multiple_optional_arguments_test() { schema("highlight(Body, pre_tags='', " + "post_tags='')", null, "nested")); - assertEquals(1, response.getInt("total")); + assertEquals(1, response.getInt("size")); verifyDataRows(response, rows(new JSONArray(List.of("What are the differences between an IPA" + " and its variants?")), diff --git a/integ-test/src/test/java/org/opensearch/sql/sql/PaginationBlackboxIT.java b/integ-test/src/test/java/org/opensearch/sql/sql/PaginationBlackboxIT.java new file mode 100644 index 00000000000..d8213b1fe44 --- /dev/null +++ b/integ-test/src/test/java/org/opensearch/sql/sql/PaginationBlackboxIT.java @@ -0,0 +1,117 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + + +package org.opensearch.sql.sql; + +import static org.opensearch.sql.legacy.TestUtils.getResponseBody; +import static org.opensearch.sql.legacy.TestUtils.isIndexExist; +import static org.opensearch.sql.legacy.TestsConstants.TEST_INDEX_ONLINE; + +import java.util.ArrayList; +import java.util.Arrays; +import java.util.List; +import java.util.stream.Collectors; + +import com.carrotsearch.randomizedtesting.annotations.Name; +import com.carrotsearch.randomizedtesting.annotations.ParametersFactory; +import lombok.SneakyThrows; +import org.json.JSONArray; +import org.json.JSONObject; +import org.junit.Test; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; +import org.opensearch.client.Request; +import org.opensearch.sql.legacy.SQLIntegTestCase; + +// This class has only one test case, because it is parametrized and takes significant time +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) +public class PaginationBlackboxIT extends SQLIntegTestCase { + + private final String index; + private final Integer pageSize; + + public PaginationBlackboxIT(@Name("index") String index, + @Name("pageSize") Integer pageSize) { + this.index = index; + 
this.pageSize = pageSize; + } + + @ParametersFactory(argumentFormatting = "index = %1$s, page_size = %2$d") + public static Iterable<Object[]> indicesAndPageSizes() { + var indices = new PaginationBlackboxHelper().getIndices(); + var pageSizes = List.of(5, 10, 100, 1000); + var testData = new ArrayList<Object[]>(); + for (var index : indices) { + for (var pageSize : pageSizes) { + testData.add(new Object[] { index, pageSize }); + } + } + return testData; + } + + @Test + @SneakyThrows + public void test_pagination_blackbox() { + var response = executeJdbcRequest(String.format("select * from %s", index)); + var indexSize = response.getInt("total"); + var rows = response.getJSONArray("datarows"); + var schema = response.getJSONArray("schema"); + var testReportPrefix = String.format("index: %s, page size: %d || ", index, pageSize); + var rowsPaged = new JSONArray(); + var rowsReturned = 0; + response = new JSONObject(executeFetchQuery( + String.format("select * from %s", index), pageSize, "jdbc")); + var responseCounter = 1; + this.logger.info(testReportPrefix + "first response"); + while (response.has("cursor")) { + assertEquals(indexSize, response.getInt("total")); + assertTrue("Paged response schema doesn't match the non-paged one", + schema.similar(response.getJSONArray("schema"))); + var cursor = response.getString("cursor"); + assertTrue(testReportPrefix + "Cursor returned from legacy engine", + cursor.startsWith("n:")); + rowsReturned += response.getInt("size"); + var datarows = response.getJSONArray("datarows"); + for (int i = 0; i < datarows.length(); i++) { + rowsPaged.put(datarows.get(i)); + } + response = executeCursorQuery(cursor); + this.logger.info(testReportPrefix + + String.format("subsequent response %d/%d", responseCounter++, (indexSize / pageSize) + 1)); + } + assertTrue("Paged response schema doesn't match the non-paged one", + schema.similar(response.getJSONArray("schema"))); + assertEquals(0, response.getInt("total")); + + assertEquals(testReportPrefix + "Last page is not empty", 
+ 0, response.getInt("size")); + assertEquals(testReportPrefix + "Last page is not empty", + 0, response.getJSONArray("datarows").length()); + assertEquals(testReportPrefix + "Paged responses returned a different row count than the non-paged response", + indexSize, rowsReturned); + assertTrue(testReportPrefix + "Accumulated paged rows differ from the non-paged rows", + rows.similar(rowsPaged)); + } + + // A helper class, because `client()` isn't accessible from a static context, + // but it is needed before an instance of `PaginationBlackboxIT` is created. + private static class PaginationBlackboxHelper extends SQLIntegTestCase { + + @SneakyThrows + private List<String> getIndices() { + initClient(); + loadIndex(Index.ACCOUNT); + loadIndex(Index.BEER); + loadIndex(Index.BANK); + if (!isIndexExist(client(), "empty")) { + executeRequest(new Request("PUT", "/empty")); + } + return Arrays.stream(getResponseBody(client().performRequest(new Request("GET", "_cat/indices?h=i")), true).split("\n")) + // exclude this index, because it is too big (almost 10k docs) and makes the test take too long + .map(String::trim).filter(i -> !i.equals(TEST_INDEX_ONLINE)).collect(Collectors.toList()); + } + } +} diff --git a/integ-test/src/test/java/org/opensearch/sql/sql/PaginationFallbackIT.java b/integ-test/src/test/java/org/opensearch/sql/sql/PaginationFallbackIT.java new file mode 100644 index 00000000000..33d9c5f6a89 --- /dev/null +++ b/integ-test/src/test/java/org/opensearch/sql/sql/PaginationFallbackIT.java @@ -0,0 +1,131 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.sql; + +import static org.opensearch.sql.legacy.TestsConstants.TEST_INDEX; +import static org.opensearch.sql.legacy.TestsConstants.TEST_INDEX_ONLINE; +import static org.opensearch.sql.util.TestUtils.verifyIsV1Cursor; +import static org.opensearch.sql.util.TestUtils.verifyIsV2Cursor; + +import java.io.IOException; +import org.json.JSONObject; +import 
org.junit.Test; +import org.opensearch.sql.legacy.SQLIntegTestCase; +import org.opensearch.sql.util.TestUtils; + +public class PaginationFallbackIT extends SQLIntegTestCase { + @Override + public void init() throws IOException { + loadIndex(Index.PHRASE); + loadIndex(Index.ONLINE); + } + + @Test + public void testWhereClause() throws IOException { + var response = executeQueryTemplate("SELECT * FROM %s WHERE 1 = 1", TEST_INDEX_ONLINE); + verifyIsV1Cursor(response); + } + + @Test + public void testSelectAll() throws IOException { + var response = executeQueryTemplate("SELECT * FROM %s", TEST_INDEX_ONLINE); + verifyIsV2Cursor(response); + } + + @Test + public void testSelectWithOpenSearchFuncInFilter() throws IOException { + var response = executeQueryTemplate( + "SELECT * FROM %s WHERE `11` = match_phrase('96')", TEST_INDEX_ONLINE); + verifyIsV1Cursor(response); + } + + @Test + public void testSelectWithHighlight() throws IOException { + var response = executeQueryTemplate( + "SELECT highlight(`11`) FROM %s WHERE match_query(`11`, '96')", TEST_INDEX_ONLINE); + // As of 2023-03-08, WHERE clause sends the query to legacy engine and legacy engine + // does not support highlight as an expression. 
+ assertTrue(response.has("error")); + } + + @Test + public void testSelectWithFullTextSearch() throws IOException { + var response = executeQueryTemplate( + "SELECT * FROM %s WHERE match_phrase(`11`, '96')", TEST_INDEX_ONLINE); + verifyIsV1Cursor(response); + } + + @Test + public void testSelectFromIndexWildcard() throws IOException { + var response = executeQueryTemplate("SELECT * FROM %s*", TEST_INDEX); + verifyIsV2Cursor(response); + } + + @Test + public void testSelectFromDataSource() throws IOException { + var response = executeQueryTemplate("SELECT * FROM @opensearch.%s", + TEST_INDEX_ONLINE); + verifyIsV2Cursor(response); + } + + @Test + public void testSelectColumnReference() throws IOException { + var response = executeQueryTemplate("SELECT `107` from %s", TEST_INDEX_ONLINE); + verifyIsV1Cursor(response); + } + + @Test + public void testSubquery() throws IOException { + var response = executeQueryTemplate("SELECT `107` from (SELECT * FROM %s)", + TEST_INDEX_ONLINE); + verifyIsV1Cursor(response); + } + + @Test + public void testSelectExpression() throws IOException { + var response = executeQueryTemplate("SELECT 1 + 1 - `107` from %s", + TEST_INDEX_ONLINE); + verifyIsV1Cursor(response); + } + + @Test + public void testGroupBy() throws IOException { + // GROUP BY is not paged by either engine. + var response = executeQueryTemplate("SELECT * FROM %s GROUP BY `107`", + TEST_INDEX_ONLINE); + TestUtils.verifyNoCursor(response); + } + + @Test + public void testGroupByHaving() throws IOException { + // GROUP BY is not paged by either engine. 
+ var response = executeQueryTemplate("SELECT * FROM %s GROUP BY `107` HAVING `107` > 400", + TEST_INDEX_ONLINE); + TestUtils.verifyNoCursor(response); + } + + @Test + public void testLimit() throws IOException { + var response = executeQueryTemplate("SELECT * FROM %s LIMIT 8", TEST_INDEX_ONLINE); + verifyIsV1Cursor(response); + } + + @Test + public void testLimitOffset() throws IOException { + var response = executeQueryTemplate("SELECT * FROM %s LIMIT 8 OFFSET 4", + TEST_INDEX_ONLINE); + verifyIsV1Cursor(response); + } + + @Test + public void testOrderBy() throws IOException { + var response = executeQueryTemplate("SELECT * FROM %s ORDER BY `107`", + TEST_INDEX_ONLINE); + verifyIsV1Cursor(response); + } + + +} diff --git a/integ-test/src/test/java/org/opensearch/sql/sql/PaginationIT.java b/integ-test/src/test/java/org/opensearch/sql/sql/PaginationIT.java new file mode 100644 index 00000000000..b9e32cb1cdd --- /dev/null +++ b/integ-test/src/test/java/org/opensearch/sql/sql/PaginationIT.java @@ -0,0 +1,48 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.sql; + +import static org.opensearch.sql.legacy.TestsConstants.TEST_INDEX_CALCS; + import static org.opensearch.sql.legacy.TestsConstants.TEST_INDEX_ONLINE; + +import java.io.IOException; +import org.json.JSONObject; +import org.junit.Test; +import org.opensearch.sql.legacy.SQLIntegTestCase; +import org.opensearch.sql.util.TestUtils; + +public class PaginationIT extends SQLIntegTestCase { + @Override + public void init() throws IOException { + loadIndex(Index.CALCS); + loadIndex(Index.ONLINE); + } + + @Test + public void testSmallDataSet() throws IOException { + var query = "SELECT * from " + TEST_INDEX_CALCS; + var response = new JSONObject(executeFetchQuery(query, 4, "jdbc")); + assertTrue(response.has("cursor")); + assertEquals(4, response.getInt("size")); + TestUtils.verifyIsV2Cursor(response); + } + + @Test + public void testLargeDataSetV1() 
throws IOException { + var v1query = "SELECT * from " + TEST_INDEX_ONLINE + " WHERE 1 = 1"; + var v1response = new JSONObject(executeFetchQuery(v1query, 4, "jdbc")); + assertEquals(4, v1response.getInt("size")); + TestUtils.verifyIsV1Cursor(v1response); + } + + @Test + public void testLargeDataSetV2() throws IOException { + var query = "SELECT * from " + TEST_INDEX_ONLINE; + var response = new JSONObject(executeFetchQuery(query, 4, "jdbc")); + assertEquals(4, response.getInt("size")); + TestUtils.verifyIsV2Cursor(response); + } +} diff --git a/integ-test/src/test/java/org/opensearch/sql/sql/PaginationWindowIT.java b/integ-test/src/test/java/org/opensearch/sql/sql/PaginationWindowIT.java new file mode 100644 index 00000000000..724451ef658 --- /dev/null +++ b/integ-test/src/test/java/org/opensearch/sql/sql/PaginationWindowIT.java @@ -0,0 +1,98 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.sql; + +import static org.opensearch.sql.legacy.TestsConstants.TEST_INDEX_PHRASE; + +import java.io.IOException; +import org.json.JSONObject; +import org.junit.After; +import org.junit.Test; +import org.opensearch.client.ResponseException; +import org.opensearch.sql.legacy.SQLIntegTestCase; + +public class PaginationWindowIT extends SQLIntegTestCase { + @Override + public void init() throws IOException { + loadIndex(Index.PHRASE); + } + + @After + public void resetParams() throws IOException { + resetMaxResultWindow(TEST_INDEX_PHRASE); + resetQuerySizeLimit(); + } + + @Test + public void testFetchSizeLessThanMaxResultWindow() throws IOException { + setMaxResultWindow(TEST_INDEX_PHRASE, 6); + JSONObject response = executeQueryTemplate("SELECT * FROM %s", TEST_INDEX_PHRASE, 5); + + String cursor = ""; + int numRows = 0; + do { + // Process response + cursor = response.getString("cursor"); + numRows += response.getJSONArray("datarows").length(); + response = executeCursorQuery(cursor); + } while 
(response.has("cursor")); + + var countRows = executeJdbcRequest("SELECT COUNT(*) FROM " + TEST_INDEX_PHRASE) + .getJSONArray("datarows") + .getJSONArray(0) + .get(0); + assertEquals(countRows, numRows); + } + + @Test + public void testQuerySizeLimitDoesNotAffectTotalRowsReturned() throws IOException { + int querySizeLimit = 4; + setQuerySizeLimit(querySizeLimit); + JSONObject response = executeQueryTemplate("SELECT * FROM %s", TEST_INDEX_PHRASE, 5); + assertTrue(response.getInt("size") > querySizeLimit); + + String cursor = ""; + int numRows = 0; + do { + // Process response + cursor = response.getString("cursor"); + numRows += response.getJSONArray("datarows").length(); + response = executeCursorQuery(cursor); + } while (response.has("cursor")); + + var countRows = executeJdbcRequest("SELECT COUNT(*) FROM " + TEST_INDEX_PHRASE) + .getJSONArray("datarows") + .getJSONArray(0) + .get(0); + assertEquals(countRows, numRows); + assertTrue(numRows > querySizeLimit); + } + + @Test + public void testQuerySizeLimitDoesNotAffectPageSize() throws IOException { + setQuerySizeLimit(3); + setMaxResultWindow(TEST_INDEX_PHRASE, 4); + var response + = executeQueryTemplate("SELECT * FROM %s", TEST_INDEX_PHRASE, 4); + assertEquals(4, response.getInt("size")); + + var response2 + = executeQueryTemplate("SELECT * FROM %s", TEST_INDEX_PHRASE, 2); + assertEquals(2, response2.getInt("size")); + } + + @Test + public void testFetchSizeLargerThanResultWindowFails() throws IOException { + final int window = 2; + setMaxResultWindow(TEST_INDEX_PHRASE, window); + assertThrows(ResponseException.class, + () -> executeQueryTemplate("SELECT * FROM %s", + TEST_INDEX_PHRASE, window + 1)); + resetMaxResultWindow(TEST_INDEX_PHRASE); + } + + +} diff --git a/integ-test/src/test/java/org/opensearch/sql/sql/StandalonePaginationIT.java b/integ-test/src/test/java/org/opensearch/sql/sql/StandalonePaginationIT.java new file mode 100644 index 00000000000..8f666878216 --- /dev/null +++ 
b/integ-test/src/test/java/org/opensearch/sql/sql/StandalonePaginationIT.java @@ -0,0 +1,168 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.sql; + +import static org.opensearch.sql.datasource.model.DataSourceMetadata.defaultOpenSearchDataSourceMetadata; +import static org.opensearch.sql.ppl.StandaloneIT.getDataSourceMetadataStorage; +import static org.opensearch.sql.ppl.StandaloneIT.getDataSourceUserRoleHelper; + +import com.google.common.collect.ImmutableMap; +import com.google.common.collect.ImmutableSet; +import java.io.IOException; +import java.util.List; +import java.util.Map; +import lombok.Getter; +import lombok.SneakyThrows; +import org.json.JSONObject; +import org.junit.Test; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; +import org.opensearch.client.Request; +import org.opensearch.client.ResponseException; +import org.opensearch.client.RestHighLevelClient; +import org.opensearch.common.inject.Injector; +import org.opensearch.common.inject.ModulesBuilder; +import org.opensearch.sql.common.response.ResponseListener; +import org.opensearch.sql.common.setting.Settings; +import org.opensearch.sql.data.type.ExprCoreType; +import org.opensearch.sql.datasource.DataSourceService; +import org.opensearch.sql.datasource.DataSourceServiceImpl; +import org.opensearch.sql.executor.ExecutionEngine; +import org.opensearch.sql.executor.pagination.PaginatedPlanCache; +import org.opensearch.sql.executor.QueryService; +import org.opensearch.sql.expression.DSL; +import org.opensearch.sql.legacy.SQLIntegTestCase; +import org.opensearch.sql.opensearch.client.OpenSearchClient; +import org.opensearch.sql.opensearch.client.OpenSearchRestClient; +import org.opensearch.sql.executor.pagination.Cursor; +import org.opensearch.sql.opensearch.storage.OpenSearchDataSourceFactory; +import org.opensearch.sql.opensearch.storage.OpenSearchIndex; +import 
org.opensearch.sql.planner.PlanContext; +import org.opensearch.sql.planner.logical.LogicalPaginate; +import org.opensearch.sql.planner.logical.LogicalPlan; +import org.opensearch.sql.planner.logical.LogicalProject; +import org.opensearch.sql.planner.logical.LogicalRelation; +import org.opensearch.sql.planner.physical.PhysicalPlan; +import org.opensearch.sql.storage.DataSourceFactory; +import org.opensearch.sql.util.InternalRestHighLevelClient; +import org.opensearch.sql.util.StandaloneModule; + +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) +public class StandalonePaginationIT extends SQLIntegTestCase { + + private QueryService queryService; + + private PaginatedPlanCache paginatedPlanCache; + + private OpenSearchClient client; + + @Override + @SneakyThrows + public void init() { + RestHighLevelClient restClient = new InternalRestHighLevelClient(client()); + client = new OpenSearchRestClient(restClient); + DataSourceService dataSourceService = new DataSourceServiceImpl( + new ImmutableSet.Builder() + .add(new OpenSearchDataSourceFactory(client, defaultSettings())) + .build(), + getDataSourceMetadataStorage(), + getDataSourceUserRoleHelper() + ); + dataSourceService.createDataSource(defaultOpenSearchDataSourceMetadata()); + + ModulesBuilder modules = new ModulesBuilder(); + modules.add(new StandaloneModule(new InternalRestHighLevelClient(client()), defaultSettings(), dataSourceService)); + Injector injector = modules.createInjector(); + + queryService = injector.getInstance(QueryService.class); + paginatedPlanCache = injector.getInstance(PaginatedPlanCache.class); + } + + @Test + public void test_pagination_whitebox() throws IOException { + class TestResponder + implements ResponseListener { + @Getter + Cursor cursor = Cursor.None; + @Override + public void onResponse(ExecutionEngine.QueryResponse response) { + cursor = response.getCursor(); + } + + @Override + public void onFailure(Exception e) { + fail(); + } + }; + + // arrange + { + 
Request request1 = new Request("PUT", "/test/_doc/1?refresh=true"); + request1.setJsonEntity("{\"name\": \"hello\", \"age\": 20}"); + client().performRequest(request1); + Request request2 = new Request("PUT", "/test/_doc/2?refresh=true"); + request2.setJsonEntity("{\"name\": \"world\", \"age\": 30}"); + client().performRequest(request2); + } + + // act 1, asserts in firstResponder + var t = new OpenSearchIndex(client, defaultSettings(), "test"); + LogicalPlan p = new LogicalPaginate(1, List.of( + new LogicalProject( + new LogicalRelation("test", t), List.of( + DSL.named("name", DSL.ref("name", ExprCoreType.STRING)), + DSL.named("age", DSL.ref("age", ExprCoreType.LONG))), + List.of() + ))); + var firstResponder = new TestResponder(); + queryService.executePlan(p, PlanContext.emptyPlanContext(), firstResponder); + + // act 2, asserts in secondResponder + + PhysicalPlan plan = paginatedPlanCache.convertToPlan(firstResponder.getCursor().toString()); + var secondResponder = new TestResponder(); + queryService.executePlan(plan, secondResponder); + + // act 3: confirm that there's no cursor. 
+ } + + @Test + @SneakyThrows + public void test_explain_not_supported() { + var request = new Request("POST", "_plugins/_sql/_explain"); + // Request should be rejected before index names are resolved + request.setJsonEntity("{ \"query\": \"select * from something\", \"fetch_size\": 10 }"); + var exception = assertThrows(ResponseException.class, () -> client().performRequest(request)); + var response = new JSONObject(new String(exception.getResponse().getEntity().getContent().readAllBytes())); + assertEquals("`explain` feature for paginated requests is not implemented yet.", + response.getJSONObject("error").getString("details")); + + // Request should be rejected before cursor parsed + request.setJsonEntity("{ \"cursor\" : \"n:0000\" }"); + exception = assertThrows(ResponseException.class, () -> client().performRequest(request)); + response = new JSONObject(new String(exception.getResponse().getEntity().getContent().readAllBytes())); + assertEquals("Explain of a paged query continuation is not supported. 
Use `explain` for the initial query request.", + response.getJSONObject("error").getString("details")); + } + + private Settings defaultSettings() { + return new Settings() { + private final Map defaultSettings = new ImmutableMap.Builder() + .put(Key.QUERY_SIZE_LIMIT, 200) + .build(); + + @Override + public T getSettingValue(Key key) { + return (T) defaultSettings.get(key); + } + + @Override + public List getSettings() { + return (List) defaultSettings; + } + }; + } +} diff --git a/integ-test/src/test/java/org/opensearch/sql/util/InternalRestHighLevelClient.java b/integ-test/src/test/java/org/opensearch/sql/util/InternalRestHighLevelClient.java new file mode 100644 index 00000000000..57726089ae7 --- /dev/null +++ b/integ-test/src/test/java/org/opensearch/sql/util/InternalRestHighLevelClient.java @@ -0,0 +1,19 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.util; + +import java.util.Collections; +import org.opensearch.client.RestClient; +import org.opensearch.client.RestHighLevelClient; + +/** + * Internal RestHighLevelClient only for testing purpose. 
+ */ +public class InternalRestHighLevelClient extends RestHighLevelClient { + public InternalRestHighLevelClient(RestClient restClient) { + super(restClient, RestClient::close, Collections.emptyList()); + } +} diff --git a/integ-test/src/test/java/org/opensearch/sql/util/StandaloneModule.java b/integ-test/src/test/java/org/opensearch/sql/util/StandaloneModule.java new file mode 100644 index 00000000000..c7515b461f8 --- /dev/null +++ b/integ-test/src/test/java/org/opensearch/sql/util/StandaloneModule.java @@ -0,0 +1,123 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.util; + +import lombok.RequiredArgsConstructor; +import org.opensearch.client.RestHighLevelClient; +import org.opensearch.common.inject.AbstractModule; +import org.opensearch.common.inject.Provides; +import org.opensearch.common.inject.Singleton; +import org.opensearch.sql.analysis.Analyzer; +import org.opensearch.sql.analysis.ExpressionAnalyzer; +import org.opensearch.sql.common.setting.Settings; +import org.opensearch.sql.datasource.DataSourceService; +import org.opensearch.sql.executor.ExecutionEngine; +import org.opensearch.sql.executor.pagination.PaginatedPlanCache; +import org.opensearch.sql.executor.QueryManager; +import org.opensearch.sql.executor.QueryService; +import org.opensearch.sql.executor.execution.QueryPlanFactory; +import org.opensearch.sql.expression.function.BuiltinFunctionRepository; +import org.opensearch.sql.monitor.AlwaysHealthyMonitor; +import org.opensearch.sql.monitor.ResourceMonitor; +import org.opensearch.sql.opensearch.client.OpenSearchClient; +import org.opensearch.sql.opensearch.client.OpenSearchRestClient; +import org.opensearch.sql.opensearch.executor.OpenSearchExecutionEngine; +import org.opensearch.sql.opensearch.executor.protector.ExecutionProtector; +import org.opensearch.sql.opensearch.executor.protector.OpenSearchExecutionProtector; +import 
org.opensearch.sql.opensearch.storage.OpenSearchStorageEngine; +import org.opensearch.sql.planner.Planner; +import org.opensearch.sql.planner.optimizer.LogicalPlanOptimizer; +import org.opensearch.sql.ppl.PPLService; +import org.opensearch.sql.ppl.antlr.PPLSyntaxParser; +import org.opensearch.sql.sql.SQLService; +import org.opensearch.sql.sql.antlr.SQLSyntaxParser; +import org.opensearch.sql.storage.StorageEngine; + +/** + * A utility class which registers SQL engine singletons the same way `OpenSearchPluginModule` does. + * It is needed to get access to those instances in tests and validate their behavior. + */ +@RequiredArgsConstructor +public class StandaloneModule extends AbstractModule { + + private final RestHighLevelClient client; + + private final Settings settings; + + private final DataSourceService dataSourceService; + + private final BuiltinFunctionRepository functionRepository = + BuiltinFunctionRepository.getInstance(); + + @Override + protected void configure() { + } + + @Provides + public OpenSearchClient openSearchClient() { + return new OpenSearchRestClient(client); + } + + @Provides + public StorageEngine storageEngine(OpenSearchClient client) { + return new OpenSearchStorageEngine(client, settings); + } + + @Provides + public ExecutionEngine executionEngine(OpenSearchClient client, ExecutionProtector protector, + PaginatedPlanCache paginatedPlanCache) { + return new OpenSearchExecutionEngine(client, protector, paginatedPlanCache); + } + + @Provides + public ResourceMonitor resourceMonitor() { + return new AlwaysHealthyMonitor(); + } + + @Provides + public ExecutionProtector protector(ResourceMonitor resourceMonitor) { + return new OpenSearchExecutionProtector(resourceMonitor); + } + + @Provides + @Singleton + public QueryManager queryManager() { + return new ExecuteOnCallerThreadQueryManager(); + } + + @Provides + public PPLService pplService(QueryManager queryManager, QueryPlanFactory queryPlanFactory) { + return new PPLService(new PPLSyntaxParser(),
queryManager, queryPlanFactory); + } + + @Provides + public SQLService sqlService(QueryManager queryManager, QueryPlanFactory queryPlanFactory) { + return new SQLService(new SQLSyntaxParser(), queryManager, queryPlanFactory); + } + + @Provides + public PaginatedPlanCache paginatedPlanCache(StorageEngine storageEngine) { + return new PaginatedPlanCache(storageEngine); + } + + @Provides + public QueryPlanFactory queryPlanFactory(ExecutionEngine executionEngine, + PaginatedPlanCache paginatedPlanCache, + QueryService qs) { + + return new QueryPlanFactory(qs, paginatedPlanCache); + } + + @Provides + public QueryService queryService(ExecutionEngine executionEngine) { + Analyzer analyzer = + new Analyzer( + new ExpressionAnalyzer(functionRepository), dataSourceService, functionRepository); + Planner planner = new Planner(LogicalPlanOptimizer.create()); + Planner paginationPlanner = new Planner(LogicalPlanOptimizer.paginationCreate()); + return new QueryService(analyzer, executionEngine, planner, paginationPlanner); + } +} diff --git a/integ-test/src/test/java/org/opensearch/sql/util/TestUtils.java b/integ-test/src/test/java/org/opensearch/sql/util/TestUtils.java index bd75ead43b6..80ce24ecacf 100644 --- a/integ-test/src/test/java/org/opensearch/sql/util/TestUtils.java +++ b/integ-test/src/test/java/org/opensearch/sql/util/TestUtils.java @@ -7,6 +7,8 @@ package org.opensearch.sql.util; import static com.google.common.base.Strings.isNullOrEmpty; +import static org.junit.Assert.assertTrue; +import static org.opensearch.sql.executor.pagination.PaginatedPlanCache.CURSOR_PREFIX; import java.io.BufferedReader; import java.io.File; @@ -20,22 +22,21 @@ import java.nio.file.Path; import java.nio.file.Paths; import java.util.ArrayList; +import java.util.Arrays; import java.util.LinkedList; import java.util.List; import java.util.Locale; import java.util.stream.Collectors; import org.json.JSONObject; -import org.junit.Assert; import org.opensearch.action.bulk.BulkRequest; import 
org.opensearch.action.bulk.BulkResponse; import org.opensearch.action.index.IndexRequest; import org.opensearch.client.Client; import org.opensearch.client.Request; -import org.opensearch.client.RequestOptions; import org.opensearch.client.Response; import org.opensearch.client.RestClient; import org.opensearch.common.xcontent.XContentType; -import org.opensearch.rest.RestStatus; +import org.opensearch.sql.legacy.cursor.CursorType; public class TestUtils { @@ -839,4 +840,28 @@ public static <T> List<List<T>> getPermutations(final List<T> items) { return result; } + + public static void verifyIsV1Cursor(JSONObject response) { + var legacyCursorPrefixes = Arrays.stream(CursorType.values()) + .map(c -> c.getId() + ":").collect(Collectors.toList()); + verifyCursor(response, legacyCursorPrefixes, "v1"); + } + + + public static void verifyIsV2Cursor(JSONObject response) { + verifyCursor(response, List.of(CURSOR_PREFIX), "v2"); + } + + private static void verifyCursor(JSONObject response, List<String> validCursorPrefix, String engineName) { + assertTrue("'cursor' property does not exist", response.has("cursor")); + + var cursor = response.getString("cursor"); + assertTrue("'cursor' property is empty", !cursor.isEmpty()); + assertTrue("The cursor '" + cursor + "' is not from " + engineName + " engine.", + validCursorPrefix.stream().anyMatch(cursor::startsWith)); + } + + public static void verifyNoCursor(JSONObject response) { + assertTrue(!response.has("cursor")); + } } diff --git a/legacy/src/main/java/org/opensearch/sql/legacy/plugin/RestSQLQueryAction.java b/legacy/src/main/java/org/opensearch/sql/legacy/plugin/RestSQLQueryAction.java index bc97f71b476..cbbc8c7b9cb 100644 --- a/legacy/src/main/java/org/opensearch/sql/legacy/plugin/RestSQLQueryAction.java +++ b/legacy/src/main/java/org/opensearch/sql/legacy/plugin/RestSQLQueryAction.java @@ -24,6 +24,7 @@ import org.opensearch.sql.common.antlr.SyntaxCheckException; import org.opensearch.sql.common.response.ResponseListener; import
org.opensearch.sql.common.utils.QueryContext; +import org.opensearch.sql.exception.UnsupportedCursorRequestException; import org.opensearch.sql.executor.ExecutionEngine.ExplainResponse; import org.opensearch.sql.legacy.metrics.MetricName; import org.opensearch.sql.legacy.metrics.Metrics; @@ -119,14 +120,14 @@ private ResponseListener fallBackListener( return new ResponseListener() { @Override public void onResponse(T response) { - LOG.error("[{}] Request is handled by new SQL query engine", + LOG.info("[{}] Request is handled by new SQL query engine", QueryContext.getRequestId()); next.onResponse(response); } @Override public void onFailure(Exception e) { - if (e instanceof SyntaxCheckException) { + if (e instanceof SyntaxCheckException || e instanceof UnsupportedCursorRequestException) { fallBackHandler.accept(channel, e); } else { next.onFailure(e); @@ -172,7 +173,8 @@ private ResponseListener createQueryResponseListener( @Override public void onResponse(QueryResponse response) { sendResponse(channel, OK, - formatter.format(new QueryResult(response.getSchema(), response.getResults()))); + formatter.format(new QueryResult(response.getSchema(), response.getResults(), + response.getCursor(), response.getTotal()))); } @Override diff --git a/legacy/src/main/java/org/opensearch/sql/legacy/plugin/RestSqlAction.java b/legacy/src/main/java/org/opensearch/sql/legacy/plugin/RestSqlAction.java index 88ed42010bd..e1c72f0f1ef 100644 --- a/legacy/src/main/java/org/opensearch/sql/legacy/plugin/RestSqlAction.java +++ b/legacy/src/main/java/org/opensearch/sql/legacy/plugin/RestSqlAction.java @@ -42,6 +42,7 @@ import org.opensearch.sql.legacy.antlr.SqlAnalysisConfig; import org.opensearch.sql.legacy.antlr.SqlAnalysisException; import org.opensearch.sql.legacy.antlr.semantic.types.Type; +import org.opensearch.sql.legacy.cursor.CursorType; import org.opensearch.sql.legacy.domain.ColumnTypeProvider; import org.opensearch.sql.legacy.domain.QueryActionRequest; import 
org.opensearch.sql.legacy.esdomain.LocalClusterState; @@ -132,7 +133,7 @@ protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient cli } final SqlRequest sqlRequest = SqlRequestFactory.getSqlRequest(request); - if (sqlRequest.cursor() != null) { + if (isLegacyCursor(sqlRequest)) { if (isExplainRequest(request)) { throw new IllegalArgumentException("Invalid request. Cannot explain cursor"); } else { @@ -148,14 +149,14 @@ protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient cli // Route request to new query engine if it's supported already SQLQueryRequest newSqlRequest = new SQLQueryRequest(sqlRequest.getJsonContent(), - sqlRequest.getSql(), request.path(), request.params()); + sqlRequest.getSql(), request.path(), request.params(), sqlRequest.cursor()); return newSqlQueryHandler.prepareRequest(newSqlRequest, (restChannel, exception) -> { try{ if (newSqlRequest.isExplainRequest()) { LOG.info("Request is falling back to old SQL engine due to: " + exception.getMessage()); } - LOG.debug("[{}] Request {} is not supported and falling back to old SQL engine", + LOG.info("[{}] Request {} is not supported and falling back to old SQL engine", QueryContext.getRequestId(), newSqlRequest); QueryAction queryAction = explainRequest(client, sqlRequest, format); executeSqlRequest(request, queryAction, client, restChannel); @@ -175,6 +176,17 @@ protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient cli } } + + /** + * @param sqlRequest client request + * @return true if this cursor was generated by the legacy engine, false otherwise. 
+ */ + private static boolean isLegacyCursor(SqlRequest sqlRequest) { + String cursor = sqlRequest.cursor(); + return cursor != null + && CursorType.getById(cursor.substring(0, 1)) != CursorType.NULL; + } + @Override protected Set responseParams() { Set responseParams = new HashSet<>(super.responseParams()); diff --git a/legacy/src/test/java/org/opensearch/sql/legacy/plugin/RestSQLQueryActionCursorFallbackTest.java b/legacy/src/test/java/org/opensearch/sql/legacy/plugin/RestSQLQueryActionCursorFallbackTest.java new file mode 100644 index 00000000000..a11f4c47d7a --- /dev/null +++ b/legacy/src/test/java/org/opensearch/sql/legacy/plugin/RestSQLQueryActionCursorFallbackTest.java @@ -0,0 +1,127 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.legacy.plugin; + +import static org.junit.Assert.assertFalse; +import static org.junit.Assert.assertTrue; +import static org.junit.Assert.fail; +import static org.opensearch.sql.legacy.plugin.RestSqlAction.QUERY_API_ENDPOINT; + +import java.io.IOException; +import java.util.Map; +import java.util.Optional; +import java.util.concurrent.atomic.AtomicBoolean; +import org.json.JSONObject; +import org.junit.Before; +import org.junit.Test; +import org.junit.runner.RunWith; +import org.mockito.Mock; +import org.mockito.Mockito; +import org.mockito.junit.MockitoJUnitRunner; +import org.opensearch.client.node.NodeClient; +import org.opensearch.common.Strings; +import org.opensearch.common.inject.Injector; +import org.opensearch.common.inject.ModulesBuilder; +import org.opensearch.common.util.concurrent.ThreadContext; +import org.opensearch.common.xcontent.XContentFactory; +import org.opensearch.rest.BaseRestHandler; +import org.opensearch.rest.RestChannel; +import org.opensearch.rest.RestRequest; +import org.opensearch.sql.common.antlr.SyntaxCheckException; +import org.opensearch.sql.executor.QueryManager; +import org.opensearch.sql.executor.execution.QueryPlanFactory; 
+import org.opensearch.sql.sql.SQLService; +import org.opensearch.sql.sql.antlr.SQLSyntaxParser; +import org.opensearch.sql.sql.domain.SQLQueryRequest; +import org.opensearch.threadpool.ThreadPool; + +/** + * A test suite that verifies fallback behaviour of cursor queries. + */ +@RunWith(MockitoJUnitRunner.class) +public class RestSQLQueryActionCursorFallbackTest extends BaseRestHandler { + + private NodeClient nodeClient; + + @Mock + private ThreadPool threadPool; + + @Mock + private QueryManager queryManager; + + @Mock + private QueryPlanFactory factory; + + @Mock + private RestChannel restChannel; + + private Injector injector; + + @Before + public void setup() { + nodeClient = new NodeClient(org.opensearch.common.settings.Settings.EMPTY, threadPool); + ModulesBuilder modules = new ModulesBuilder(); + modules.add(b -> { + b.bind(SQLService.class).toInstance(new SQLService(new SQLSyntaxParser(), queryManager, factory)); + }); + injector = modules.createInjector(); + Mockito.lenient().when(threadPool.getThreadContext()) + .thenReturn(new ThreadContext(org.opensearch.common.settings.Settings.EMPTY)); + } + + // Initial page request test cases + + @Test + public void no_fallback_with_column_reference() throws Exception { + String query = "SELECT name FROM test1"; + SQLQueryRequest request = createSqlQueryRequest(query, Optional.empty(), + Optional.of(5)); + + assertFalse(doesQueryFallback(request)); + } + + private static SQLQueryRequest createSqlQueryRequest(String query, Optional cursorId, + Optional fetchSize) throws IOException { + var builder = XContentFactory.jsonBuilder() + .startObject() + .field("query").value(query); + if (cursorId.isPresent()) { + builder.field("cursor").value(cursorId.get()); + } + + if (fetchSize.isPresent()) { + builder.field("fetch_size").value(fetchSize.get()); + } + builder.endObject(); + JSONObject jsonContent = new JSONObject(Strings.toString(builder)); + + return new SQLQueryRequest(jsonContent, query, QUERY_API_ENDPOINT, + 
Map.of("format", "jdbc"), cursorId.orElse("")); + } + + boolean doesQueryFallback(SQLQueryRequest request) throws Exception { + AtomicBoolean fallback = new AtomicBoolean(false); + RestSQLQueryAction queryAction = new RestSQLQueryAction(injector); + queryAction.prepareRequest(request, (channel, exception) -> { + fallback.set(true); + }, (channel, exception) -> { + }).accept(restChannel); + return fallback.get(); + } + + @Override + public String getName() { + // do nothing; RestChannelConsumer is protected, which requires extending BaseRestHandler + return null; + } + + @Override + protected BaseRestHandler.RestChannelConsumer prepareRequest(RestRequest restRequest, NodeClient nodeClient) + { + // do nothing; RestChannelConsumer is protected, which requires extending BaseRestHandler + return null; + } +} diff --git a/legacy/src/test/java/org/opensearch/sql/legacy/plugin/RestSQLQueryActionTest.java b/legacy/src/test/java/org/opensearch/sql/legacy/plugin/RestSQLQueryActionTest.java index 1bc34edf50a..be572f3dfbe 100644 --- a/legacy/src/test/java/org/opensearch/sql/legacy/plugin/RestSQLQueryActionTest.java +++ b/legacy/src/test/java/org/opensearch/sql/legacy/plugin/RestSQLQueryActionTest.java @@ -74,7 +74,7 @@ public void handleQueryThatCanSupport() throws Exception { new JSONObject("{\"query\": \"SELECT -123\"}"), "SELECT -123", QUERY_API_ENDPOINT, - ""); + "jdbc"); RestSQLQueryAction queryAction = new RestSQLQueryAction(injector); queryAction.prepareRequest(request, (channel, exception) -> { @@ -90,7 +90,7 @@ public void handleExplainThatCanSupport() throws Exception { new JSONObject("{\"query\": \"SELECT -123\"}"), "SELECT -123", EXPLAIN_API_ENDPOINT, - ""); + "jdbc"); RestSQLQueryAction queryAction = new RestSQLQueryAction(injector); queryAction.prepareRequest(request, (channel, exception) -> { @@ -107,7 +107,7 @@ public void queryThatNotSupportIsHandledByFallbackHandler() throws Exception { "{\"query\": \"SELECT name FROM test1 JOIN test2 ON test1.name =
test2.name\"}"), "SELECT name FROM test1 JOIN test2 ON test1.name = test2.name", QUERY_API_ENDPOINT, - ""); + "jdbc"); AtomicBoolean fallback = new AtomicBoolean(false); RestSQLQueryAction queryAction = new RestSQLQueryAction(injector); @@ -128,7 +128,7 @@ public void queryExecutionFailedIsHandledByExecutionErrorHandler() throws Except "{\"query\": \"SELECT -123\"}"), "SELECT -123", QUERY_API_ENDPOINT, - ""); + "jdbc"); doThrow(new IllegalStateException("execution exception")) .when(queryManager) diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/client/OpenSearchNodeClient.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/client/OpenSearchNodeClient.java index 8818c394a17..e4f25dabbdc 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/client/OpenSearchNodeClient.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/client/OpenSearchNodeClient.java @@ -43,7 +43,7 @@ public class OpenSearchNodeClient implements OpenSearchClient { private final NodeClient client; /** - * Constructor of ElasticsearchNodeClient. + * Constructor of OpenSearchNodeClient. 
*/ public OpenSearchNodeClient(NodeClient client) { this.client = client; @@ -172,7 +172,14 @@ public Map meta() { @Override public void cleanup(OpenSearchRequest request) { - request.clean(scrollId -> client.prepareClearScroll().addScrollId(scrollId).get()); + request.clean(scrollId -> { + try { + client.prepareClearScroll().addScrollId(scrollId).get(); + } catch (Exception e) { + throw new IllegalStateException( + "Failed to clean up resources for search request " + request, e); + } + }); } @Override diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/client/OpenSearchRestClient.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/client/OpenSearchRestClient.java index d9f9dbbe5d5..757ea99c1b0 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/client/OpenSearchRestClient.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/client/OpenSearchRestClient.java @@ -184,7 +184,6 @@ public void cleanup(OpenSearchRequest request) { "Failed to clean up resources for search request " + request, e); } }); - } @Override diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngine.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngine.java index 9a136a3bec9..103e15e6cdf 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngine.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngine.java @@ -15,6 +15,7 @@ import org.opensearch.sql.executor.ExecutionContext; import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.executor.Explain; +import org.opensearch.sql.executor.pagination.PaginatedPlanCache; import org.opensearch.sql.opensearch.client.OpenSearchClient; import org.opensearch.sql.opensearch.executor.protector.ExecutionProtector; import org.opensearch.sql.planner.physical.PhysicalPlan; @@ -27,6 +28,7 @@ public class 
OpenSearchExecutionEngine implements ExecutionEngine { private final OpenSearchClient client; private final ExecutionProtector executionProtector; + private final PaginatedPlanCache paginatedPlanCache; @Override public void execute(PhysicalPlan physicalPlan, ResponseListener listener) { @@ -49,7 +51,8 @@ public void execute(PhysicalPlan physicalPlan, ExecutionContext context, result.add(plan.next()); } - QueryResponse response = new QueryResponse(physicalPlan.schema(), result); + QueryResponse response = new QueryResponse(physicalPlan.schema(), result, + plan.getTotalHits(), paginatedPlanCache.convertToCursor(plan)); listener.onResponse(response); } catch (Exception e) { listener.onFailure(e); diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtector.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtector.java index f06ecb85768..4d6925f1aaf 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtector.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtector.java @@ -16,6 +16,7 @@ import org.opensearch.sql.planner.physical.EvalOperator; import org.opensearch.sql.planner.physical.FilterOperator; import org.opensearch.sql.planner.physical.LimitOperator; +import org.opensearch.sql.planner.physical.PaginateOperator; import org.opensearch.sql.planner.physical.PhysicalPlan; import org.opensearch.sql.planner.physical.ProjectOperator; import org.opensearch.sql.planner.physical.RareTopNOperator; @@ -63,6 +64,12 @@ public PhysicalPlan visitRename(RenameOperator node, Object context) { return new RenameOperator(visitInput(node.getInput(), context), node.getMapping()); } + @Override + public PhysicalPlan visitPaginate(PaginateOperator node, Object context) { + return new PaginateOperator(visitInput(node.getInput(), context), node.getPageSize(), + 
node.getPageIndex()); + } + /** * Decorate with {@link ResourceMonitorPlan}. */ diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/ResourceMonitorPlan.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/ResourceMonitorPlan.java index 9c59e4acaf8..3d880d82b9f 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/ResourceMonitorPlan.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/ResourceMonitorPlan.java @@ -82,4 +82,14 @@ public ExprValue next() { } return delegate.next(); } + + @Override + public long getTotalHits() { + return delegate.getTotalHits(); + } + + @Override + public String toCursor() { + return delegate.toCursor(); + } } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequest.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequest.java new file mode 100644 index 00000000000..6c81b9aca24 --- /dev/null +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequest.java @@ -0,0 +1,80 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.opensearch.request; + +import static org.opensearch.sql.opensearch.request.OpenSearchScrollRequest.DEFAULT_SCROLL_TIMEOUT; + +import java.util.function.Consumer; +import java.util.function.Function; +import lombok.EqualsAndHashCode; +import lombok.Getter; +import lombok.ToString; +import org.opensearch.action.search.SearchRequest; +import org.opensearch.action.search.SearchResponse; +import org.opensearch.action.search.SearchScrollRequest; +import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; +import org.opensearch.sql.opensearch.response.OpenSearchResponse; + +/** + * Scroll (cursor) request is used to page the search. 
This request is not configurable and has + * no search query. It just handles paging through responses to the initial request. + * It is used for the second and subsequent pagination (cursor) requests. + * The first (initial) request is handled by {@link InitialPageRequestBuilder}. + */ +@EqualsAndHashCode +public class ContinuePageRequest implements OpenSearchRequest { + final String initialScrollId; + + // ScrollId that OpenSearch returns after search. + String responseScrollId; + + @EqualsAndHashCode.Exclude + @ToString.Exclude + @Getter + private final OpenSearchExprValueFactory exprValueFactory; + + @EqualsAndHashCode.Exclude + private boolean scrollFinished = false; + + public ContinuePageRequest(String scrollId, OpenSearchExprValueFactory exprValueFactory) { + this.initialScrollId = scrollId; + this.exprValueFactory = exprValueFactory; + } + + @Override + public OpenSearchResponse search(Function<SearchRequest, SearchResponse> searchAction, + Function<SearchScrollRequest, SearchResponse> scrollAction) { + SearchResponse openSearchResponse = scrollAction.apply(new SearchScrollRequest(initialScrollId) + .scroll(DEFAULT_SCROLL_TIMEOUT)); + + // TODO if terminated_early - something went wrong, e.g. no scroll returned. + var response = new OpenSearchResponse(openSearchResponse, exprValueFactory); + // on the last empty page, we should close the scroll + scrollFinished = response.isEmpty(); + responseScrollId = openSearchResponse.getScrollId(); + return response; + } + + @Override + public void clean(Consumer<String> cleanAction) { + if (scrollFinished) { + cleanAction.accept(responseScrollId); + } + } + + @Override + public SearchSourceBuilder getSourceBuilder() { + throw new UnsupportedOperationException( + "SearchSourceBuilder is unavailable for ContinuePageRequest"); + } + + @Override + public String toCursor() { + // on the last page, we shouldn't return the scroll to user, it is kept for closing (clean) + return scrollFinished ?
null : responseScrollId; + } +} diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java new file mode 100644 index 00000000000..78288c12423 --- /dev/null +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java @@ -0,0 +1,28 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.opensearch.request; + +import lombok.Getter; +import lombok.RequiredArgsConstructor; +import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; + +/** + * Builds a {@link ContinuePageRequest} to handle subsequent pagination/scroll/cursor requests. + * The initial search request is handled by {@link InitialPageRequestBuilder}. + */ +@RequiredArgsConstructor +public class ContinuePageRequestBuilder extends PagedRequestBuilder { + + @Getter + private final OpenSearchRequest.IndexName indexName; + private final String scrollId; + private final OpenSearchExprValueFactory exprValueFactory; + + @Override + public OpenSearchRequest build() { + return new ContinuePageRequest(scrollId, exprValueFactory); + } +} diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java new file mode 100644 index 00000000000..dee009ee974 --- /dev/null +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java @@ -0,0 +1,67 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.opensearch.request; + +import static org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder.DEFAULT_QUERY_TIMEOUT; + +import java.util.Map; +import java.util.Set; +import lombok.Getter; +import
org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.sql.data.type.ExprType; +import org.opensearch.sql.expression.ReferenceExpression; +import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; +import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; + +/** + * This builder assists in creating the initial OpenSearch paging (scrolling) request. + * It is used only for the first page (pagination request). + * Subsequent requests (cursor requests) use {@link ContinuePageRequestBuilder}. + */ +public class InitialPageRequestBuilder extends PagedRequestBuilder { + + @Getter + private final OpenSearchRequest.IndexName indexName; + private final SearchSourceBuilder sourceBuilder; + private final OpenSearchExprValueFactory exprValueFactory; + + /** + * Constructor. + * + * @param indexName index being scanned + * @param pageSize number of rows to return per page + * @param exprValueFactory value factory + */ + // TODO accept indexName as string (same way as `OpenSearchRequestBuilder` does)? + public InitialPageRequestBuilder(OpenSearchRequest.IndexName indexName, + int pageSize, + OpenSearchExprValueFactory exprValueFactory) { + this.indexName = indexName; + this.exprValueFactory = exprValueFactory; + this.sourceBuilder = new SearchSourceBuilder() + .from(0) + .size(pageSize) + .timeout(DEFAULT_QUERY_TIMEOUT); + } + + @Override + public OpenSearchScrollRequest build() { + return new OpenSearchScrollRequest(indexName, sourceBuilder, exprValueFactory); + } + + /** + * Push down project expressions to OpenSearch.
+ */ +  @Override +  public void pushDownProjects(Set<ReferenceExpression> projects) { + sourceBuilder.fetchSource(projects.stream().map(ReferenceExpression::getAttr) + .distinct().toArray(String[]::new), new String[0]); + } + + @Override + public void pushTypeMapping(Map<String, OpenSearchDataType> typeMapping) { + exprValueFactory.extendTypeMapping(typeMapping); + } +} diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchQueryRequest.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchQueryRequest.java index 6f6fea841b6..0795ce7cdc7 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchQueryRequest.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchQueryRequest.java @@ -6,6 +6,8 @@ package org.opensearch.sql.opensearch.request; +import static org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder.DEFAULT_QUERY_TIMEOUT; + import com.google.common.annotations.VisibleForTesting; import java.util.function.Consumer; import java.util.function.Function; @@ -15,7 +17,6 @@ import org.opensearch.action.search.SearchRequest; import org.opensearch.action.search.SearchResponse; import org.opensearch.action.search.SearchScrollRequest; -import org.opensearch.common.unit.TimeValue; import org.opensearch.search.SearchHits; import org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; @@ -32,11 +33,6 @@ @ToString public class OpenSearchQueryRequest implements OpenSearchRequest { - /** - * Default query timeout in minutes. - */ - public static final TimeValue DEFAULT_QUERY_TIMEOUT = TimeValue.timeValueMinutes(1L); - /** * {@link OpenSearchRequest.IndexName}.
*/ diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchRequest.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchRequest.java index ce990780c1c..c5b6d60af36 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchRequest.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchRequest.java @@ -50,9 +50,13 @@ OpenSearchResponse search(Function searchAction, */ OpenSearchExprValueFactory getExprValueFactory(); + default String toCursor() { + return ""; + } + /** * OpenSearch Index Name. - * Indices are seperated by ",". + * Indices are separated by ",". */ @EqualsAndHashCode class IndexName { diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilder.java index 97aeee37472..531710d5458 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilder.java @@ -9,7 +9,6 @@ import static org.opensearch.search.sort.FieldSortBuilder.DOC_FIELD_NAME; import static org.opensearch.search.sort.SortOrder.ASC; -import com.google.common.collect.Lists; import java.util.Arrays; import java.util.List; import java.util.Map; @@ -26,7 +25,6 @@ import org.opensearch.search.aggregations.AggregationBuilder; import org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.search.fetch.subphase.highlight.HighlightBuilder; -import org.opensearch.search.sort.FieldSortBuilder; import org.opensearch.search.sort.SortBuilder; import org.opensearch.search.sort.SortBuilders; import org.opensearch.sql.ast.expression.Literal; @@ -41,10 +39,10 @@ /** * OpenSearch search request builder. 
*/ -@EqualsAndHashCode +@EqualsAndHashCode(callSuper = false) @Getter @ToString -public class OpenSearchRequestBuilder { +public class OpenSearchRequestBuilder implements PushDownRequestBuilder { /** * Default query timeout in minutes. @@ -74,15 +72,16 @@ public class OpenSearchRequestBuilder { private final OpenSearchExprValueFactory exprValueFactory; /** - * Query size of the request. + * Query size of the request -- how many rows will be returned. */ - private Integer querySize; + private int querySize; public OpenSearchRequestBuilder(String indexName, Integer maxResultWindow, Settings settings, OpenSearchExprValueFactory exprValueFactory) { - this(new OpenSearchRequest.IndexName(indexName), maxResultWindow, settings, exprValueFactory); + this(new OpenSearchRequest.IndexName(indexName), maxResultWindow, settings, + exprValueFactory); } /** @@ -111,11 +110,11 @@ public OpenSearchRequest build() { Integer from = sourceBuilder.from(); Integer size = sourceBuilder.size(); - if (from + size <= maxResultWindow) { - return new OpenSearchQueryRequest(indexName, sourceBuilder, exprValueFactory); - } else { + if (from + size > maxResultWindow) { sourceBuilder.size(maxResultWindow - from); return new OpenSearchScrollRequest(indexName, sourceBuilder, exprValueFactory); + } else { + return new OpenSearchQueryRequest(indexName, sourceBuilder, exprValueFactory); } } @@ -124,7 +123,8 @@ public OpenSearchRequest build() { * * @param query query request */ - public void pushDown(QueryBuilder query) { + @Override + public void pushDownFilter(QueryBuilder query) { QueryBuilder current = sourceBuilder.query(); if (current == null) { @@ -149,6 +149,7 @@ public void pushDown(QueryBuilder query) { * * @param aggregationBuilder pair of aggregation query and aggregation parser. 
*/ + @Override public void pushDownAggregation( Pair, OpenSearchAggregationResponseParser> aggregationBuilder) { aggregationBuilder.getLeft().forEach(builder -> sourceBuilder.aggregation(builder)); @@ -161,6 +162,7 @@ public void pushDownAggregation( * * @param sortBuilders sortBuilders. */ + @Override public void pushDownSort(List> sortBuilders) { // TODO: Sort by _doc is added when filter push down. Remove both logic once doctest fixed. if (isSortByDocOnly()) { @@ -175,6 +177,7 @@ public void pushDownSort(List> sortBuilders) { /** * Push down size (limit) and from (offset) to DSL request. */ + @Override public void pushDownLimit(Integer limit, Integer offset) { querySize = limit; sourceBuilder.from(offset).size(limit); @@ -184,6 +187,7 @@ public void pushDownLimit(Integer limit, Integer offset) { * Add highlight to DSL requests. * @param field name of the field to highlight */ + @Override public void pushDownHighlight(String field, Map arguments) { String unquotedField = StringUtils.unquoteText(field); if (sourceBuilder.highlighter() != null) { @@ -214,22 +218,20 @@ public void pushDownHighlight(String field, Map arguments) { } /** - * Push down project list to DSL requets. + * Push down project list to DSL requests. 
*/ + @Override public void pushDownProjects(Set projects) { final Set projectsSet = projects.stream().map(ReferenceExpression::getAttr).collect(Collectors.toSet()); sourceBuilder.fetchSource(projectsSet.toArray(new String[0]), new String[0]); } + @Override public void pushTypeMapping(Map typeMapping) { exprValueFactory.extendTypeMapping(typeMapping); } - private boolean isBoolFilterQuery(QueryBuilder current) { - return (current instanceof BoolQueryBuilder); - } - private boolean isSortByDocOnly() { List> sorts = sourceBuilder.sorts(); if (sorts != null) { diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequest.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequest.java index 4509e443c0a..8dceee99ee8 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequest.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequest.java @@ -11,7 +11,6 @@ import java.util.function.Function; import lombok.EqualsAndHashCode; import lombok.Getter; -import lombok.RequiredArgsConstructor; import lombok.Setter; import lombok.ToString; import org.opensearch.action.search.SearchRequest; @@ -34,7 +33,7 @@ public class OpenSearchScrollRequest implements OpenSearchRequest { /** Default scroll context timeout in minutes. */ - public static final TimeValue DEFAULT_SCROLL_TIMEOUT = TimeValue.timeValueMinutes(1L); + public static final TimeValue DEFAULT_SCROLL_TIMEOUT = TimeValue.timeValueMinutes(100L); /** * {@link OpenSearchRequest.IndexName}. @@ -51,8 +50,11 @@ public class OpenSearchScrollRequest implements OpenSearchRequest { * multi-thread so this state has to be maintained here. */ @Setter + @Getter private String scrollId; + private boolean needClean = false; + /** Search request source builder. 
*/ private final SearchSourceBuilder sourceBuilder; @@ -81,21 +83,26 @@ public OpenSearchScrollRequest(IndexName indexName, public OpenSearchResponse search(Function searchAction, Function scrollAction) { SearchResponse openSearchResponse; - if (isScrollStarted()) { + if (isScroll()) { openSearchResponse = scrollAction.apply(scrollRequest()); } else { openSearchResponse = searchAction.apply(searchRequest()); } - setScrollId(openSearchResponse.getScrollId()); - return new OpenSearchResponse(openSearchResponse, exprValueFactory); + var response = new OpenSearchResponse(openSearchResponse, exprValueFactory); + if (!(needClean = response.isEmpty())) { + setScrollId(openSearchResponse.getScrollId()); + } + return response; } @Override public void clean(Consumer cleanAction) { try { - if (isScrollStarted()) { + // clean on the last page only, to prevent closing the scroll/cursor in the middle of paging. + if (needClean && isScroll()) { cleanAction.accept(getScrollId()); + setScrollId(null); } } finally { reset(); @@ -119,8 +126,8 @@ public SearchRequest searchRequest() { * * @return true if scroll started */ - public boolean isScrollStarted() { - return (scrollId != null); + public boolean isScroll() { + return scrollId != null; } /** @@ -140,4 +147,13 @@ public SearchScrollRequest scrollRequest() { public void reset() { scrollId = null; } + + /** + * Convert a scroll request to string that can be included in a cursor. + * @return a string representing the scroll request. 
+ */ + @Override + public String toCursor() { + return scrollId; + } } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/PagedRequestBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/PagedRequestBuilder.java new file mode 100644 index 00000000000..69309bd7c9b --- /dev/null +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/PagedRequestBuilder.java @@ -0,0 +1,12 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.opensearch.request; + +public abstract class PagedRequestBuilder implements PushDownRequestBuilder { + public abstract OpenSearchRequest build(); + + public abstract OpenSearchRequest.IndexName getIndexName(); +} diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilder.java new file mode 100644 index 00000000000..ab1805ce4e8 --- /dev/null +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilder.java @@ -0,0 +1,63 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.opensearch.request; + +import java.util.List; +import java.util.Map; +import java.util.Set; +import lombok.Getter; +import org.apache.commons.lang3.tuple.Pair; +import org.opensearch.index.query.BoolQueryBuilder; +import org.opensearch.index.query.QueryBuilder; +import org.opensearch.search.aggregations.AggregationBuilder; +import org.opensearch.search.sort.SortBuilder; +import org.opensearch.sql.ast.expression.Literal; +import org.opensearch.sql.data.type.ExprType; +import org.opensearch.sql.expression.ReferenceExpression; +import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; +import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; +import 
org.opensearch.sql.opensearch.response.agg.OpenSearchAggregationResponseParser; + +public interface PushDownRequestBuilder { + + default boolean isBoolFilterQuery(QueryBuilder current) { + return (current instanceof BoolQueryBuilder); + } + + private String throwUnsupported(String operation) { + return String.format("%s: push down %s in cursor requests is not supported", + getClass().getSimpleName(), operation); + } + + default void pushDownFilter(QueryBuilder query) { + throw new UnsupportedOperationException(throwUnsupported("filter")); + } + + default void pushDownAggregation( + Pair, OpenSearchAggregationResponseParser> aggregationBuilder) { + throw new UnsupportedOperationException(throwUnsupported("aggregation")); + } + + default void pushDownSort(List> sortBuilders) { + throw new UnsupportedOperationException(throwUnsupported("sort")); + } + + default void pushDownLimit(Integer limit, Integer offset) { + throw new UnsupportedOperationException(throwUnsupported("limit")); + } + + default void pushDownHighlight(String field, Map arguments) { + throw new UnsupportedOperationException(throwUnsupported("highlight")); + } + + default void pushDownProjects(Set projects) { + throw new UnsupportedOperationException(throwUnsupported("projects")); + } + + default void pushTypeMapping(Map typeMapping) { + throw new UnsupportedOperationException(throwUnsupported("type mapping")); + } +} \ No newline at end of file diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/response/OpenSearchResponse.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/response/OpenSearchResponse.java index aadd73efdde..61d4459a862 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/response/OpenSearchResponse.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/response/OpenSearchResponse.java @@ -39,13 +39,13 @@ public class OpenSearchResponse implements Iterable { private final Aggregations aggregations; /** - * 
ElasticsearchExprValueFactory used to build ExprValue from search result. + * OpenSearchExprValueFactory used to build ExprValue from search result. */ @EqualsAndHashCode.Exclude private final OpenSearchExprValueFactory exprValueFactory; /** - * Constructor of ElasticsearchResponse. + * Constructor of OpenSearchResponse. */ public OpenSearchResponse(SearchResponse searchResponse, OpenSearchExprValueFactory exprValueFactory) { @@ -55,7 +55,7 @@ public OpenSearchResponse(SearchResponse searchResponse, } /** - * Constructor of ElasticsearchResponse with SearchHits. + * Constructor of OpenSearchResponse with SearchHits. */ public OpenSearchResponse(SearchHits hits, OpenSearchExprValueFactory exprValueFactory) { this.hits = hits; @@ -73,6 +73,10 @@ public boolean isEmpty() { return (hits.getHits() == null) || (hits.getHits().length == 0) && aggregations == null; } + public long getTotalHits() { + return hits.getTotalHits().value; + } + public boolean isAggregationResponse() { return aggregations != null; } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchIndex.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchIndex.java index 9ed8adf3eeb..288bb6006a6 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchIndex.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchIndex.java @@ -19,9 +19,14 @@ import org.opensearch.sql.opensearch.planner.physical.ADOperator; import org.opensearch.sql.opensearch.planner.physical.MLCommonsOperator; import org.opensearch.sql.opensearch.planner.physical.MLOperator; +import org.opensearch.sql.opensearch.request.InitialPageRequestBuilder; import org.opensearch.sql.opensearch.request.OpenSearchRequest; +import org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder; import org.opensearch.sql.opensearch.request.system.OpenSearchDescribeIndexRequest; +import 
org.opensearch.sql.opensearch.storage.scan.OpenSearchIndexScan; import org.opensearch.sql.opensearch.storage.scan.OpenSearchIndexScanBuilder; +import org.opensearch.sql.opensearch.storage.scan.OpenSearchPagedIndexScan; +import org.opensearch.sql.opensearch.storage.scan.OpenSearchPagedIndexScanBuilder; import org.opensearch.sql.planner.DefaultImplementor; import org.opensearch.sql.planner.logical.LogicalAD; import org.opensearch.sql.planner.logical.LogicalML; @@ -151,11 +156,21 @@ public LogicalPlan optimize(LogicalPlan plan) { @Override public TableScanBuilder createScanBuilder() { - OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, settings, indexName, - getMaxResultWindow(), new OpenSearchExprValueFactory(getFieldOpenSearchTypes())); + var requestBuilder = new OpenSearchRequestBuilder(indexName, getMaxResultWindow(), + settings, new OpenSearchExprValueFactory(getFieldOpenSearchTypes())); + OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, requestBuilder); + return new OpenSearchIndexScanBuilder(indexScan); } + @Override + public TableScanBuilder createPagedScanBuilder(int pageSize) { + var requestBuilder = new InitialPageRequestBuilder(indexName, pageSize, + new OpenSearchExprValueFactory(getFieldOpenSearchTypes())); + var indexScan = new OpenSearchPagedIndexScan(client, requestBuilder); + return new OpenSearchPagedIndexScanBuilder(indexScan); + } + @VisibleForTesting @RequiredArgsConstructor public static class OpenSearchDefaultImplementor diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngine.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngine.java index 4a3393abc94..a5f5f372ada 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngine.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngine.java @@ -12,9 +12,14 @@ import org.opensearch.sql.DataSourceSchemaName; 
import org.opensearch.sql.common.setting.Settings; import org.opensearch.sql.opensearch.client.OpenSearchClient; +import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; +import org.opensearch.sql.opensearch.request.ContinuePageRequestBuilder; +import org.opensearch.sql.opensearch.request.OpenSearchRequest; +import org.opensearch.sql.opensearch.storage.scan.OpenSearchPagedIndexScan; import org.opensearch.sql.opensearch.storage.system.OpenSearchSystemIndex; import org.opensearch.sql.storage.StorageEngine; import org.opensearch.sql.storage.Table; +import org.opensearch.sql.storage.TableScanOperator; /** OpenSearch storage engine implementation. */ @RequiredArgsConstructor @@ -33,4 +38,15 @@ public Table getTable(DataSourceSchemaName dataSourceSchemaName, String name) { return new OpenSearchIndex(client, settings, name); } } + + @Override + public TableScanOperator getTableScan(String indexName, String scrollId) { + // TODO call `getTable` here? + var index = new OpenSearchIndex(client, settings, indexName); + var requestBuilder = new ContinuePageRequestBuilder( + new OpenSearchRequest.IndexName(indexName), + scrollId, + new OpenSearchExprValueFactory(index.getFieldOpenSearchTypes())); + return new OpenSearchPagedIndexScan(client, requestBuilder); + } } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexScan.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScan.java similarity index 75% rename from opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexScan.java rename to opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScan.java index e9746e1fae1..27529bdffd2 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexScan.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScan.java @@ -4,7 +4,7 @@ */ -package 
org.opensearch.sql.opensearch.storage; +package org.opensearch.sql.opensearch.storage.scan; import java.util.Collections; import java.util.Iterator; @@ -52,25 +52,9 @@ public class OpenSearchIndexScan extends TableScanOperator { /** Search response for current batch. */ private Iterator iterator; - /** - * Constructor. - */ - public OpenSearchIndexScan(OpenSearchClient client, Settings settings, - String indexName, Integer maxResultWindow, - OpenSearchExprValueFactory exprValueFactory) { - this(client, settings, - new OpenSearchRequest.IndexName(indexName),maxResultWindow, exprValueFactory); - } - - /** - * Constructor. - */ - public OpenSearchIndexScan(OpenSearchClient client, Settings settings, - OpenSearchRequest.IndexName indexName, Integer maxResultWindow, - OpenSearchExprValueFactory exprValueFactory) { + public OpenSearchIndexScan(OpenSearchClient client, OpenSearchRequestBuilder builder) { this.client = client; - this.requestBuilder = new OpenSearchRequestBuilder( - indexName, maxResultWindow, settings,exprValueFactory); + this.requestBuilder = builder; } @Override @@ -99,6 +83,12 @@ public ExprValue next() { return iterator.next(); } + @Override + public long getTotalHits() { + // ignore response.getTotalHits(), because response returns entire index, regardless of LIMIT + return queryCount; + } + private void fetchNextBatch() { OpenSearchResponse response = client.search(request); if (!response.isEmpty()) { diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanAggregationBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanAggregationBuilder.java index e52fc566cd6..4571961e5fe 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanAggregationBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanAggregationBuilder.java @@ -15,10 +15,9 @@ import 
org.opensearch.sql.expression.NamedExpression; import org.opensearch.sql.expression.ReferenceExpression; import org.opensearch.sql.expression.aggregation.NamedAggregator; +import org.opensearch.sql.expression.serialization.DefaultExpressionSerializer; import org.opensearch.sql.opensearch.response.agg.OpenSearchAggregationResponseParser; -import org.opensearch.sql.opensearch.storage.OpenSearchIndexScan; import org.opensearch.sql.opensearch.storage.script.aggregation.AggregationQueryBuilder; -import org.opensearch.sql.opensearch.storage.serialization.DefaultExpressionSerializer; import org.opensearch.sql.planner.logical.LogicalAggregation; import org.opensearch.sql.planner.logical.LogicalSort; import org.opensearch.sql.storage.TableScanOperator; diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanBuilder.java index d7483cfcf06..41edbfc768a 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanBuilder.java @@ -8,7 +8,6 @@ import com.google.common.annotations.VisibleForTesting; import lombok.EqualsAndHashCode; import org.opensearch.sql.expression.ReferenceExpression; -import org.opensearch.sql.opensearch.storage.OpenSearchIndexScan; import org.opensearch.sql.planner.logical.LogicalAggregation; import org.opensearch.sql.planner.logical.LogicalFilter; import org.opensearch.sql.planner.logical.LogicalHighlight; diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanQueryBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanQueryBuilder.java index 7190d580002..f2e5139d01d 100644 --- 
a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanQueryBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanQueryBuilder.java @@ -20,10 +20,9 @@ import org.opensearch.sql.expression.ExpressionNodeVisitor; import org.opensearch.sql.expression.NamedExpression; import org.opensearch.sql.expression.ReferenceExpression; -import org.opensearch.sql.opensearch.storage.OpenSearchIndexScan; +import org.opensearch.sql.expression.serialization.DefaultExpressionSerializer; import org.opensearch.sql.opensearch.storage.script.filter.FilterQueryBuilder; import org.opensearch.sql.opensearch.storage.script.sort.SortQueryBuilder; -import org.opensearch.sql.opensearch.storage.serialization.DefaultExpressionSerializer; import org.opensearch.sql.planner.logical.LogicalFilter; import org.opensearch.sql.planner.logical.LogicalHighlight; import org.opensearch.sql.planner.logical.LogicalLimit; @@ -62,7 +61,7 @@ public boolean pushDownFilter(LogicalFilter filter) { FilterQueryBuilder queryBuilder = new FilterQueryBuilder( new DefaultExpressionSerializer()); QueryBuilder query = queryBuilder.build(filter.getCondition()); - indexScan.getRequestBuilder().pushDown(query); + indexScan.getRequestBuilder().pushDownFilter(query); return true; } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScan.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScan.java new file mode 100644 index 00000000000..e9d3fd52d39 --- /dev/null +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScan.java @@ -0,0 +1,84 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.opensearch.storage.scan; + +import java.util.Collections; +import java.util.Iterator; +import lombok.EqualsAndHashCode; +import lombok.ToString; +import 
org.apache.commons.lang3.NotImplementedException; +import org.opensearch.sql.data.model.ExprValue; +import org.opensearch.sql.opensearch.client.OpenSearchClient; +import org.opensearch.sql.opensearch.request.OpenSearchRequest; +import org.opensearch.sql.opensearch.request.PagedRequestBuilder; +import org.opensearch.sql.opensearch.response.OpenSearchResponse; +import org.opensearch.sql.storage.TableScanOperator; + +@EqualsAndHashCode(onlyExplicitlyIncluded = true, callSuper = false) +@ToString(onlyExplicitlyIncluded = true) +public class OpenSearchPagedIndexScan extends TableScanOperator { + private final OpenSearchClient client; + private final PagedRequestBuilder requestBuilder; + @EqualsAndHashCode.Include + @ToString.Include + private OpenSearchRequest request; + private Iterator iterator; + private long totalHits = 0; + + public OpenSearchPagedIndexScan(OpenSearchClient client, + PagedRequestBuilder requestBuilder) { + this.client = client; + this.requestBuilder = requestBuilder; + } + + @Override + public String explain() { + throw new NotImplementedException("Implement OpenSearchPagedIndexScan.explain"); + } + + @Override + public boolean hasNext() { + return iterator.hasNext(); + } + + @Override + public ExprValue next() { + return iterator.next(); + } + + @Override + public void open() { + super.open(); + request = requestBuilder.build(); + OpenSearchResponse response = client.search(request); + if (!response.isEmpty()) { + iterator = response.iterator(); + totalHits = response.getTotalHits(); + } else { + iterator = Collections.emptyIterator(); + } + } + + @Override + public void close() { + super.close(); + client.cleanup(request); + } + + @Override + public long getTotalHits() { + return totalHits; + } + + @Override + public String toCursor() { + // TODO this assumes exactly one index is scanned. + var indexName = requestBuilder.getIndexName().getIndexNames()[0]; + var cursor = request.toCursor(); + return cursor == null || cursor.isEmpty() + ? 
"" : createSection("OpenSearchPagedIndexScan", indexName, cursor); + } +} diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScanBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScanBuilder.java new file mode 100644 index 00000000000..779df4ebec9 --- /dev/null +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScanBuilder.java @@ -0,0 +1,29 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.opensearch.storage.scan; + +import lombok.EqualsAndHashCode; +import org.opensearch.sql.storage.TableScanOperator; +import org.opensearch.sql.storage.read.TableScanBuilder; + +/** + * Builder for a paged OpenSearch request. + * Override pushDown* methods from TableScanBuilder as more features + * support pagination. + */ +public class OpenSearchPagedIndexScanBuilder extends TableScanBuilder { + @EqualsAndHashCode.Include + OpenSearchPagedIndexScan indexScan; + + public OpenSearchPagedIndexScanBuilder(OpenSearchPagedIndexScan indexScan) { + this.indexScan = indexScan; + } + + @Override + public TableScanOperator build() { + return indexScan; + } +} diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/ExpressionScriptEngine.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/ExpressionScriptEngine.java index 855aae645d2..9e8b47f6b05 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/ExpressionScriptEngine.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/ExpressionScriptEngine.java @@ -16,9 +16,9 @@ import org.opensearch.script.ScriptContext; import org.opensearch.script.ScriptEngine; import org.opensearch.sql.expression.Expression; +import org.opensearch.sql.expression.serialization.ExpressionSerializer; import 
org.opensearch.sql.opensearch.storage.script.aggregation.ExpressionAggregationScriptFactory; import org.opensearch.sql.opensearch.storage.script.filter.ExpressionFilterScriptFactory; -import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; /** * Custom expression script engine that supports using core engine expression code in DSL diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/AggregationQueryBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/AggregationQueryBuilder.java index 1efa5b65d5f..bc9741dee51 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/AggregationQueryBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/AggregationQueryBuilder.java @@ -24,13 +24,12 @@ import org.opensearch.search.aggregations.bucket.missing.MissingOrder; import org.opensearch.search.sort.SortOrder; import org.opensearch.sql.ast.tree.Sort; -import org.opensearch.sql.data.type.ExprCoreType; -import org.opensearch.sql.data.type.ExprType; import org.opensearch.sql.expression.Expression; import org.opensearch.sql.expression.ExpressionNodeVisitor; import org.opensearch.sql.expression.NamedExpression; import org.opensearch.sql.expression.ReferenceExpression; import org.opensearch.sql.expression.aggregation.NamedAggregator; +import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; import org.opensearch.sql.opensearch.response.agg.CompositeAggregationParser; import org.opensearch.sql.opensearch.response.agg.MetricParser; @@ -38,7 +37,6 @@ import org.opensearch.sql.opensearch.response.agg.OpenSearchAggregationResponseParser; import org.opensearch.sql.opensearch.storage.script.aggregation.dsl.BucketAggregationBuilder; import 
org.opensearch.sql.opensearch.storage.script.aggregation.dsl.MetricAggregationBuilder; -import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; /** * Build the AggregationBuilder from the list of {@link NamedAggregator} diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/AggregationBuilderHelper.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/AggregationBuilderHelper.java index 156b565976a..83dd9276326 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/AggregationBuilderHelper.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/AggregationBuilderHelper.java @@ -17,8 +17,8 @@ import org.opensearch.sql.expression.FunctionExpression; import org.opensearch.sql.expression.LiteralExpression; import org.opensearch.sql.expression.ReferenceExpression; +import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.opensearch.data.type.OpenSearchTextType; -import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; /** * Abstract Aggregation Builder. 
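The `PushDownRequestBuilder` interface introduced earlier in this patch relies on `default` methods that throw `UnsupportedOperationException`: a builder for cursor (subsequent-page) requests overrides only the push-downs it can honor, and every unsupported push-down fails loudly instead of being silently dropped. A minimal, dependency-free sketch of that pattern follows — the `PushDownBuilder`, `ContinuePageBuilder`, and `InitialBuilder` names are illustrative, not the actual classes, and private interface methods require Java 9+:

```java
// Sketch of the push-down interface pattern from this patch: every push-down
// has a throwing default, so implementors opt in per operation.
interface PushDownBuilder {
    // Builds the error message; mirrors the patch's message format.
    private String unsupported(String operation) {
        return String.format("%s: push down %s in cursor requests is not supported",
                getClass().getSimpleName(), operation);
    }

    default void pushDownLimit(int limit, int offset) {
        throw new UnsupportedOperationException(unsupported("limit"));
    }

    default void pushDownFilter(String query) {
        throw new UnsupportedOperationException(unsupported("filter"));
    }
}

// Subsequent-page (cursor) builder: supports no push-downs at all,
// so it implements nothing beyond the interface defaults.
class ContinuePageBuilder implements PushDownBuilder {
}

// Initial-request builder: overrides only what it can push down.
class InitialBuilder implements PushDownBuilder {
    int querySize = -1;

    @Override
    public void pushDownLimit(int limit, int offset) {
        this.querySize = limit;
    }
}
```

Calling `pushDownFilter` on a `ContinuePageBuilder` then surfaces `ContinuePageBuilder: push down filter in cursor requests is not supported`, so a plan that tries to push an unsupported operator into a cursor request fails at build time rather than returning wrong pages.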
diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/BucketAggregationBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/BucketAggregationBuilder.java index 1a6a82be966..215be3b3565 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/BucketAggregationBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/BucketAggregationBuilder.java @@ -23,8 +23,8 @@ import org.opensearch.search.sort.SortOrder; import org.opensearch.sql.ast.expression.SpanUnit; import org.opensearch.sql.expression.NamedExpression; +import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.expression.span.SpanExpression; -import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; /** * Bucket Aggregation Builder. diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/MetricAggregationBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/MetricAggregationBuilder.java index 5e7d34abce0..db8d1fdf1eb 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/MetricAggregationBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/MetricAggregationBuilder.java @@ -25,13 +25,13 @@ import org.opensearch.sql.expression.LiteralExpression; import org.opensearch.sql.expression.ReferenceExpression; import org.opensearch.sql.expression.aggregation.NamedAggregator; +import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.opensearch.response.agg.FilterParser; import org.opensearch.sql.opensearch.response.agg.MetricParser; import org.opensearch.sql.opensearch.response.agg.SingleValueParser; import 
org.opensearch.sql.opensearch.response.agg.StatsParser; import org.opensearch.sql.opensearch.response.agg.TopHitsParser; import org.opensearch.sql.opensearch.storage.script.filter.FilterQueryBuilder; -import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; /** * Build the Metric Aggregation and List of {@link MetricParser} from {@link NamedAggregator}. diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/filter/FilterQueryBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/filter/FilterQueryBuilder.java index 5f36954d4a7..a82869ec038 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/filter/FilterQueryBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/filter/FilterQueryBuilder.java @@ -24,6 +24,7 @@ import org.opensearch.sql.expression.FunctionExpression; import org.opensearch.sql.expression.function.BuiltinFunctionName; import org.opensearch.sql.expression.function.FunctionName; +import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.opensearch.storage.script.filter.lucene.LikeQuery; import org.opensearch.sql.opensearch.storage.script.filter.lucene.LuceneQuery; import org.opensearch.sql.opensearch.storage.script.filter.lucene.RangeQuery; @@ -38,7 +39,6 @@ import org.opensearch.sql.opensearch.storage.script.filter.lucene.relevance.QueryStringQuery; import org.opensearch.sql.opensearch.storage.script.filter.lucene.relevance.SimpleQueryStringQuery; import org.opensearch.sql.opensearch.storage.script.filter.lucene.relevance.WildcardQuery; -import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; @RequiredArgsConstructor public class FilterQueryBuilder extends ExpressionNodeVisitor { diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/system/OpenSearchSystemIndexScan.java 
b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/system/OpenSearchSystemIndexScan.java index eb4cb865e22..eba5eb126de 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/system/OpenSearchSystemIndexScan.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/system/OpenSearchSystemIndexScan.java @@ -31,9 +31,13 @@ public class OpenSearchSystemIndexScan extends TableScanOperator { */ private Iterator iterator; + private long totalHits = 0; + @Override public void open() { - iterator = request.search().iterator(); + var response = request.search(); + totalHits = response.size(); + iterator = response.iterator(); } @Override @@ -46,6 +50,11 @@ public ExprValue next() { return iterator.next(); } + @Override + public long getTotalHits() { + return totalHits; + } + @Override public String explain() { return request.toString(); diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchNodeClientTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchNodeClientTest.java index 1c79a28f3fa..77872296031 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchNodeClientTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchNodeClientTest.java @@ -34,8 +34,12 @@ import java.util.List; import java.util.Map; import java.util.concurrent.atomic.AtomicBoolean; +import lombok.SneakyThrows; +import org.apache.commons.lang3.reflect.FieldUtils; import org.apache.lucene.search.TotalHits; import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.InOrder; @@ -76,6 +80,7 @@ import org.opensearch.sql.opensearch.response.OpenSearchResponse; @ExtendWith(MockitoExtension.class) 
+@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class)
 class OpenSearchNodeClientTest {
 
   private static final String TEST_MAPPING_FILE = "mappings/accounts.json";
@@ -107,7 +112,7 @@ void setUp() {
   }
 
   @Test
-  void isIndexExist() {
+  void is_index_exist() {
     when(nodeClient.admin().indices()
         .exists(any(IndicesExistsRequest.class)).actionGet())
         .thenReturn(new IndicesExistsResponse(true));
@@ -116,7 +121,7 @@ void isIndexExist() {
   }
 
   @Test
-  void isIndexNotExist() {
+  void is_index_not_exist() {
     String indexName = "test";
     when(nodeClient.admin().indices()
         .exists(any(IndicesExistsRequest.class)).actionGet())
@@ -126,14 +131,14 @@ void isIndexNotExist() {
   }
 
   @Test
-  void isIndexExistWithException() {
+  void is_index_exist_with_exception() {
     when(nodeClient.admin().indices().exists(any())).thenThrow(RuntimeException.class);
 
    assertThrows(IllegalStateException.class, () -> client.exists("test"));
   }
 
   @Test
-  void createIndex() {
+  void create_index() {
     String indexName = "test";
     Map<String, Object> mappings = ImmutableMap.of(
         "properties",
@@ -146,7 +151,7 @@ void createIndex() {
   }
 
   @Test
-  void createIndexWithException() {
+  void create_index_with_exception() {
     when(nodeClient.admin().indices().create(any())).thenThrow(RuntimeException.class);
 
    assertThrows(IllegalStateException.class,
@@ -154,7 +159,7 @@ void createIndexWithException() {
   }
 
   @Test
-  void getIndexMappings() throws IOException {
+  void get_index_mappings() throws IOException {
     URL url = Resources.getResource(TEST_MAPPING_FILE);
     String mappings = Resources.toString(url, Charsets.UTF_8);
     String indexName = "test";
@@ -225,7 +230,7 @@ void getIndexMappings() throws IOException {
   }
 
   @Test
-  void getIndexMappingsWithEmptyMapping() {
+  void get_index_mappings_with_empty_mapping() {
     String indexName = "test";
     mockNodeClientIndicesMappings(indexName, "");
     Map<String, IndexMapping> indexMappings = client.getIndexMappings(indexName);
@@ -236,7 +241,7 @@
   }
 
   @Test
-  void getIndexMappingsWithIOException() {
+  void get_index_mappings_with_IOException() {
     String indexName = "test";
     when(nodeClient.admin().indices()).thenThrow(RuntimeException.class);
 
@@ -244,7 +249,7 @@
   }
 
   @Test
-  void getIndexMappingsWithNonExistIndex() {
+  void get_index_mappings_with_non_exist_index() {
     when(nodeClient.admin().indices()
         .prepareGetMappings(any())
         .setLocal(anyBoolean())
@@ -255,7 +260,7 @@
   }
 
   @Test
-  void getIndexMaxResultWindows() throws IOException {
+  void get_index_max_result_windows() throws IOException {
     URL url = Resources.getResource(TEST_MAPPING_SETTINGS_FILE);
     String indexMetadata = Resources.toString(url, Charsets.UTF_8);
     String indexName = "accounts";
@@ -269,7 +274,7 @@
   }
 
   @Test
-  void getIndexMaxResultWindowsWithDefaultSettings() throws IOException {
+  void get_index_max_result_windows_with_default_settings() throws IOException {
     URL url = Resources.getResource(TEST_MAPPING_FILE);
     String indexMetadata = Resources.toString(url, Charsets.UTF_8);
     String indexName = "accounts";
@@ -283,7 +288,7 @@
   }
 
   @Test
-  void getIndexMaxResultWindowsWithIOException() {
+  void get_index_max_result_windows_with_IOException() {
     String indexName = "test";
     when(nodeClient.admin().indices()).thenThrow(RuntimeException.class);
 
@@ -292,7 +297,7 @@
   }
 
   /** Jacoco enforce this constant lambda be tested. */
   @Test
-  void testAllFieldsPredicate() {
+  void test_all_fields_predicate() {
     assertTrue(OpenSearchNodeClient.ALL_FIELDS.apply("any_index").test("any_field"));
   }
 
@@ -314,7 +319,6 @@ void search() {
     // Mock second scroll request followed
     SearchResponse scrollResponse = mock(SearchResponse.class);
     when(nodeClient.searchScroll(any()).actionGet()).thenReturn(scrollResponse);
-    when(scrollResponse.getScrollId()).thenReturn("scroll456");
     when(scrollResponse.getHits()).thenReturn(SearchHits.empty());
 
     // Verify response for first scroll request
@@ -328,6 +332,7 @@
     assertFalse(hits.hasNext());
 
     // Verify response for second scroll request
+    request.setScrollId("scroll123");
     OpenSearchResponse response2 = client.search(request);
     assertTrue(response2.isEmpty());
   }
@@ -343,6 +348,7 @@ void schedule() {
   }
 
   @Test
+  @SneakyThrows
   void cleanup() {
     ClearScrollRequestBuilder requestBuilder = mock(ClearScrollRequestBuilder.class);
     when(nodeClient.prepareClearScroll()).thenReturn(requestBuilder);
@@ -351,8 +357,10 @@
     OpenSearchScrollRequest request = new OpenSearchScrollRequest("test", factory);
     request.setScrollId("scroll123");
+    // Enforce cleaning by setting a private field.
+    FieldUtils.writeField(request, "needClean", true, true);
 
     client.cleanup(request);
-    assertFalse(request.isScrollStarted());
+    assertFalse(request.isScroll());
 
     InOrder inOrder = Mockito.inOrder(nodeClient, requestBuilder);
     inOrder.verify(nodeClient).prepareClearScroll();
@@ -361,14 +369,26 @@
   }
 
   @Test
-  void cleanupWithoutScrollId() {
+  void cleanup_without_scrollId() {
     OpenSearchScrollRequest request = new OpenSearchScrollRequest("test", factory);
     client.cleanup(request);
     verify(nodeClient, never()).prepareClearScroll();
   }
 
   @Test
-  void getIndices() {
+  @SneakyThrows
+  void cleanup_rethrows_exception() {
+    when(nodeClient.prepareClearScroll()).thenThrow(new RuntimeException());
+
+    OpenSearchScrollRequest request = new OpenSearchScrollRequest("test", factory);
+    request.setScrollId("scroll123");
+    // Enforce cleaning by setting a private field.
+    FieldUtils.writeField(request, "needClean", true, true);
+    assertThrows(IllegalStateException.class, () -> client.cleanup(request));
+  }
+
+  @Test
+  void get_indices() {
     AliasMetadata aliasMetadata = mock(AliasMetadata.class);
     ImmutableOpenMap.Builder<String, List<AliasMetadata>> builder = ImmutableOpenMap.builder();
     builder.fPut("index", Arrays.asList(aliasMetadata));
diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchRestClientTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchRestClientTest.java
index f2da6fd1e07..b8920e52a66 100644
--- a/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchRestClientTest.java
+++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchRestClientTest.java
@@ -30,8 +30,12 @@
 import java.util.List;
 import java.util.Map;
 import java.util.concurrent.atomic.AtomicBoolean;
+import lombok.SneakyThrows;
+import org.apache.commons.lang3.reflect.FieldUtils;
 import org.apache.lucene.search.TotalHits;
 import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.DisplayNameGeneration;
+import org.junit.jupiter.api.DisplayNameGenerator;
 import org.junit.jupiter.api.Test;
 import org.junit.jupiter.api.extension.ExtendWith;
 import org.mockito.Mock;
@@ -68,6 +72,7 @@
 import org.opensearch.sql.opensearch.response.OpenSearchResponse;
 
 @ExtendWith(MockitoExtension.class)
+@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class)
 class OpenSearchRestClientTest {
 
   private static final String TEST_MAPPING_FILE = "mappings/accounts.json";
@@ -95,7 +100,7 @@ void setUp() {
   }
 
   @Test
-  void isIndexExist() throws IOException {
+  void is_index_exist() throws IOException {
     when(restClient.indices()
         .exists(any(), any())) // use any() because missing equals() in GetIndexRequest
         .thenReturn(true);
@@ -104,7 +109,7 @@ void isIndexExist() throws IOException {
   }
 
   @Test
-  void isIndexNotExist() throws IOException {
+  void is_index_not_exist() throws IOException {
     when(restClient.indices()
         .exists(any(), any())) // use any() because missing equals() in GetIndexRequest
         .thenReturn(false);
@@ -113,14 +118,14 @@ void isIndexNotExist() throws IOException {
   }
 
   @Test
-  void isIndexExistWithException() throws IOException {
+  void is_index_exist_with_exception() throws IOException {
     when(restClient.indices().exists(any(), any())).thenThrow(IOException.class);
 
    assertThrows(IllegalStateException.class, () -> client.exists("test"));
   }
 
   @Test
-  void createIndex() throws IOException {
+  void create_index() throws IOException {
     String indexName = "test";
     Map<String, Object> mappings = ImmutableMap.of(
         "properties",
@@ -133,7 +138,7 @@ void createIndex() throws IOException {
   }
 
   @Test
-  void createIndexWithIOException() throws IOException {
+  void create_index_with_IOException() throws IOException {
     when(restClient.indices().create(any(), any())).thenThrow(IOException.class);
 
    assertThrows(IllegalStateException.class,
@@ -141,7 +146,7 @@ void createIndexWithIOException() throws IOException {
   }
 
   @Test
-  void getIndexMappings() throws IOException {
+  void get_index_mappings() throws IOException {
     URL url = Resources.getResource(TEST_MAPPING_FILE);
     String mappings = Resources.toString(url, Charsets.UTF_8);
     String indexName = "test";
@@ -216,14 +221,14 @@ void getIndexMappings() throws IOException {
   }
 
   @Test
-  void getIndexMappingsWithIOException() throws IOException {
+  void get_index_mappings_with_IOException() throws IOException {
     when(restClient.indices().getMapping(any(GetMappingsRequest.class), any()))
         .thenThrow(new IOException());
 
    assertThrows(IllegalStateException.class, () -> client.getIndexMappings("test"));
   }
 
   @Test
-  void getIndexMaxResultWindowsSettings() throws IOException {
+  void get_index_max_result_windows_settings() throws IOException {
     String indexName = "test";
     Integer maxResultWindow = 1000;
 
@@ -247,7 +252,7 @@ void getIndexMaxResultWindowsSettings() throws IOException {
   }
 
   @Test
-  void getIndexMaxResultWindowsDefaultSettings() throws IOException {
+  void get_index_max_result_windows_default_settings() throws IOException {
     String indexName = "test";
     Integer maxResultWindow = 10000;
 
@@ -271,7 +276,7 @@ void getIndexMaxResultWindowsDefaultSettings() throws IOException {
   }
 
   @Test
-  void getIndexMaxResultWindowsWithIOException() throws IOException {
+  void get_index_max_result_windows_with_IOException() throws IOException {
     when(restClient.indices().getSettings(any(GetSettingsRequest.class), any()))
         .thenThrow(new IOException());
     assertThrows(IllegalStateException.class, () -> client.getIndexMaxResultWindows("test"));
@@ -295,7 +300,6 @@ void search() throws IOException {
     // Mock second scroll request followed
     SearchResponse scrollResponse = mock(SearchResponse.class);
     when(restClient.scroll(any(), any())).thenReturn(scrollResponse);
-    when(scrollResponse.getScrollId()).thenReturn("scroll456");
     when(scrollResponse.getHits()).thenReturn(SearchHits.empty());
 
     // Verify response for first scroll request
@@ -309,12 +313,13 @@ void search() throws IOException {
     assertFalse(hits.hasNext());
 
     // Verify response for second scroll request
+    request.setScrollId("scroll123");
     OpenSearchResponse response2 = client.search(request);
     assertTrue(response2.isEmpty());
   }
 
   @Test
-  void searchWithIOException() throws IOException {
+  void search_with_IOException() throws IOException {
     when(restClient.search(any(), any())).thenThrow(new IOException());
     assertThrows(
         IllegalStateException.class,
@@ -322,7 +327,7 @@ void searchWithIOException() throws IOException {
   }
 
   @Test
-  void scrollWithIOException() throws IOException {
+  void scroll_with_IOException() throws IOException {
     // Mock first scroll request
     SearchResponse searchResponse = mock(SearchResponse.class);
     when(restClient.search(any(), any())).thenReturn(searchResponse);
@@ -355,32 +360,38 @@ void schedule() {
   }
 
   @Test
-  void cleanup() throws IOException {
+  @SneakyThrows
+  void cleanup() {
     OpenSearchScrollRequest request = new OpenSearchScrollRequest("test", factory);
+    // Enforce cleaning by setting a private field.
+    FieldUtils.writeField(request, "needClean", true, true);
     request.setScrollId("scroll123");
 
     client.cleanup(request);
     verify(restClient).clearScroll(any(), any());
-    assertFalse(request.isScrollStarted());
+    assertFalse(request.isScroll());
   }
 
   @Test
-  void cleanupWithoutScrollId() throws IOException {
+  void cleanup_without_scrollId() throws IOException {
     OpenSearchScrollRequest request = new OpenSearchScrollRequest("test", factory);
     client.cleanup(request);
     verify(restClient, never()).clearScroll(any(), any());
   }
 
   @Test
-  void cleanupWithIOException() throws IOException {
+  @SneakyThrows
+  void cleanup_with_IOException() {
     when(restClient.clearScroll(any(), any())).thenThrow(new IOException());
 
     OpenSearchScrollRequest request = new OpenSearchScrollRequest("test", factory);
+    // Enforce cleaning by setting a private field.
+    FieldUtils.writeField(request, "needClean", true, true);
     request.setScrollId("scroll123");
 
     assertThrows(IllegalStateException.class, () -> client.cleanup(request));
   }
 
   @Test
-  void getIndices() throws IOException {
+  void get_indices() throws IOException {
     when(restClient.indices().get(any(GetIndexRequest.class), any(RequestOptions.class)))
         .thenReturn(getIndexResponse);
     when(getIndexResponse.getIndices()).thenReturn(new String[] {"index"});
@@ -390,7 +401,7 @@
   }
 
   @Test
-  void getIndicesWithIOException() throws IOException {
+  void get_indices_with_IOException() throws IOException {
     when(restClient.indices().get(any(GetIndexRequest.class), any(RequestOptions.class)))
         .thenThrow(new IOException());
     assertThrows(IllegalStateException.class, () -> client.indices());
@@ -409,7 +420,7 @@ void meta() throws IOException {
   }
 
   @Test
-  void metaWithIOException() throws IOException {
+  void meta_with_IOException() throws IOException {
     when(restClient.cluster().getSettings(any(), any(RequestOptions.class)))
         .thenThrow(new IOException());
 
@@ -417,7 +428,7 @@ void metaWithIOException() throws IOException {
   }
 
   @Test
-  void mlWithException() {
+  void ml_with_exception() {
     assertThrows(UnsupportedOperationException.class, () -> client.getNodeClient());
   }
 
diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java
index 4a0c6e24f1c..b6b0269625a 100644
--- a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java
+++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java
@@ -25,11 +25,12 @@
 import java.util.Arrays;
 import java.util.Iterator;
 import java.util.List;
-import java.util.Map;
 import java.util.Optional;
 import java.util.concurrent.atomic.AtomicReference;
 import lombok.RequiredArgsConstructor;
 import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.DisplayNameGeneration;
+import org.junit.jupiter.api.DisplayNameGenerator;
 import org.junit.jupiter.api.Test;
 import org.junit.jupiter.api.extension.ExtendWith;
 import org.mockito.Mock;
@@ -40,15 +41,19 @@
 import org.opensearch.sql.executor.ExecutionContext;
 import org.opensearch.sql.executor.ExecutionEngine;
 import org.opensearch.sql.executor.ExecutionEngine.ExplainResponse;
+import org.opensearch.sql.executor.pagination.PaginatedPlanCache;
 import org.opensearch.sql.opensearch.client.OpenSearchClient;
 import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory;
 import org.opensearch.sql.opensearch.executor.protector.OpenSearchExecutionProtector;
-import org.opensearch.sql.opensearch.storage.OpenSearchIndexScan;
+import org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder;
+import org.opensearch.sql.opensearch.storage.scan.OpenSearchIndexScan;
+import org.opensearch.sql.planner.physical.PaginateOperator;
 import org.opensearch.sql.planner.physical.PhysicalPlan;
 import org.opensearch.sql.storage.TableScanOperator;
 import org.opensearch.sql.storage.split.Split;
 
 @ExtendWith(MockitoExtension.class)
+@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class)
 class OpenSearchExecutionEngineTest {
 
   @Mock private OpenSearchClient client;
@@ -75,14 +80,15 @@ void setUp() {
   }
 
   @Test
-  void executeSuccessfully() {
+  void execute_successfully() {
     List<ExprValue> expected =
         Arrays.asList(
             tupleValue(of("name", "John", "age", 20)), tupleValue(of("name", "Allen", "age", 30)));
     FakePhysicalPlan plan = new FakePhysicalPlan(expected.iterator());
     when(protector.protect(plan)).thenReturn(plan);
 
-    OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector);
+    OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector,
+        new PaginatedPlanCache(null));
     List<ExprValue> actual = new ArrayList<>();
     executor.execute(
         plan,
@@ -104,13 +110,43 @@ public void onFailure(Exception e) {
   }
 
   @Test
-  void executeWithFailure() {
+  void execute_with_cursor() {
+    List<ExprValue> expected =
+        Arrays.asList(
+            tupleValue(of("name", "John", "age", 20)), tupleValue(of("name", "Allen", "age", 30)));
+    FakePaginatePlan plan = new FakePaginatePlan(new FakePhysicalPlan(expected.iterator()), 10, 0);
+    when(protector.protect(plan)).thenReturn(plan);
+
+    OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector,
+        new PaginatedPlanCache(null));
+    List<ExprValue> actual = new ArrayList<>();
+    executor.execute(
+        plan,
+        new ResponseListener<QueryResponse>() {
+          @Override
+          public void onResponse(QueryResponse response) {
+            actual.addAll(response.getResults());
+            assertTrue(response.getCursor().toString().startsWith("n:"));
+          }
+
+          @Override
+          public void onFailure(Exception e) {
+            fail("Error occurred during execution", e);
+          }
+        });
+
+    assertEquals(expected, actual);
+  }
+
+  @Test
+  void execute_with_failure() {
     PhysicalPlan plan = mock(PhysicalPlan.class);
     RuntimeException expected = new RuntimeException("Execution error");
     when(plan.hasNext()).thenThrow(expected);
     when(protector.protect(plan)).thenReturn(plan);
 
-    OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector);
+    OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector,
+        new PaginatedPlanCache(null));
     AtomicReference<Exception> actual = new AtomicReference<>();
     executor.execute(
         plan,
@@ -130,12 +166,14 @@ public void onFailure(Exception e) {
   }
 
   @Test
-  void explainSuccessfully() {
-    OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector);
+  void explain_successfully() {
+    OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector,
+        new PaginatedPlanCache(null));
     Settings settings = mock(Settings.class);
     when(settings.getSettingValue(QUERY_SIZE_LIMIT)).thenReturn(100);
     PhysicalPlan plan = new OpenSearchIndexScan(mock(OpenSearchClient.class),
-        settings, "test", 10000, mock(OpenSearchExprValueFactory.class));
+        new OpenSearchRequestBuilder("test", 10000, settings,
+            mock(OpenSearchExprValueFactory.class)));
 
     AtomicReference<ExplainResponse> result = new AtomicReference<>();
     executor.explain(plan, new ResponseListener<ExplainResponse>() {
@@ -154,8 +192,9 @@ public void onFailure(Exception e) {
   }
 
   @Test
-  void explainWithFailure() {
-    OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector);
+  void explain_with_failure() {
+    OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector,
+        new PaginatedPlanCache(null));
     PhysicalPlan plan = mock(PhysicalPlan.class);
     when(plan.accept(any(), any())).thenThrow(IllegalStateException.class);
 
@@ -176,7 +215,7 @@ public void onFailure(Exception e) {
   }
 
   @Test
-  void callAddSplitAndOpenInOrder() {
+  void call_add_split_and_open_in_order() {
     List<ExprValue> expected =
         Arrays.asList(
             tupleValue(of("name", "John", "age", 20)), tupleValue(of("name", "Allen", "age", 30)));
@@ -184,7 +223,8 @@ void callAddSplitAndOpenInOrder() {
     when(protector.protect(plan)).thenReturn(plan);
     when(executionContext.getSplit()).thenReturn(Optional.of(split));
 
-    OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector);
+    OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector,
+        new PaginatedPlanCache(null));
     List<ExprValue> actual = new ArrayList<>();
     executor.execute(
         plan,
@@ -207,6 +247,54 @@ public void onFailure(Exception e) {
     assertTrue(plan.hasClosed);
   }
 
+  private static class FakePaginatePlan extends PaginateOperator {
+    private final PhysicalPlan input;
+    private final int pageSize;
+    private final int pageIndex;
+
+    public FakePaginatePlan(PhysicalPlan input, int pageSize, int pageIndex) {
+      super(input, pageSize, pageIndex);
+      this.input = input;
+      this.pageSize = pageSize;
+      this.pageIndex = pageIndex;
+    }
+
+    @Override
+    public void open() {
+      input.open();
+    }
+
+    @Override
+    public void close() {
+      input.close();
+    }
+
+    @Override
+    public void add(Split split) {
+      input.add(split);
+    }
+
+    @Override
+    public boolean hasNext() {
+      return input.hasNext();
+    }
+
+    @Override
+    public ExprValue next() {
+      return input.next();
+    }
+
+    @Override
+    public ExecutionEngine.Schema schema() {
+      return input.schema();
+    }
+
+    @Override
+    public String toCursor() {
+      return "FakePaginatePlan";
+    }
+  }
+
   @RequiredArgsConstructor
   private static class FakePhysicalPlan extends TableScanOperator {
     private final Iterator<ExprValue> it;
diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/ResourceMonitorPlanTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/ResourceMonitorPlanTest.java
index d4d987a7df2..7b1353f4a97 100644
--- a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/ResourceMonitorPlanTest.java
+++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/ResourceMonitorPlanTest.java
@@ -107,4 +107,16 @@ void acceptSuccess() {
     monitorPlan.accept(visitor, context);
     verify(plan, times(1)).accept(visitor, context);
   }
+
+  @Test
+  void getTotalHitsSuccess() {
+    monitorPlan.getTotalHits();
+    verify(plan, times(1)).getTotalHits();
+  }
+
+  @Test
+  void toCursorSuccess() {
+    monitorPlan.toCursor();
+    verify(plan, times(1)).toCursor();
+  }
 }
diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtectorTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtectorTest.java
index 857ff601e14..d0e486fae9c 100644
--- a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtectorTest.java
+++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtectorTest.java
@@ -57,8 +57,10 @@
 import org.opensearch.sql.opensearch.planner.physical.ADOperator;
 import org.opensearch.sql.opensearch.planner.physical.MLCommonsOperator;
 import org.opensearch.sql.opensearch.planner.physical.MLOperator;
+import org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder;
 import org.opensearch.sql.opensearch.setting.OpenSearchSettings;
-import org.opensearch.sql.opensearch.storage.OpenSearchIndexScan;
+import org.opensearch.sql.opensearch.storage.scan.OpenSearchIndexScan;
+import org.opensearch.sql.planner.physical.PaginateOperator;
 import org.opensearch.sql.planner.physical.PhysicalPlan;
 import org.opensearch.sql.planner.physical.PhysicalPlanDSL;
@@ -124,9 +126,11 @@ public void testProtectIndexScan() {
         PhysicalPlanDSL.agg(
             filter(
                 resourceMonitor(
-                    new OpenSearchIndexScan(
-                        client, settings, indexName,
-                        maxResultWindow, exprValueFactory)),
+                    new OpenSearchIndexScan(client,
+                        new OpenSearchRequestBuilder(indexName,
+                            maxResultWindow,
+                            settings,
+                            exprValueFactory))),
                 filterExpr),
             aggregators,
             groupByExprs),
@@ -152,9 +156,11 @@ public void testProtectIndexScan() {
         PhysicalPlanDSL.rename(
             PhysicalPlanDSL.agg(
                 filter(
-                    new OpenSearchIndexScan(
-                        client, settings, indexName,
-                        maxResultWindow, exprValueFactory),
+                    new OpenSearchIndexScan(client,
+                        new OpenSearchRequestBuilder(indexName,
+                            maxResultWindow,
+                            settings,
+                            exprValueFactory)),
                     filterExpr),
                 aggregators,
                 groupByExprs),
@@ -314,6 +320,13 @@ public void testVisitML() {
             executionProtector.visitML(mlOperator, null));
   }
 
+  @Test
+  public void visitPaginate() {
+    var paginate = new PaginateOperator(values(List.of()), 42);
+    assertEquals(executionProtector.protect(paginate),
+        executionProtector.visitPaginate(paginate, null));
+  }
+
   PhysicalPlan resourceMonitor(PhysicalPlan input) {
     return new ResourceMonitorPlan(input, resourceMonitor);
   }
diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilderTest.java
new file mode 100644
index 00000000000..d549ed9200f
--- /dev/null
+++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilderTest.java
@@ -0,0 +1,48 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.sql.opensearch.request;
+
+import static org.junit.jupiter.api.Assertions.assertEquals;
+
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.DisplayNameGeneration;
+import org.junit.jupiter.api.DisplayNameGenerator;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.mockito.Mock;
+import org.mockito.junit.jupiter.MockitoExtension;
+import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory;
+
+@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class)
+@ExtendWith(MockitoExtension.class)
+public class ContinuePageRequestBuilderTest {
+
+  @Mock
+  private OpenSearchExprValueFactory exprValueFactory;
+
+  private final OpenSearchRequest.IndexName indexName = new OpenSearchRequest.IndexName("test");
+  private final String scrollId = "scroll";
+
+  private ContinuePageRequestBuilder requestBuilder;
+
+  @BeforeEach
+  void setup() {
+    requestBuilder = new ContinuePageRequestBuilder(indexName, scrollId, exprValueFactory);
+  }
+
+  @Test
+  public void build() {
+    assertEquals(
+        new ContinuePageRequest(scrollId, exprValueFactory),
+        requestBuilder.build()
+    );
+  }
+
+  @Test
+  public void getIndexName() {
+    assertEquals(indexName, requestBuilder.getIndexName());
+  }
+}
diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestTest.java
new file mode 100644
index 00000000000..32a15f5e8c6
--- /dev/null
+++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestTest.java
@@ -0,0 +1,124 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.sql.opensearch.request;
+
+import static org.junit.jupiter.api.Assertions.assertAll;
+import static org.junit.jupiter.api.Assertions.assertEquals;
+import static org.junit.jupiter.api.Assertions.assertFalse;
+import static org.junit.jupiter.api.Assertions.assertNull;
+import static org.junit.jupiter.api.Assertions.assertSame;
+import static org.junit.jupiter.api.Assertions.assertThrows;
+import static org.junit.jupiter.api.Assertions.assertTrue;
+import static org.mockito.ArgumentMatchers.any;
+import static org.mockito.Mockito.lenient;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.never;
+import static org.mockito.Mockito.times;
+import static org.mockito.Mockito.verify;
+import static org.mockito.Mockito.when;
+
+import java.util.function.Consumer;
+import java.util.function.Function;
+import lombok.SneakyThrows;
+import org.apache.commons.lang3.reflect.FieldUtils;
+import org.junit.jupiter.api.DisplayNameGeneration;
+import org.junit.jupiter.api.DisplayNameGenerator;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.mockito.Mock;
+import org.mockito.junit.jupiter.MockitoExtension;
+import org.opensearch.action.search.SearchRequest;
+import org.opensearch.action.search.SearchResponse;
+import org.opensearch.action.search.SearchScrollRequest;
+import org.opensearch.search.SearchHit;
+import org.opensearch.search.SearchHits;
+import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory;
+import org.opensearch.sql.opensearch.response.OpenSearchResponse;
+
+@ExtendWith(MockitoExtension.class)
+@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class)
+public class ContinuePageRequestTest {
+
+  @Mock
+  private Function<SearchRequest, SearchResponse> searchAction;
+
+  @Mock
+  private Function<SearchScrollRequest, SearchResponse> scrollAction;
+
+  @Mock
+  private Consumer<String> cleanAction;
+
+  @Mock
+  private SearchResponse searchResponse;
+
+  @Mock
+  private SearchHits searchHits;
+
+  @Mock
+  private SearchHit searchHit;
+
+  @Mock
+  private OpenSearchExprValueFactory factory;
+
+  private final String scroll = "scroll";
+  private final String nextScroll = "nextScroll";
+
+  private final ContinuePageRequest request = new ContinuePageRequest(scroll, factory);
+
+  @Test
+  public void search_with_non_empty_response() {
+    when(scrollAction.apply(any())).thenReturn(searchResponse);
+    when(searchResponse.getHits()).thenReturn(searchHits);
+    when(searchHits.getHits()).thenReturn(new SearchHit[] {searchHit});
+    when(searchResponse.getScrollId()).thenReturn(nextScroll);
+
+    OpenSearchResponse searchResponse = request.search(searchAction, scrollAction);
+    assertAll(
+        () -> assertFalse(searchResponse.isEmpty()),
+        () -> assertEquals(nextScroll, request.toCursor()),
+        () -> verify(scrollAction, times(1)).apply(any()),
+        () -> verify(searchAction, never()).apply(any())
+    );
+  }
+
+  @Test
+  // Empty response means scroll search is done and no cursor/scroll should be set
+  public void search_with_empty_response() {
+    when(scrollAction.apply(any())).thenReturn(searchResponse);
+    when(searchResponse.getHits()).thenReturn(searchHits);
+    when(searchHits.getHits()).thenReturn(null);
+    lenient().when(searchResponse.getScrollId()).thenReturn(nextScroll);
+
+    OpenSearchResponse searchResponse = request.search(searchAction, scrollAction);
+    assertAll(
+        () -> assertTrue(searchResponse.isEmpty()),
+        () -> assertNull(request.toCursor()),
+        () -> verify(scrollAction, times(1)).apply(any()),
+        () -> verify(searchAction, never()).apply(any())
+    );
+  }
+
+  @Test
+  @SneakyThrows
+  public void clean() {
+    request.clean(cleanAction);
+    verify(cleanAction, never()).accept(any());
+    // Enforce cleaning by setting a private field.
+    FieldUtils.writeField(request, "scrollFinished", true, true);
+    request.clean(cleanAction);
+    verify(cleanAction, times(1)).accept(any());
+  }
+
+  @Test
+  // Added for coverage only
+  public void getters() {
+    factory = mock();
+    assertAll(
+        () -> assertThrows(Throwable.class, request::getSourceBuilder),
+        () -> assertSame(factory, new ContinuePageRequest("", factory).getExprValueFactory())
+    );
+  }
+}
diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilderTest.java
new file mode 100644
index 00000000000..beebb6a0ac5
--- /dev/null
+++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilderTest.java
@@ -0,0 +1,109 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.sql.opensearch.request;
+
+import static org.junit.jupiter.api.Assertions.assertAll;
+import static org.junit.jupiter.api.Assertions.assertEquals;
+import static org.junit.jupiter.api.Assertions.assertThrows;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.verify;
+import static org.opensearch.sql.data.type.ExprCoreType.INTEGER;
+import static org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder.DEFAULT_QUERY_TIMEOUT;
+
+import java.util.Map;
+import java.util.Set;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.DisplayNameGeneration;
+import org.junit.jupiter.api.DisplayNameGenerator;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.mockito.Mock;
+import org.mockito.junit.jupiter.MockitoExtension;
+import org.opensearch.search.builder.SearchSourceBuilder;
+import org.opensearch.sql.data.type.ExprType;
+import org.opensearch.sql.expression.DSL;
+import org.opensearch.sql.expression.ReferenceExpression;
+import org.opensearch.sql.opensearch.data.type.OpenSearchDataType;
+import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory;
+
+@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class)
+@ExtendWith(MockitoExtension.class)
+public class InitialPageRequestBuilderTest {
+
+  @Mock
+  private OpenSearchExprValueFactory exprValueFactory;
+
+  private final int pageSize = 42;
+
+  private final OpenSearchRequest.IndexName indexName = new OpenSearchRequest.IndexName("test");
+
+  private InitialPageRequestBuilder requestBuilder;
+
+  @BeforeEach
+  void setup() {
+    requestBuilder = new InitialPageRequestBuilder(
+        indexName, pageSize, exprValueFactory);
+  }
+
+  @Test
+  public void build() {
+    assertEquals(
+        new OpenSearchScrollRequest(indexName,
+            new SearchSourceBuilder()
+                .from(0)
+                .size(pageSize)
+                .timeout(DEFAULT_QUERY_TIMEOUT),
+            exprValueFactory),
+        requestBuilder.build()
+    );
+  }
+
+  @Test
+  public void pushDown_not_supported() {
+    assertAll(
+        () -> assertThrows(UnsupportedOperationException.class,
+            () -> requestBuilder.pushDownFilter(mock())),
+        () -> assertThrows(UnsupportedOperationException.class,
+            () -> requestBuilder.pushDownAggregation(mock())),
+        () -> assertThrows(UnsupportedOperationException.class,
+            () -> requestBuilder.pushDownSort(mock())),
+        () -> assertThrows(UnsupportedOperationException.class,
+            () -> requestBuilder.pushDownLimit(1, 2)),
+        () -> assertThrows(UnsupportedOperationException.class,
+            () -> requestBuilder.pushDownHighlight("", Map.of()))
+    );
+  }
+
+  @Test
+  public void pushTypeMapping() {
+    Map<String, OpenSearchDataType> typeMapping = Map.of("intA", OpenSearchDataType.of(INTEGER));
+    requestBuilder.pushTypeMapping(typeMapping);
+
+    verify(exprValueFactory).extendTypeMapping(typeMapping);
+  }
+
+  @Test
+  public void pushDownProject() {
+    Set<ReferenceExpression> references = Set.of(DSL.ref("intA", INTEGER));
+    requestBuilder.pushDownProjects(references);
+
+    assertEquals(
+        new OpenSearchScrollRequest(indexName,
+            new SearchSourceBuilder()
+                .from(0)
+                .size(pageSize)
+                .timeout(DEFAULT_QUERY_TIMEOUT)
+                .fetchSource(new String[]{"intA"}, new String[0]),
+            exprValueFactory),
+        requestBuilder.build()
+    );
+  }
+
+  @Test
+  public void getIndexName() {
+    assertEquals(indexName, requestBuilder.getIndexName());
+  }
+}
diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchQueryRequestTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchQueryRequestTest.java
index 1ba26e33dc0..c6a9a06a70d 100644
--- a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchQueryRequestTest.java
+++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchQueryRequestTest.java
@@ -14,6 +14,7 @@
 import static org.mockito.Mockito.times;
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.when;
+import static org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder.DEFAULT_QUERY_TIMEOUT;
 
 import java.util.function.Consumer;
 import java.util.function.Function;
@@ -85,7 +86,7 @@ void searchRequest() {
         new SearchRequest()
             .indices("test")
             .source(new SearchSourceBuilder()
-                .timeout(OpenSearchQueryRequest.DEFAULT_QUERY_TIMEOUT)
+                .timeout(DEFAULT_QUERY_TIMEOUT)
                 .from(0)
                 .size(200)
                 .query(QueryBuilders.termQuery("name", "John"))),
diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilderTest.java
index 980d68ed80b..636142207eb 100644
--- a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilderTest.java
+++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilderTest.java
@@ -19,6 +19,8 @@
 import java.util.Set;
 import org.apache.commons.lang3.tuple.Pair;
 import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.DisplayNameGeneration;
+import org.junit.jupiter.api.DisplayNameGenerator;
 import
org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; @@ -44,6 +46,7 @@ import org.opensearch.sql.opensearch.response.agg.SingleValueParser; @ExtendWith(MockitoExtension.class) +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) public class OpenSearchRequestBuilderTest { private static final TimeValue DEFAULT_QUERY_TIMEOUT = TimeValue.timeValueMinutes(1L); @@ -68,7 +71,7 @@ void setup() { } @Test - void buildQueryRequest() { + void build_query_request() { Integer limit = 200; Integer offset = 0; requestBuilder.pushDownLimit(limit, offset); @@ -85,7 +88,7 @@ void buildQueryRequest() { } @Test - void buildScrollRequestWithCorrectSize() { + void build_scroll_request_with_correct_size() { Integer limit = 800; Integer offset = 10; requestBuilder.pushDownLimit(limit, offset); @@ -102,9 +105,9 @@ void buildScrollRequestWithCorrectSize() { } @Test - void testPushDownQuery() { + void test_push_down_query() { QueryBuilder query = QueryBuilders.termQuery("intA", 1); - requestBuilder.pushDown(query); + requestBuilder.pushDownFilter(query); assertEquals( new SearchSourceBuilder() @@ -118,7 +121,7 @@ void testPushDownQuery() { } @Test - void testPushDownAggregation() { + void test_push_down_aggregation() { AggregationBuilder aggBuilder = AggregationBuilders.composite( "composite_buckets", Collections.singletonList(new TermsValuesSourceBuilder("longA"))); @@ -139,9 +142,9 @@ void testPushDownAggregation() { } @Test - void testPushDownQueryAndSort() { + void test_push_down_query_and_sort() { QueryBuilder query = QueryBuilders.termQuery("intA", 1); - requestBuilder.pushDown(query); + requestBuilder.pushDownFilter(query); FieldSortBuilder sortBuilder = SortBuilders.fieldSort("intA"); requestBuilder.pushDownSort(List.of(sortBuilder)); @@ -157,7 +160,7 @@ void testPushDownQueryAndSort() { } @Test - void testPushDownSort() { + void test_push_down_sort() { FieldSortBuilder sortBuilder = SortBuilders.fieldSort("intA"); 
requestBuilder.pushDownSort(List.of(sortBuilder)); @@ -171,7 +174,7 @@ void testPushDownSort() { } @Test - void testPushDownNonFieldSort() { + void test_push_down_non_field_sort() { ScoreSortBuilder sortBuilder = SortBuilders.scoreSort(); requestBuilder.pushDownSort(List.of(sortBuilder)); @@ -185,7 +188,7 @@ void testPushDownNonFieldSort() { } @Test - void testPushDownMultipleSort() { + void test_push_down_multiple_sort() { requestBuilder.pushDownSort(List.of( SortBuilders.fieldSort("intA"), SortBuilders.fieldSort("intB"))); @@ -201,7 +204,7 @@ void testPushDownMultipleSort() { } @Test - void testPushDownProject() { + void test_push_down_project() { Set references = Set.of(DSL.ref("intA", INTEGER)); requestBuilder.pushDownProjects(references); @@ -215,7 +218,7 @@ void testPushDownProject() { } @Test - void testPushTypeMapping() { + void test_push_type_mapping() { Map typeMapping = Map.of("intA", OpenSearchDataType.of(INTEGER)); requestBuilder.pushTypeMapping(typeMapping); diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchRequestTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchRequestTest.java new file mode 100644 index 00000000000..d0a274ce2a2 --- /dev/null +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchRequestTest.java @@ -0,0 +1,23 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + + +package org.opensearch.sql.opensearch.request; + +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.mockito.Mockito.CALLS_REAL_METHODS; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.withSettings; + +import org.junit.jupiter.api.Test; + +public class OpenSearchRequestTest { + + @Test + void toCursor() { + var request = mock(OpenSearchRequest.class, withSettings().defaultAnswer(CALLS_REAL_METHODS)); + assertEquals("", request.toCursor()); + } +} diff --git 
a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequestTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequestTest.java index 0fc9c928106..6e45476306e 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequestTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequestTest.java @@ -8,19 +8,31 @@ import static org.junit.jupiter.api.Assertions.assertEquals; import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertNull; import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.junit.jupiter.api.Assertions.fail; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.when; +import java.util.concurrent.atomic.AtomicBoolean; +import org.apache.lucene.search.TotalHits; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; import org.opensearch.action.search.SearchRequest; +import org.opensearch.action.search.SearchResponse; import org.opensearch.action.search.SearchScrollRequest; import org.opensearch.index.query.QueryBuilders; +import org.opensearch.search.SearchHit; +import org.opensearch.search.SearchHits; import org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; @ExtendWith(MockitoExtension.class) +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) class OpenSearchScrollRequestTest { @Mock @@ -43,10 +55,13 @@ void searchRequest() { @Test void isScrollStarted() { - assertFalse(request.isScrollStarted()); + assertFalse(request.isScroll()); request.setScrollId("scroll123"); - assertTrue(request.isScrollStarted()); + 
assertTrue(request.isScroll()); + + request.reset(); + assertFalse(request.isScroll()); } @Test @@ -58,4 +73,60 @@ void scrollRequest() { .scrollId("scroll123"), request.scrollRequest()); } + + @Test + void toCursor() { + request.setScrollId("scroll123"); + assertEquals("scroll123", request.toCursor()); + + request.reset(); + assertNull(request.toCursor()); + } + + @Test + void clean_on_empty_response() { + // This could happen on sequential search calls + SearchResponse searchResponse = mock(); + when(searchResponse.getScrollId()).thenReturn("scroll1", "scroll2"); + when(searchResponse.getHits()).thenReturn( + new SearchHits(new SearchHit[1], new TotalHits(1, TotalHits.Relation.EQUAL_TO), 1F), + new SearchHits(new SearchHit[0], new TotalHits(0, TotalHits.Relation.EQUAL_TO), 1F)); + + request.search((x) -> searchResponse, (x) -> searchResponse); + assertEquals("scroll1", request.getScrollId()); + request.search((x) -> searchResponse, (x) -> searchResponse); + assertEquals("scroll1", request.getScrollId()); + + AtomicBoolean cleanCalled = new AtomicBoolean(false); + request.clean((s) -> cleanCalled.set(true)); + + assertNull(request.getScrollId()); + assertTrue(cleanCalled.get()); + } + + @Test + void no_clean_on_non_empty_response() { + SearchResponse searchResponse = mock(); + when(searchResponse.getScrollId()).thenReturn("scroll"); + when(searchResponse.getHits()).thenReturn( + new SearchHits(new SearchHit[1], new TotalHits(1, TotalHits.Relation.EQUAL_TO), 1F)); + + request.search((x) -> searchResponse, (x) -> searchResponse); + assertEquals("scroll", request.getScrollId()); + + request.clean((s) -> fail()); + assertNull(request.getScrollId()); + } + + @Test + void no_clean_if_no_scroll_in_response() { + SearchResponse searchResponse = mock(); + when(searchResponse.getHits()).thenReturn( + new SearchHits(new SearchHit[0], new TotalHits(0, TotalHits.Relation.EQUAL_TO), 1F)); + + request.search((x) -> searchResponse, (x) -> searchResponse); + 
assertNull(request.getScrollId()); + + request.clean((s) -> fail()); + } } diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilderTest.java new file mode 100644 index 00000000000..8112de197ad --- /dev/null +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilderTest.java @@ -0,0 +1,44 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + + +package org.opensearch.sql.opensearch.request; + +import static org.junit.jupiter.api.Assertions.assertAll; +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.mockito.Mockito.CALLS_REAL_METHODS; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.withSettings; + +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; +import org.junit.jupiter.api.Test; + +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) +public class PushDownRequestBuilderTest { + + @Test + public void throw_unsupported2() { + var builder = mock(PushDownRequestBuilder.class, + withSettings().defaultAnswer(CALLS_REAL_METHODS)); + + assertAll( + () -> assertThrows(UnsupportedOperationException.class, () -> + builder.pushDownFilter(null)), + () -> assertThrows(UnsupportedOperationException.class, () -> + builder.pushDownAggregation(null)), + () -> assertThrows(UnsupportedOperationException.class, () -> + builder.pushDownSort(null)), + () -> assertThrows(UnsupportedOperationException.class, () -> + builder.pushDownLimit(null, null)), + () -> assertThrows(UnsupportedOperationException.class, () -> + builder.pushDownHighlight(null, null)), + () -> assertThrows(UnsupportedOperationException.class, () -> + builder.pushDownProjects(null)), + () -> assertThrows(UnsupportedOperationException.class, () -> + builder.pushTypeMapping(null)) + ); 
+ } +} diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/response/OpenSearchResponseTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/response/OpenSearchResponseTest.java index 0a60503415d..2d1d6145f3f 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/response/OpenSearchResponseTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/response/OpenSearchResponseTest.java @@ -74,20 +74,29 @@ void isEmpty() { new TotalHits(2L, TotalHits.Relation.EQUAL_TO), 1.0F)); - assertFalse(new OpenSearchResponse(searchResponse, factory).isEmpty()); + var response = new OpenSearchResponse(searchResponse, factory); + assertFalse(response.isEmpty()); + assertEquals(2L, response.getTotalHits()); when(searchResponse.getHits()).thenReturn(SearchHits.empty()); when(searchResponse.getAggregations()).thenReturn(null); - assertTrue(new OpenSearchResponse(searchResponse, factory).isEmpty()); + + response = new OpenSearchResponse(searchResponse, factory); + assertTrue(response.isEmpty()); + assertEquals(0L, response.getTotalHits()); when(searchResponse.getHits()) .thenReturn(new SearchHits(null, new TotalHits(0, TotalHits.Relation.EQUAL_TO), 0)); - OpenSearchResponse response3 = new OpenSearchResponse(searchResponse, factory); - assertTrue(response3.isEmpty()); + response = new OpenSearchResponse(searchResponse, factory); + assertTrue(response.isEmpty()); + assertEquals(0L, response.getTotalHits()); when(searchResponse.getHits()).thenReturn(SearchHits.empty()); when(searchResponse.getAggregations()).thenReturn(new Aggregations(emptyList())); - assertFalse(new OpenSearchResponse(searchResponse, factory).isEmpty()); + + response = new OpenSearchResponse(searchResponse, factory); + assertFalse(response.isEmpty()); + assertEquals(0L, response.getTotalHits()); } @Test @@ -104,7 +113,8 @@ void iterator() { when(factory.construct(any())).thenReturn(exprTupleValue1).thenReturn(exprTupleValue2); int i = 0; - for (ExprValue hit : new 
OpenSearchResponse(searchResponse, factory)) { + var response = new OpenSearchResponse(searchResponse, factory); + for (ExprValue hit : response) { if (i == 0) { assertEquals(exprTupleValue1, hit); } else if (i == 1) { @@ -114,6 +124,7 @@ void iterator() { } i++; } + assertEquals(2L, response.getTotalHits()); } @Test diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexTest.java index 8d4dad48a99..6705c1ef022 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexTest.java @@ -56,6 +56,12 @@ import org.opensearch.sql.opensearch.data.type.OpenSearchTextType; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; import org.opensearch.sql.opensearch.mapping.IndexMapping; +import org.opensearch.sql.opensearch.request.InitialPageRequestBuilder; +import org.opensearch.sql.opensearch.request.OpenSearchRequest; +import org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder; +import org.opensearch.sql.opensearch.request.PagedRequestBuilder; +import org.opensearch.sql.opensearch.storage.scan.OpenSearchIndexScan; +import org.opensearch.sql.opensearch.storage.scan.OpenSearchPagedIndexScan; import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.logical.LogicalPlanDSL; import org.opensearch.sql.planner.physical.PhysicalPlanDSL; @@ -190,9 +196,21 @@ void implementRelationOperatorOnly() { LogicalPlan plan = index.createScanBuilder(); Integer maxResultWindow = index.getMaxResultWindow(); - assertEquals( - new OpenSearchIndexScan(client, settings, indexName, maxResultWindow, exprValueFactory), - index.implement(plan)); + OpenSearchRequestBuilder + builder = new OpenSearchRequestBuilder(indexName, maxResultWindow, + settings, exprValueFactory); + assertEquals(new 
OpenSearchIndexScan(client, builder), index.implement(plan)); + } + + @Test + void implementPagedRelationOperatorOnly() { + when(client.getIndexMaxResultWindows("test")).thenReturn(Map.of("test", 10000)); + + LogicalPlan plan = index.createPagedScanBuilder(42); + Integer maxResultWindow = index.getMaxResultWindow(); + PagedRequestBuilder builder = new InitialPageRequestBuilder( + new OpenSearchRequest.IndexName(indexName), maxResultWindow, exprValueFactory); + assertEquals(new OpenSearchPagedIndexScan(client, builder), index.implement(plan)); } @Test @@ -202,8 +220,11 @@ void implementRelationOperatorWithOptimization() { LogicalPlan plan = index.createScanBuilder(); Integer maxResultWindow = index.getMaxResultWindow(); + OpenSearchRequestBuilder + builder = new OpenSearchRequestBuilder(indexName, maxResultWindow, + settings, exprValueFactory); assertEquals( - new OpenSearchIndexScan(client, settings, indexName, maxResultWindow, exprValueFactory), + new OpenSearchIndexScan(client, builder), index.implement(index.optimize(plan))); } @@ -251,8 +272,10 @@ void implementOtherLogicalOperators() { PhysicalPlanDSL.eval( PhysicalPlanDSL.remove( PhysicalPlanDSL.rename( - new OpenSearchIndexScan(client, settings, indexName, - maxResultWindow, exprValueFactory), + new OpenSearchIndexScan(client, + new OpenSearchRequestBuilder( + indexName, maxResultWindow, + settings, exprValueFactory)), mappings), exclude), newEvalField), diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngineTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngineTest.java index ab87f4531cf..6a8727e0fbc 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngineTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngineTest.java @@ -6,11 +6,18 @@ package org.opensearch.sql.opensearch.storage; +import static 
org.junit.jupiter.api.Assertions.assertAll; +import static org.junit.jupiter.api.Assertions.assertFalse; import static org.junit.jupiter.api.Assertions.assertNotNull; import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.ArgumentMatchers.anyString; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.when; import static org.opensearch.sql.analysis.DataSourceSchemaIdentifierNameResolver.DEFAULT_DATASOURCE_NAME; import static org.opensearch.sql.utils.SystemIndexUtils.TABLE_INFO; +import java.util.Map; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; @@ -18,6 +25,8 @@ import org.opensearch.sql.DataSourceSchemaName; import org.opensearch.sql.common.setting.Settings; import org.opensearch.sql.opensearch.client.OpenSearchClient; +import org.opensearch.sql.opensearch.response.OpenSearchResponse; +import org.opensearch.sql.opensearch.storage.scan.OpenSearchPagedIndexScan; import org.opensearch.sql.opensearch.storage.system.OpenSearchSystemIndex; import org.opensearch.sql.storage.Table; @@ -35,7 +44,10 @@ public void getTable() { OpenSearchStorageEngine engine = new OpenSearchStorageEngine(client, settings); Table table = engine.getTable(new DataSourceSchemaName(DEFAULT_DATASOURCE_NAME, "default"), "test"); - assertNotNull(table); + assertAll( + () -> assertNotNull(table), + () -> assertTrue(table instanceof OpenSearchIndex) + ); } @Test @@ -43,7 +55,26 @@ public void getSystemTable() { OpenSearchStorageEngine engine = new OpenSearchStorageEngine(client, settings); Table table = engine.getTable(new DataSourceSchemaName(DEFAULT_DATASOURCE_NAME, "default"), TABLE_INFO); - assertNotNull(table); - assertTrue(table instanceof OpenSearchSystemIndex); + assertAll( + () -> assertNotNull(table), + () -> assertTrue(table instanceof OpenSearchSystemIndex) + ); + } + + @Test + public void getTableScan() { + 
when(client.getIndexMappings(anyString())).thenReturn(Map.of()); + OpenSearchResponse response = mock(); + when(response.isEmpty()).thenReturn(true); + when(client.search(any())).thenReturn(response); + OpenSearchStorageEngine engine = new OpenSearchStorageEngine(client, settings); + var scan = engine.getTableScan("test", "test"); + assertAll( + () -> assertTrue(scan instanceof OpenSearchPagedIndexScan), + () -> { + scan.open(); + assertFalse(scan.hasNext()); + } + ); } } diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanOptimizationTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanOptimizationTest.java index b90ca8836d5..5c125ebc65a 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanOptimizationTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanOptimizationTest.java @@ -66,7 +66,6 @@ import org.opensearch.sql.opensearch.response.agg.CompositeAggregationParser; import org.opensearch.sql.opensearch.response.agg.OpenSearchAggregationResponseParser; import org.opensearch.sql.opensearch.response.agg.SingleValueParser; -import org.opensearch.sql.opensearch.storage.OpenSearchIndexScan; import org.opensearch.sql.opensearch.storage.script.aggregation.AggregationQueryBuilder; import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.optimizer.LogicalPlanOptimizer; @@ -527,7 +526,7 @@ private void assertEqualsAfterOptimization(LogicalPlan expected, LogicalPlan act } private Runnable withFilterPushedDown(QueryBuilder filteringCondition) { - return () -> verify(requestBuilder, times(1)).pushDown(filteringCondition); + return () -> verify(requestBuilder, times(1)).pushDownFilter(filteringCondition); } private Runnable withAggregationPushedDown( diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexScanTest.java 
b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanTest.java similarity index 60% rename from opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexScanTest.java rename to opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanTest.java index 8aec6a7d136..8cc0d468843 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexScanTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanTest.java @@ -4,12 +4,14 @@ */ -package org.opensearch.sql.opensearch.storage; +package org.opensearch.sql.opensearch.storage.scan; +import static org.junit.jupiter.api.Assertions.assertAll; import static org.junit.jupiter.api.Assertions.assertEquals; import static org.junit.jupiter.api.Assertions.assertFalse; import static org.junit.jupiter.api.Assertions.assertTrue; import static org.mockito.ArgumentMatchers.any; +import static org.mockito.Mockito.lenient; import static org.mockito.Mockito.mock; import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; @@ -21,6 +23,8 @@ import java.util.HashMap; import java.util.Map; import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; @@ -43,9 +47,11 @@ import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; import org.opensearch.sql.opensearch.request.OpenSearchQueryRequest; import org.opensearch.sql.opensearch.request.OpenSearchRequest; +import org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder; import org.opensearch.sql.opensearch.response.OpenSearchResponse; @ExtendWith(MockitoExtension.class) +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) class OpenSearchIndexScanTest { @Mock @@ -64,119 +70,163 
@@ void setup() { } @Test - void queryEmptyResult() { - mockResponse(); - try (OpenSearchIndexScan indexScan = - new OpenSearchIndexScan(client, settings, "test", 3, exprValueFactory)) { + void query_empty_result() { + mockResponse(client); + try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, + new OpenSearchRequestBuilder("test", 3, settings, exprValueFactory))) { indexScan.open(); - assertFalse(indexScan.hasNext()); + assertAll( + () -> assertFalse(indexScan.hasNext()), + () -> assertEquals(0, indexScan.getTotalHits()) + ); } verify(client).cleanup(any()); } @Test - void queryAllResultsWithQuery() { - mockResponse(new ExprValue[]{ + void query_all_results_with_query() { + mockResponse(client, new ExprValue[]{ employee(1, "John", "IT"), employee(2, "Smith", "HR"), employee(3, "Allen", "IT")}); - try (OpenSearchIndexScan indexScan = - new OpenSearchIndexScan(client, settings, "employees", 10, exprValueFactory)) { + try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, + new OpenSearchRequestBuilder("employees", 10, settings, exprValueFactory))) { indexScan.open(); - assertTrue(indexScan.hasNext()); - assertEquals(employee(1, "John", "IT"), indexScan.next()); + assertAll( + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(1, "John", "IT"), indexScan.next()), - assertTrue(indexScan.hasNext()); - assertEquals(employee(2, "Smith", "HR"), indexScan.next()); + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(2, "Smith", "HR"), indexScan.next()), - assertTrue(indexScan.hasNext()); - assertEquals(employee(3, "Allen", "IT"), indexScan.next()); + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(3, "Allen", "IT"), indexScan.next()), - assertFalse(indexScan.hasNext()); + () -> assertFalse(indexScan.hasNext()), + () -> assertEquals(3, indexScan.getTotalHits()) + ); } verify(client).cleanup(any()); } @Test - void queryAllResultsWithScroll() { - mockResponse( + void 
query_all_results_with_scroll() { + mockResponse(client, new ExprValue[]{employee(1, "John", "IT"), employee(2, "Smith", "HR")}, new ExprValue[]{employee(3, "Allen", "IT")}); + //when(settings.getSettingValue(Settings.Key.QUERY_SIZE_LIMIT)).thenReturn(2); - try (OpenSearchIndexScan indexScan = - new OpenSearchIndexScan(client, settings, "employees", 2, exprValueFactory)) { + try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, + new OpenSearchRequestBuilder("employees", 10, settings, exprValueFactory))) { indexScan.open(); - assertTrue(indexScan.hasNext()); - assertEquals(employee(1, "John", "IT"), indexScan.next()); + assertAll( + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(1, "John", "IT"), indexScan.next()), - assertTrue(indexScan.hasNext()); - assertEquals(employee(2, "Smith", "HR"), indexScan.next()); + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(2, "Smith", "HR"), indexScan.next()), - assertTrue(indexScan.hasNext()); - assertEquals(employee(3, "Allen", "IT"), indexScan.next()); + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(3, "Allen", "IT"), indexScan.next()), - assertFalse(indexScan.hasNext()); + () -> assertFalse(indexScan.hasNext()), + () -> assertEquals(3, indexScan.getTotalHits()) + ); } verify(client).cleanup(any()); } @Test - void querySomeResultsWithQuery() { - mockResponse(new ExprValue[]{ + void query_some_results_with_query() { + mockResponse(client, new ExprValue[]{ employee(1, "John", "IT"), employee(2, "Smith", "HR"), employee(3, "Allen", "IT"), employee(4, "Bob", "HR")}); - try (OpenSearchIndexScan indexScan = - new OpenSearchIndexScan(client, settings, "employees", 10, exprValueFactory)) { + try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, + new OpenSearchRequestBuilder("employees", 10, settings, exprValueFactory))) { indexScan.getRequestBuilder().pushDownLimit(3, 0); indexScan.open(); - assertTrue(indexScan.hasNext()); - 
assertEquals(employee(1, "John", "IT"), indexScan.next()); + assertAll( + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(1, "John", "IT"), indexScan.next()), - assertTrue(indexScan.hasNext()); - assertEquals(employee(2, "Smith", "HR"), indexScan.next()); + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(2, "Smith", "HR"), indexScan.next()), - assertTrue(indexScan.hasNext()); - assertEquals(employee(3, "Allen", "IT"), indexScan.next()); + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(3, "Allen", "IT"), indexScan.next()), - assertFalse(indexScan.hasNext()); + () -> assertFalse(indexScan.hasNext()), + () -> assertEquals(3, indexScan.getTotalHits()) + ); } verify(client).cleanup(any()); } @Test - void querySomeResultsWithScroll() { - mockResponse( + void query_some_results_with_scroll() { + mockResponse(client, new ExprValue[]{employee(1, "John", "IT"), employee(2, "Smith", "HR")}, new ExprValue[]{employee(3, "Allen", "IT"), employee(4, "Bob", "HR")}); try (OpenSearchIndexScan indexScan = - new OpenSearchIndexScan(client, settings, "employees", 2, exprValueFactory)) { + new OpenSearchIndexScan(client, new OpenSearchRequestBuilder("employees", 2, settings, + exprValueFactory))) { indexScan.getRequestBuilder().pushDownLimit(3, 0); indexScan.open(); - assertTrue(indexScan.hasNext()); - assertEquals(employee(1, "John", "IT"), indexScan.next()); + assertAll( + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(1, "John", "IT"), indexScan.next()), + + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(2, "Smith", "HR"), indexScan.next()), + + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(3, "Allen", "IT"), indexScan.next()), + + () -> assertFalse(indexScan.hasNext()), + () -> assertEquals(3, indexScan.getTotalHits()) + ); + } + verify(client).cleanup(any()); + } + + @Test + void query_results_limited_by_query_size() { + mockResponse(client, new 
ExprValue[]{ + employee(1, "John", "IT"), + employee(2, "Smith", "HR"), + employee(3, "Allen", "IT"), + employee(4, "Bob", "HR")}); + when(settings.getSettingValue(Settings.Key.QUERY_SIZE_LIMIT)).thenReturn(2); + + try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, + new OpenSearchRequestBuilder("employees", 10, settings, exprValueFactory))) { + indexScan.open(); - assertTrue(indexScan.hasNext()); - assertEquals(employee(2, "Smith", "HR"), indexScan.next()); + assertAll( + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(1, "John", "IT"), indexScan.next()), - assertTrue(indexScan.hasNext()); - assertEquals(employee(3, "Allen", "IT"), indexScan.next()); + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(2, "Smith", "HR"), indexScan.next()), - assertFalse(indexScan.hasNext()); + () -> assertFalse(indexScan.hasNext()), + () -> assertEquals(2, indexScan.getTotalHits()) + ); } verify(client).cleanup(any()); } @Test - void pushDownFilters() { + void push_down_filters() { assertThat() .pushDown(QueryBuilders.termQuery("name", "John")) .shouldQuery(QueryBuilders.termQuery("name", "John")) @@ -194,7 +244,7 @@ void pushDownFilters() { } @Test - void pushDownHighlight() { + void push_down_highlight() { Map args = new HashMap<>(); assertThat() .pushDown(QueryBuilders.termQuery("name", "John")) @@ -205,7 +255,7 @@ void pushDownHighlight() { } @Test - void pushDownHighlightWithArguments() { + void push_down_highlight_with_arguments() { Map args = new HashMap<>(); args.put("pre_tags", new Literal("", DataType.STRING)); args.put("post_tags", new Literal("", DataType.STRING)); @@ -220,13 +270,14 @@ void pushDownHighlightWithArguments() { } @Test - void pushDownHighlightWithRepeatingFields() { - mockResponse( + void push_down_highlight_with_repeating_fields() { + mockResponse(client, new ExprValue[]{employee(1, "John", "IT"), employee(2, "Smith", "HR")}, new ExprValue[]{employee(3, "Allen", "IT"), employee(4, "Bob", "HR")}); 
try (OpenSearchIndexScan indexScan = - new OpenSearchIndexScan(client, settings, "test", 2, exprValueFactory)) { + new OpenSearchIndexScan(client, new OpenSearchRequestBuilder("test", 2, settings, + exprValueFactory))) { indexScan.getRequestBuilder().pushDownLimit(3, 0); indexScan.open(); Map args = new HashMap<>(); @@ -252,14 +303,16 @@ public PushDownAssertion(OpenSearchClient client, OpenSearchExprValueFactory valueFactory, Settings settings) { this.client = client; - this.indexScan = new OpenSearchIndexScan(client, settings, "test", 10000, valueFactory); + this.indexScan = new OpenSearchIndexScan(client, + new OpenSearchRequestBuilder("test", 10000, + settings, valueFactory)); this.response = mock(OpenSearchResponse.class); this.factory = valueFactory; when(response.isEmpty()).thenReturn(true); } PushDownAssertion pushDown(QueryBuilder query) { - indexScan.getRequestBuilder().pushDown(query); + indexScan.getRequestBuilder().pushDownFilter(query); return this; } @@ -290,7 +343,7 @@ PushDownAssertion shouldQuery(QueryBuilder expected) { } } - private void mockResponse(ExprValue[]... searchHitBatches) { + public static void mockResponse(OpenSearchClient client, ExprValue[]... 
searchHitBatches) { when(client.search(any())) .thenAnswer( new Answer() { @@ -304,6 +357,9 @@ public OpenSearchResponse answer(InvocationOnMock invocation) { when(response.isEmpty()).thenReturn(false); ExprValue[] searchHit = searchHitBatches[batchNum]; when(response.iterator()).thenReturn(Arrays.asList(searchHit).iterator()); + // used in OpenSearchPagedIndexScanTest + lenient().when(response.getTotalHits()) + .thenReturn((long) searchHitBatches[batchNum].length); } else { when(response.isEmpty()).thenReturn(true); } @@ -314,14 +370,14 @@ public OpenSearchResponse answer(InvocationOnMock invocation) { }); } - protected ExprValue employee(int docId, String name, String department) { + public static ExprValue employee(int docId, String name, String department) { SearchHit hit = new SearchHit(docId); hit.sourceRef( new BytesArray("{\"name\":\"" + name + "\",\"department\":\"" + department + "\"}")); return tupleValue(hit); } - private ExprValue tupleValue(SearchHit hit) { + private static ExprValue tupleValue(SearchHit hit) { return ExprValueUtils.tupleValue(hit.getSourceAsMap()); } } diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScanTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScanTest.java new file mode 100644 index 00000000000..65c0ddffc2a --- /dev/null +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScanTest.java @@ -0,0 +1,164 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.opensearch.storage.scan; + +import static org.junit.jupiter.api.Assertions.assertAll; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.junit.jupiter.api.Assertions.assertTrue; +import static 
org.mockito.ArgumentMatchers.any; +import static org.mockito.Mockito.CALLS_REAL_METHODS; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.times; +import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.when; +import static org.mockito.Mockito.withSettings; +import static org.opensearch.sql.data.type.ExprCoreType.STRING; +import static org.opensearch.sql.opensearch.storage.scan.OpenSearchIndexScanTest.employee; +import static org.opensearch.sql.opensearch.storage.scan.OpenSearchIndexScanTest.mockResponse; + +import com.google.common.collect.ImmutableMap; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.sql.data.model.ExprValue; +import org.opensearch.sql.opensearch.client.OpenSearchClient; +import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; +import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; +import org.opensearch.sql.opensearch.request.ContinuePageRequestBuilder; +import org.opensearch.sql.opensearch.request.InitialPageRequestBuilder; +import org.opensearch.sql.opensearch.request.OpenSearchRequest; +import org.opensearch.sql.opensearch.request.PagedRequestBuilder; +import org.opensearch.sql.opensearch.response.OpenSearchResponse; + +@ExtendWith(MockitoExtension.class) +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) +public class OpenSearchPagedIndexScanTest { + @Mock + private OpenSearchClient client; + + private final OpenSearchExprValueFactory exprValueFactory = new OpenSearchExprValueFactory( + ImmutableMap.of( + "name", OpenSearchDataType.of(STRING), + "department", OpenSearchDataType.of(STRING))); + + @Test + void query_empty_result() { + mockResponse(client); + InitialPageRequestBuilder builder = new 
InitialPageRequestBuilder( + new OpenSearchRequest.IndexName("test"), 3, exprValueFactory); + try (OpenSearchPagedIndexScan indexScan = new OpenSearchPagedIndexScan(client, builder)) { + indexScan.open(); + assertFalse(indexScan.hasNext()); + } + verify(client).cleanup(any()); + } + + @Test + void query_all_results_initial_scroll_request() { + mockResponse(client, new ExprValue[]{ + employee(1, "John", "IT"), + employee(2, "Smith", "HR"), + employee(3, "Allen", "IT")}); + + PagedRequestBuilder builder = new InitialPageRequestBuilder( + new OpenSearchRequest.IndexName("test"), 3, exprValueFactory); + try (OpenSearchPagedIndexScan indexScan = new OpenSearchPagedIndexScan(client, builder)) { + indexScan.open(); + + assertAll( + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(1, "John", "IT"), indexScan.next()), + + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(2, "Smith", "HR"), indexScan.next()), + + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(3, "Allen", "IT"), indexScan.next()), + + () -> assertFalse(indexScan.hasNext()), + () -> assertEquals(3, indexScan.getTotalHits()) + ); + } + verify(client).cleanup(any()); + + builder = new ContinuePageRequestBuilder( + new OpenSearchRequest.IndexName("test"), "scroll", exprValueFactory); + try (OpenSearchPagedIndexScan indexScan = new OpenSearchPagedIndexScan(client, builder)) { + indexScan.open(); + + assertFalse(indexScan.hasNext()); + } + verify(client, times(2)).cleanup(any()); + } + + @Test + void query_all_results_continuation_scroll_request() { + mockResponse(client, new ExprValue[]{ + employee(1, "John", "IT"), + employee(2, "Smith", "HR"), + employee(3, "Allen", "IT")}); + + ContinuePageRequestBuilder builder = new ContinuePageRequestBuilder( + new OpenSearchRequest.IndexName("test"), "scroll", exprValueFactory); + try (OpenSearchPagedIndexScan indexScan = new OpenSearchPagedIndexScan(client, builder)) { + indexScan.open(); + + assertAll( + () -> 
assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(1, "John", "IT"), indexScan.next()), + + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(2, "Smith", "HR"), indexScan.next()), + + () -> assertTrue(indexScan.hasNext()), + () -> assertEquals(employee(3, "Allen", "IT"), indexScan.next()), + + () -> assertFalse(indexScan.hasNext()), + () -> assertEquals(3, indexScan.getTotalHits()) + ); + } + verify(client).cleanup(any()); + + builder = new ContinuePageRequestBuilder( + new OpenSearchRequest.IndexName("test"), "scroll", exprValueFactory); + try (OpenSearchPagedIndexScan indexScan = new OpenSearchPagedIndexScan(client, builder)) { + indexScan.open(); + + assertFalse(indexScan.hasNext()); + } + verify(client, times(2)).cleanup(any()); + } + + @Test + void explain_not_implemented() { + assertThrows(Throwable.class, () -> mock(OpenSearchPagedIndexScan.class, + withSettings().defaultAnswer(CALLS_REAL_METHODS)).explain()); + } + + @Test + void toCursor() { + PagedRequestBuilder builder = mock(); + OpenSearchRequest request = mock(); + OpenSearchResponse response = mock(); + when(builder.build()).thenReturn(request); + when(builder.getIndexName()).thenReturn(new OpenSearchRequest.IndexName("index")); + when(client.search(request)).thenReturn(response); + when(response.isEmpty()).thenReturn(true); + when(request.toCursor()).thenReturn("cu-cursor", "", null); + OpenSearchPagedIndexScan indexScan = new OpenSearchPagedIndexScan(client, builder); + indexScan.open(); + assertAll( + () -> assertEquals("(OpenSearchPagedIndexScan,index,cu-cursor)", indexScan.toCursor()), + () -> assertEquals("", indexScan.toCursor()), + () -> assertEquals("", indexScan.toCursor()) + ); + } +} diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/ExpressionScriptEngineTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/ExpressionScriptEngineTest.java index 3d497c2f5b7..a88d81c0201 100644 --- 
a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/ExpressionScriptEngineTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/ExpressionScriptEngineTest.java @@ -27,8 +27,8 @@ import org.opensearch.script.ScriptEngine; import org.opensearch.sql.expression.DSL; import org.opensearch.sql.expression.Expression; +import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.opensearch.storage.script.filter.ExpressionFilterScriptFactory; -import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) @ExtendWith(MockitoExtension.class) diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/AggregationQueryBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/AggregationQueryBuilderTest.java index e771e01bce6..474aba14206 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/AggregationQueryBuilderTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/AggregationQueryBuilderTest.java @@ -51,9 +51,9 @@ import org.opensearch.sql.expression.aggregation.AvgAggregator; import org.opensearch.sql.expression.aggregation.CountAggregator; import org.opensearch.sql.expression.aggregation.NamedAggregator; +import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; import org.opensearch.sql.opensearch.data.type.OpenSearchTextType; -import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) @ExtendWith(MockitoExtension.class) diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/BucketAggregationBuilderTest.java 
b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/BucketAggregationBuilderTest.java index f93c69de280..eaeacd09ef0 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/BucketAggregationBuilderTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/BucketAggregationBuilderTest.java @@ -46,9 +46,9 @@ import org.opensearch.sql.expression.DSL; import org.opensearch.sql.expression.NamedExpression; import org.opensearch.sql.expression.parse.ParseExpression; +import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; import org.opensearch.sql.opensearch.data.type.OpenSearchTextType; -import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) @ExtendWith(MockitoExtension.class) diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/MetricAggregationBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/MetricAggregationBuilderTest.java index 94f152f9132..d8e81026b68 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/MetricAggregationBuilderTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/MetricAggregationBuilderTest.java @@ -43,7 +43,7 @@ import org.opensearch.sql.expression.aggregation.SumAggregator; import org.opensearch.sql.expression.aggregation.TakeAggregator; import org.opensearch.sql.expression.function.FunctionName; -import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; +import org.opensearch.sql.expression.serialization.ExpressionSerializer; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) @ExtendWith(MockitoExtension.class) 
diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/filter/FilterQueryBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/filter/FilterQueryBuilderTest.java index 96245909a48..3b7865aa463 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/filter/FilterQueryBuilderTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/filter/FilterQueryBuilderTest.java @@ -53,9 +53,9 @@ import org.opensearch.sql.expression.FunctionExpression; import org.opensearch.sql.expression.LiteralExpression; import org.opensearch.sql.expression.ReferenceExpression; +import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; import org.opensearch.sql.opensearch.data.type.OpenSearchTextType; -import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) @ExtendWith(MockitoExtension.class) diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/system/OpenSearchSystemIndexScanTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/system/OpenSearchSystemIndexScanTest.java index 494f3ff2d0e..c04ef25611e 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/system/OpenSearchSystemIndexScanTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/system/OpenSearchSystemIndexScanTest.java @@ -32,6 +32,7 @@ public void queryData() { systemIndexScan.open(); assertTrue(systemIndexScan.hasNext()); assertEquals(stringValue("text"), systemIndexScan.next()); + assertEquals(1, systemIndexScan.getTotalHits()); } @Test diff --git a/plugin/build.gradle b/plugin/build.gradle index 1c5b4366f07..cb9ab64d7be 100644 --- a/plugin/build.gradle +++ b/plugin/build.gradle @@ -248,6 +248,7 @@ afterEvaluate { testClusters.integTest { 
plugin(project.tasks.bundlePlugin.archiveFile) + testDistribution = "ARCHIVE" // debug with command, ./gradlew opensearch-sql:run -DdebugJVM. --debug-jvm does not work with keystore. if (System.getProperty("debugJVM") != null) { diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/SQLPlugin.java b/plugin/src/main/java/org/opensearch/sql/plugin/SQLPlugin.java index 3d733233be5..1439ed0e25f 100644 --- a/plugin/src/main/java/org/opensearch/sql/plugin/SQLPlugin.java +++ b/plugin/src/main/java/org/opensearch/sql/plugin/SQLPlugin.java @@ -51,6 +51,7 @@ import org.opensearch.sql.datasource.DataSourceService; import org.opensearch.sql.datasource.DataSourceServiceImpl; import org.opensearch.sql.datasource.DataSourceUserAuthorizationHelper; +import org.opensearch.sql.expression.serialization.DefaultExpressionSerializer; import org.opensearch.sql.legacy.esdomain.LocalClusterState; import org.opensearch.sql.legacy.executor.AsyncRestExecutor; import org.opensearch.sql.legacy.metrics.Metrics; @@ -61,7 +62,6 @@ import org.opensearch.sql.opensearch.setting.OpenSearchSettings; import org.opensearch.sql.opensearch.storage.OpenSearchDataSourceFactory; import org.opensearch.sql.opensearch.storage.script.ExpressionScriptEngine; -import org.opensearch.sql.opensearch.storage.serialization.DefaultExpressionSerializer; import org.opensearch.sql.plugin.config.OpenSearchPluginModule; import org.opensearch.sql.plugin.datasource.DataSourceSettings; import org.opensearch.sql.plugin.datasource.DataSourceUserAuthorizationHelperImpl; diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/config/OpenSearchPluginModule.java b/plugin/src/main/java/org/opensearch/sql/plugin/config/OpenSearchPluginModule.java index 5ab4bbaecd0..b0c698a0cf2 100644 --- a/plugin/src/main/java/org/opensearch/sql/plugin/config/OpenSearchPluginModule.java +++ b/plugin/src/main/java/org/opensearch/sql/plugin/config/OpenSearchPluginModule.java @@ -18,6 +18,7 @@ import org.opensearch.sql.executor.QueryManager; 
import org.opensearch.sql.executor.QueryService; import org.opensearch.sql.executor.execution.QueryPlanFactory; +import org.opensearch.sql.executor.pagination.PaginatedPlanCache; import org.opensearch.sql.expression.function.BuiltinFunctionRepository; import org.opensearch.sql.monitor.ResourceMonitor; import org.opensearch.sql.opensearch.client.OpenSearchClient; @@ -58,8 +59,9 @@ public StorageEngine storageEngine(OpenSearchClient client, Settings settings) { } @Provides - public ExecutionEngine executionEngine(OpenSearchClient client, ExecutionProtector protector) { - return new OpenSearchExecutionEngine(client, protector); + public ExecutionEngine executionEngine(OpenSearchClient client, ExecutionProtector protector, + PaginatedPlanCache paginatedPlanCache) { + return new OpenSearchExecutionEngine(client, protector, paginatedPlanCache); } @Provides @@ -72,6 +74,11 @@ public ExecutionProtector protector(ResourceMonitor resourceMonitor) { return new OpenSearchExecutionProtector(resourceMonitor); } + @Provides + public PaginatedPlanCache paginatedPlanCache(StorageEngine storageEngine) { + return new PaginatedPlanCache(storageEngine); + } + @Provides @Singleton public QueryManager queryManager(NodeClient nodeClient) { @@ -92,12 +99,16 @@ public SQLService sqlService(QueryManager queryManager, QueryPlanFactory queryPl * {@link QueryPlanFactory}. 
*/ @Provides - public QueryPlanFactory queryPlanFactory( - DataSourceService dataSourceService, ExecutionEngine executionEngine) { + public QueryPlanFactory queryPlanFactory(DataSourceService dataSourceService, + ExecutionEngine executionEngine, + PaginatedPlanCache paginatedPlanCache) { Analyzer analyzer = new Analyzer( new ExpressionAnalyzer(functionRepository), dataSourceService, functionRepository); Planner planner = new Planner(LogicalPlanOptimizer.create()); - return new QueryPlanFactory(new QueryService(analyzer, executionEngine, planner)); + Planner paginationPlanner = new Planner(LogicalPlanOptimizer.paginationCreate()); + QueryService queryService = new QueryService( + analyzer, executionEngine, planner, paginationPlanner); + return new QueryPlanFactory(queryService, paginatedPlanCache); } } diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/transport/TransportPPLQueryAction.java b/plugin/src/main/java/org/opensearch/sql/plugin/transport/TransportPPLQueryAction.java index 6825b2ac923..a67e077ecc5 100644 --- a/plugin/src/main/java/org/opensearch/sql/plugin/transport/TransportPPLQueryAction.java +++ b/plugin/src/main/java/org/opensearch/sql/plugin/transport/TransportPPLQueryAction.java @@ -139,7 +139,8 @@ private ResponseListener createListener( @Override public void onResponse(ExecutionEngine.QueryResponse response) { String responseContent = - formatter.format(new QueryResult(response.getSchema(), response.getResults())); + formatter.format(new QueryResult(response.getSchema(), response.getResults(), + response.getCursor(), response.getTotal())); listener.onResponse(new TransportPPLQueryResponse(responseContent)); } diff --git a/ppl/src/main/java/org/opensearch/sql/ppl/PPLService.java b/ppl/src/main/java/org/opensearch/sql/ppl/PPLService.java index e11edc16465..f91ac7222f5 100644 --- a/ppl/src/main/java/org/opensearch/sql/ppl/PPLService.java +++ b/ppl/src/main/java/org/opensearch/sql/ppl/PPLService.java @@ -90,6 +90,7 @@ private AbstractPlan 
plan( QueryContext.getRequestId(), anonymizer.anonymizeStatement(statement)); - return queryExecutionFactory.create(statement, queryListener, explainListener); + return queryExecutionFactory.createContinuePaginatedPlan( + statement, queryListener, explainListener); } } diff --git a/ppl/src/main/java/org/opensearch/sql/ppl/parser/AstStatementBuilder.java b/ppl/src/main/java/org/opensearch/sql/ppl/parser/AstStatementBuilder.java index e4f40e9a115..3b7e5a78dde 100644 --- a/ppl/src/main/java/org/opensearch/sql/ppl/parser/AstStatementBuilder.java +++ b/ppl/src/main/java/org/opensearch/sql/ppl/parser/AstStatementBuilder.java @@ -33,7 +33,7 @@ public class AstStatementBuilder extends OpenSearchPPLParserBaseVisitor { ResponseListener listener = invocation.getArgument(1); - listener.onResponse(new QueryResponse(schema, Collections.emptyList())); + listener.onResponse(new QueryResponse(schema, Collections.emptyList(), 0, Cursor.None)); return null; }).when(queryService).execute(any(), any()); @@ -87,7 +93,7 @@ public void onFailure(Exception e) { public void testExecuteCsvFormatShouldPass() { doAnswer(invocation -> { ResponseListener listener = invocation.getArgument(1); - listener.onResponse(new QueryResponse(schema, Collections.emptyList())); + listener.onResponse(new QueryResponse(schema, Collections.emptyList(), 0, Cursor.None)); return null; }).when(queryService).execute(any(), any()); @@ -161,7 +167,7 @@ public void onFailure(Exception e) { public void testPrometheusQuery() { doAnswer(invocation -> { ResponseListener listener = invocation.getArgument(1); - listener.onResponse(new QueryResponse(schema, Collections.emptyList())); + listener.onResponse(new QueryResponse(schema, Collections.emptyList(), 0, Cursor.None)); return null; }).when(queryService).execute(any(), any()); diff --git a/ppl/src/test/java/org/opensearch/sql/ppl/parser/AstStatementBuilderTest.java b/ppl/src/test/java/org/opensearch/sql/ppl/parser/AstStatementBuilderTest.java index 
47600246920..de74e4932f9 100644 --- a/ppl/src/test/java/org/opensearch/sql/ppl/parser/AstStatementBuilderTest.java +++ b/ppl/src/test/java/org/opensearch/sql/ppl/parser/AstStatementBuilderTest.java @@ -39,7 +39,8 @@ public void buildQueryStatement() { "search source=t a=1", new Query( project( - filter(relation("t"), compare("=", field("a"), intLiteral(1))), AllFields.of()))); + filter(relation("t"), compare("=", field("a"), + intLiteral(1))), AllFields.of()), 0)); } @Test @@ -50,7 +51,7 @@ public void buildExplainStatement() { new Query( project( filter(relation("t"), compare("=", field("a"), intLiteral(1))), - AllFields.of())))); + AllFields.of()), 0))); } private void assertEqual(String query, Statement expectedStatement) { diff --git a/protocol/src/main/java/org/opensearch/sql/protocol/response/QueryResult.java b/protocol/src/main/java/org/opensearch/sql/protocol/response/QueryResult.java index 915a61f3611..d06dba7719f 100644 --- a/protocol/src/main/java/org/opensearch/sql/protocol/response/QueryResult.java +++ b/protocol/src/main/java/org/opensearch/sql/protocol/response/QueryResult.java @@ -16,6 +16,7 @@ import org.opensearch.sql.data.model.ExprValueUtils; import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.executor.ExecutionEngine.Schema.Column; +import org.opensearch.sql.executor.pagination.Cursor; /** * Query response that encapsulates query results and isolate {@link ExprValue} @@ -32,6 +33,16 @@ public class QueryResult implements Iterable { */ private final Collection exprValues; + @Getter + private final Cursor cursor; + + @Getter + private final long total; + + + public QueryResult(ExecutionEngine.Schema schema, Collection exprValues) { + this(schema, exprValues, Cursor.None, exprValues.size()); + } /** * size of results. 
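The `QueryResult.java` hunk above adds a `cursor` and a `total` field and keeps a two-argument convenience constructor that delegates to the full one with `Cursor.None` and `total = exprValues.size()`, so pre-pagination callers keep compiling. A minimal standalone sketch of that delegation pattern (the class, `Cursor` stand-in, and field names here are simplified stand-ins, not the real `org.opensearch.sql` types):

```java
import java.util.Collection;
import java.util.List;

public class QueryResultSketch {
    // Stand-in for org.opensearch.sql.executor.pagination.Cursor.
    static final class Cursor {
        static final Cursor None = new Cursor();
    }

    final Collection<Object> exprValues;
    final Cursor cursor;
    final long total;

    // Full constructor: pagination-aware callers pass an explicit cursor
    // and the total hit count reported by the backend.
    QueryResultSketch(Collection<Object> exprValues, Cursor cursor, long total) {
        this.exprValues = exprValues;
        this.cursor = cursor;
        this.total = total;
    }

    // Convenience constructor for non-paginated callers: no cursor,
    // and total falls back to the in-memory row count.
    QueryResultSketch(Collection<Object> exprValues) {
        this(exprValues, Cursor.None, exprValues.size());
    }

    long getTotal() {
        return total;
    }

    public static void main(String[] args) {
        QueryResultSketch r = new QueryResultSketch(List.of("row1", "row2"));
        System.out.println(r.getTotal());
        System.out.println(r.cursor == Cursor.None);
    }
}
```

This mirrors why only the `total(...)` call in `JdbcResponseFormatter` had to change: existing construction sites keep their old behavior, while paginated plans supply the real `TotalHits` value.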
diff --git a/protocol/src/main/java/org/opensearch/sql/protocol/response/format/JdbcResponseFormatter.java b/protocol/src/main/java/org/opensearch/sql/protocol/response/format/JdbcResponseFormatter.java index 943287cb62b..b9a2d2fcc64 100644 --- a/protocol/src/main/java/org/opensearch/sql/protocol/response/format/JdbcResponseFormatter.java +++ b/protocol/src/main/java/org/opensearch/sql/protocol/response/format/JdbcResponseFormatter.java @@ -15,6 +15,7 @@ import org.opensearch.sql.data.type.ExprType; import org.opensearch.sql.exception.QueryEngineException; import org.opensearch.sql.executor.ExecutionEngine.Schema; +import org.opensearch.sql.executor.pagination.Cursor; import org.opensearch.sql.opensearch.response.error.ErrorMessage; import org.opensearch.sql.opensearch.response.error.ErrorMessageFactory; import org.opensearch.sql.protocol.response.QueryResult; @@ -39,9 +40,12 @@ protected Object buildJsonObject(QueryResult response) { json.datarows(fetchDataRows(response)); // Populate other fields - json.total(response.size()) + json.total(response.getTotal()) .size(response.size()) .status(200); + if (!response.getCursor().equals(Cursor.None)) { + json.cursor(response.getCursor().toString()); + } return json.build(); } @@ -95,6 +99,8 @@ public static class JdbcResponse { private final long total; private final long size; private final int status; + + private final String cursor; } @RequiredArgsConstructor diff --git a/protocol/src/test/java/org/opensearch/sql/protocol/response/QueryResultTest.java b/protocol/src/test/java/org/opensearch/sql/protocol/response/QueryResultTest.java index 319965e2d0e..470bb205a80 100644 --- a/protocol/src/test/java/org/opensearch/sql/protocol/response/QueryResultTest.java +++ b/protocol/src/test/java/org/opensearch/sql/protocol/response/QueryResultTest.java @@ -19,6 +19,7 @@ import java.util.Collections; import org.junit.jupiter.api.Test; import org.opensearch.sql.executor.ExecutionEngine; +import 
org.opensearch.sql.executor.pagination.Cursor; class QueryResultTest { @@ -35,7 +36,7 @@ void size() { tupleValue(ImmutableMap.of("name", "John", "age", 20)), tupleValue(ImmutableMap.of("name", "Allen", "age", 30)), tupleValue(ImmutableMap.of("name", "Smith", "age", 40)) - )); + ), Cursor.None, 0); assertEquals(3, response.size()); } @@ -45,7 +46,7 @@ void columnNameTypes() { schema, Collections.singletonList( tupleValue(ImmutableMap.of("name", "John", "age", 20)) - )); + ), Cursor.None, 0); assertEquals( ImmutableMap.of("name", "string", "age", "integer"), @@ -59,7 +60,8 @@ void columnNameTypesWithAlias() { new ExecutionEngine.Schema.Column("name", "n", STRING))); QueryResult response = new QueryResult( schema, - Collections.singletonList(tupleValue(ImmutableMap.of("n", "John")))); + Collections.singletonList(tupleValue(ImmutableMap.of("n", "John"))), + Cursor.None, 0); assertEquals( ImmutableMap.of("n", "string"), @@ -71,7 +73,7 @@ void columnNameTypesWithAlias() { void columnNameTypesFromEmptyExprValues() { QueryResult response = new QueryResult( schema, - Collections.emptyList()); + Collections.emptyList(), Cursor.None, 0); assertEquals( ImmutableMap.of("name", "string", "age", "integer"), response.columnNameTypes() @@ -100,7 +102,7 @@ void iterate() { Arrays.asList( tupleValue(ImmutableMap.of("name", "John", "age", 20)), tupleValue(ImmutableMap.of("name", "Allen", "age", 30)) - )); + ), Cursor.None, 0); int i = 0; for (Object[] objects : response) { diff --git a/protocol/src/test/java/org/opensearch/sql/protocol/response/format/JdbcResponseFormatterTest.java b/protocol/src/test/java/org/opensearch/sql/protocol/response/format/JdbcResponseFormatterTest.java index a6671c66f8c..b5cb5984a17 100644 --- a/protocol/src/test/java/org/opensearch/sql/protocol/response/format/JdbcResponseFormatterTest.java +++ b/protocol/src/test/java/org/opensearch/sql/protocol/response/format/JdbcResponseFormatterTest.java @@ -31,6 +31,7 @@ import 
org.opensearch.sql.common.antlr.SyntaxCheckException; import org.opensearch.sql.data.model.ExprTupleValue; import org.opensearch.sql.exception.SemanticCheckException; +import org.opensearch.sql.executor.pagination.Cursor; import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; import org.opensearch.sql.opensearch.data.type.OpenSearchTextType; import org.opensearch.sql.protocol.response.QueryResult; @@ -83,6 +84,37 @@ void format_response() { formatter.format(response)); } + @Test + void format_response_with_cursor() { + QueryResult response = new QueryResult( + new Schema(ImmutableList.of( + new Column("name", "name", STRING), + new Column("address", "address", OpenSearchTextType.of()), + new Column("age", "age", INTEGER))), + ImmutableList.of( + tupleValue(ImmutableMap.builder() + .put("name", "John") + .put("address", "Seattle") + .put("age", 20) + .build())), + new Cursor("test_cursor".getBytes()), 42); + + assertJsonEquals( + "{" + + "\"schema\":[" + + "{\"name\":\"name\",\"alias\":\"name\",\"type\":\"keyword\"}," + + "{\"name\":\"address\",\"alias\":\"address\",\"type\":\"text\"}," + + "{\"name\":\"age\",\"alias\":\"age\",\"type\":\"integer\"}" + + "]," + + "\"datarows\":[" + + "[\"John\",\"Seattle\",20]]," + + "\"total\":42," + + "\"size\":1," + + "\"cursor\":\"test_cursor\"," + + "\"status\":200}", + formatter.format(response)); + } + @Test void format_response_with_missing_and_null_value() { QueryResult response = diff --git a/sql/src/main/java/org/opensearch/sql/sql/SQLService.java b/sql/src/main/java/org/opensearch/sql/sql/SQLService.java index 082a3e95816..4ecf9e699be 100644 --- a/sql/src/main/java/org/opensearch/sql/sql/SQLService.java +++ b/sql/src/main/java/org/opensearch/sql/sql/SQLService.java @@ -65,16 +65,24 @@ private AbstractPlan plan( SQLQueryRequest request, Optional> queryListener, Optional> explainListener) { - // 1.Parse query and convert parse tree (CST) to abstract syntax tree (AST) - ParseTree cst = 
parser.parse(request.getQuery()); - Statement statement = - cst.accept( - new AstStatementBuilder( - new AstBuilder(request.getQuery()), - AstStatementBuilder.StatementBuilderContext.builder() - .isExplain(request.isExplainRequest()) - .build())); + if (request.getCursor().isPresent()) { + // Handle v2 cursor here -- legacy cursor was handled earlier. + return queryExecutionFactory.createContinuePaginatedPlan(request.getCursor().get(), + request.isExplainRequest(), queryListener.orElse(null), explainListener.orElse(null)); + } else { + // 1.Parse query and convert parse tree (CST) to abstract syntax tree (AST) + ParseTree cst = parser.parse(request.getQuery()); + Statement statement = + cst.accept( + new AstStatementBuilder( + new AstBuilder(request.getQuery()), + AstStatementBuilder.StatementBuilderContext.builder() + .isExplain(request.isExplainRequest()) + .fetchSize(request.getFetchSize()) + .build())); - return queryExecutionFactory.create(statement, queryListener, explainListener); + return queryExecutionFactory.createContinuePaginatedPlan( + statement, queryListener, explainListener); + } } } diff --git a/sql/src/main/java/org/opensearch/sql/sql/domain/SQLQueryRequest.java b/sql/src/main/java/org/opensearch/sql/sql/domain/SQLQueryRequest.java index 508f80cee41..7545f4cc19f 100644 --- a/sql/src/main/java/org/opensearch/sql/sql/domain/SQLQueryRequest.java +++ b/sql/src/main/java/org/opensearch/sql/sql/domain/SQLQueryRequest.java @@ -6,13 +6,12 @@ package org.opensearch.sql.sql.domain; -import com.google.common.base.Strings; -import com.google.common.collect.ImmutableSet; import java.util.Collections; import java.util.Locale; import java.util.Map; import java.util.Optional; import java.util.Set; +import java.util.stream.Stream; import lombok.EqualsAndHashCode; import lombok.Getter; import lombok.RequiredArgsConstructor; @@ -28,9 +27,9 @@ @EqualsAndHashCode @RequiredArgsConstructor public class SQLQueryRequest { - - private static final Set SUPPORTED_FIELDS = 
ImmutableSet.of( - "query", "fetch_size", "parameters"); + private static final String QUERY_FIELD_CURSOR = "cursor"; + private static final Set SUPPORTED_FIELDS = Set.of( + "query", "fetch_size", "parameters", QUERY_FIELD_CURSOR); private static final String QUERY_PARAMS_FORMAT = "format"; private static final String QUERY_PARAMS_SANITIZE = "sanitize"; @@ -64,36 +63,50 @@ public class SQLQueryRequest { @Accessors(fluent = true) private boolean sanitize = true; + private String cursor; + /** * Constructor of SQLQueryRequest that passes request params. */ - public SQLQueryRequest( - JSONObject jsonContent, String query, String path, Map params) { + public SQLQueryRequest(JSONObject jsonContent, String query, String path, + Map params, String cursor) { this.jsonContent = jsonContent; this.query = query; this.path = path; this.params = params; this.format = getFormat(params); this.sanitize = shouldSanitize(params); + this.cursor = cursor; } /** * Pre-check if the request can be supported by meeting ALL the following criteria: * 1.Only supported fields present in request body, ex. "filter" and "cursor" are not supported - * 2.No fetch_size or "fetch_size=0". In other word, it's not a cursor request - * 3.Response format is default or can be supported. + * 2.Response format is default or can be supported. * - * @return true if supported. + * @return true if supported. 
*/ public boolean isSupported() { - return isOnlySupportedFieldInPayload() - && isFetchSizeZeroIfPresent() - && isSupportedFormat(); + var noCursor = !isCursor(); + var noQuery = query == null; + var noUnsupportedParams = params.isEmpty() + || (params.size() == 1 && params.containsKey(QUERY_PARAMS_FORMAT)); + var noContent = jsonContent == null || jsonContent.isEmpty(); + + return ((!noCursor && noQuery + && noUnsupportedParams && noContent) // if a cursor is given, nothing else may be set + || (noCursor && !noQuery)) // or if no cursor is given, a query is required + && isOnlySupportedFieldInPayload() // and request has supported fields only + && isSupportedFormat(); // and request is in supported format + } + + private boolean isCursor() { + return cursor != null && !cursor.isEmpty(); } /** * Check if request is to explain rather than execute the query. - * @return true if it is a explain request + * @return true if it is an explain request */ public boolean isExplainRequest() { return path.endsWith("/_explain"); } @@ -113,23 +126,23 @@ public Format format() { } private boolean isOnlySupportedFieldInPayload() { - return SUPPORTED_FIELDS.containsAll(jsonContent.keySet()); + return jsonContent == null || SUPPORTED_FIELDS.containsAll(jsonContent.keySet()); } - private boolean isFetchSizeZeroIfPresent() { - return (jsonContent.optInt("fetch_size") == 0); + public Optional getCursor() { + return Optional.ofNullable(cursor); + } + + public int getFetchSize() { + return jsonContent.optInt("fetch_size"); } private boolean isSupportedFormat() { - return Strings.isNullOrEmpty(format) || "jdbc".equalsIgnoreCase(format) - || "csv".equalsIgnoreCase(format) || "raw".equalsIgnoreCase(format); + return Stream.of("csv", "jdbc", "raw").anyMatch(format::equalsIgnoreCase); } private String getFormat(Map params) { - if (params.containsKey(QUERY_PARAMS_FORMAT)) { - return params.get(QUERY_PARAMS_FORMAT); - } - return "jdbc"; + return params.getOrDefault(QUERY_PARAMS_FORMAT, "jdbc"); } private boolean 
shouldSanitize(Map params) { diff --git a/sql/src/main/java/org/opensearch/sql/sql/parser/AstStatementBuilder.java b/sql/src/main/java/org/opensearch/sql/sql/parser/AstStatementBuilder.java index 40d549764a6..593e7b51ff4 100644 --- a/sql/src/main/java/org/opensearch/sql/sql/parser/AstStatementBuilder.java +++ b/sql/src/main/java/org/opensearch/sql/sql/parser/AstStatementBuilder.java @@ -26,7 +26,7 @@ public class AstStatementBuilder extends OpenSearchSQLParserBaseVisitor { - ResponseListener listener = invocation.getArgument(1); - listener.onResponse(new QueryResponse(schema, Collections.emptyList())); - return null; - }).when(queryService).execute(any(), any()); - + public void can_execute_sql_query() { sqlService.execute( new SQLQueryRequest(new JSONObject(), "SELECT 123", QUERY, "jdbc"), - new ResponseListener() { + new ResponseListener<>() { @Override public void onResponse(QueryResponse response) { assertNotNull(response); @@ -84,13 +82,24 @@ public void onFailure(Exception e) { } @Test - public void canExecuteCsvFormatRequest() { - doAnswer(invocation -> { - ResponseListener listener = invocation.getArgument(1); - listener.onResponse(new QueryResponse(schema, Collections.emptyList())); - return null; - }).when(queryService).execute(any(), any()); + public void can_execute_cursor_query() { + sqlService.execute( + new SQLQueryRequest(new JSONObject(), null, QUERY, Map.of("format", "jdbc"), "n:cursor"), + new ResponseListener<>() { + @Override + public void onResponse(QueryResponse response) { + assertNotNull(response); + } + + @Override + public void onFailure(Exception e) { + fail(e); + } + }); + } + @Test + public void can_execute_csv_format_request() { sqlService.execute( new SQLQueryRequest(new JSONObject(), "SELECT 123", QUERY, "csv"), new ResponseListener() { @@ -107,7 +116,7 @@ public void onFailure(Exception e) { } @Test - public void canExplainSqlQuery() { + public void can_explain_sql_query() { doAnswer(invocation -> { ResponseListener listener = 
invocation.getArgument(1); listener.onResponse(new ExplainResponse(new ExplainResponseNode("Test"))); @@ -129,7 +138,25 @@ public void onFailure(Exception e) { } @Test - public void canCaptureErrorDuringExecution() { + public void cannot_explain_cursor_query() { + sqlService.explain(new SQLQueryRequest(new JSONObject(), null, EXPLAIN, + Map.of("format", "jdbc"), "n:cursor"), + new ResponseListener() { + @Override + public void onResponse(ExplainResponse response) { + fail(response.toString()); + } + + @Override + public void onFailure(Exception e) { + assertTrue(e.getMessage() + .contains("`explain` request for cursor requests is not supported.")); + } + }); + } + + @Test + public void can_capture_error_during_execution() { sqlService.execute( new SQLQueryRequest(new JSONObject(), "SELECT", QUERY, ""), new ResponseListener() { @@ -146,7 +173,7 @@ public void onFailure(Exception e) { } @Test - public void canCaptureErrorDuringExplain() { + public void can_capture_error_during_explain() { sqlService.explain( new SQLQueryRequest(new JSONObject(), "SELECT", EXPLAIN, ""), new ResponseListener() { @@ -161,5 +188,4 @@ public void onFailure(Exception e) { } }); } - } diff --git a/sql/src/test/java/org/opensearch/sql/sql/domain/SQLQueryRequestTest.java b/sql/src/test/java/org/opensearch/sql/sql/domain/SQLQueryRequestTest.java index 52a1f534e9e..62bb665537c 100644 --- a/sql/src/test/java/org/opensearch/sql/sql/domain/SQLQueryRequestTest.java +++ b/sql/src/test/java/org/opensearch/sql/sql/domain/SQLQueryRequestTest.java @@ -6,36 +6,43 @@ package org.opensearch.sql.sql.domain; +import static org.junit.jupiter.api.Assertions.assertAll; import static org.junit.jupiter.api.Assertions.assertEquals; import static org.junit.jupiter.api.Assertions.assertFalse; import static org.junit.jupiter.api.Assertions.assertThrows; import static org.junit.jupiter.api.Assertions.assertTrue; import com.google.common.collect.ImmutableMap; +import java.util.HashMap; import java.util.Map; import 
org.json.JSONObject; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; import org.junit.jupiter.api.Test; import org.opensearch.sql.protocol.response.format.Format; +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) public class SQLQueryRequestTest { @Test - public void shouldSupportQuery() { + public void should_support_query() { SQLQueryRequest request = SQLQueryRequestBuilder.request("SELECT 1").build(); assertTrue(request.isSupported()); } @Test - public void shouldSupportQueryWithJDBCFormat() { + public void should_support_query_with_JDBC_format() { SQLQueryRequest request = SQLQueryRequestBuilder.request("SELECT 1") .format("jdbc") .build(); - assertTrue(request.isSupported()); - assertEquals(request.format(), Format.JDBC); + assertAll( + () -> assertTrue(request.isSupported()), + () -> assertEquals(request.format(), Format.JDBC) + ); } @Test - public void shouldSupportQueryWithQueryFieldOnly() { + public void should_support_query_with_query_field_only() { SQLQueryRequest request = SQLQueryRequestBuilder.request("SELECT 1") .jsonContent("{\"query\": \"SELECT 1\"}") @@ -44,16 +51,32 @@ public void shouldSupportQueryWithQueryFieldOnly() { } @Test - public void shouldSupportQueryWithParameters() { - SQLQueryRequest request = + public void should_support_query_with_parameters() { + SQLQueryRequest requestWithContent = SQLQueryRequestBuilder.request("SELECT 1") .jsonContent("{\"query\": \"SELECT 1\", \"parameters\":[]}") .build(); - assertTrue(request.isSupported()); + SQLQueryRequest requestWithParams = + SQLQueryRequestBuilder.request("SELECT 1") + .params(Map.of("one", "two")) + .build(); + assertAll( + () -> assertTrue(requestWithContent.isSupported()), + () -> assertTrue(requestWithParams.isSupported()) + ); + } + + @Test + public void should_support_query_without_parameters() { + SQLQueryRequest requestWithNoParams = + SQLQueryRequestBuilder.request("SELECT 1") + .params(Map.of()) + 
.build(); + assertTrue(requestWithNoParams.isSupported()); } @Test - public void shouldSupportQueryWithZeroFetchSize() { + public void should_support_query_with_zero_fetch_size() { SQLQueryRequest request = SQLQueryRequestBuilder.request("SELECT 1") .jsonContent("{\"query\": \"SELECT 1\", \"fetch_size\": 0}") @@ -62,7 +85,7 @@ public void shouldSupportQueryWithZeroFetchSize() { } @Test - public void shouldSupportQueryWithParametersAndZeroFetchSize() { + public void should_support_query_with_parameters_and_zero_fetch_size() { SQLQueryRequest request = SQLQueryRequestBuilder.request("SELECT 1") .jsonContent("{\"query\": \"SELECT 1\", \"fetch_size\": 0, \"parameters\":[]}") @@ -71,70 +94,155 @@ public void shouldSupportQueryWithParametersAndZeroFetchSize() { } @Test - public void shouldSupportExplain() { + public void should_support_explain() { SQLQueryRequest explainRequest = SQLQueryRequestBuilder.request("SELECT 1") .path("_plugins/_sql/_explain") .build(); - assertTrue(explainRequest.isExplainRequest()); - assertTrue(explainRequest.isSupported()); + + assertAll( + () -> assertTrue(explainRequest.isExplainRequest()), + () -> assertTrue(explainRequest.isSupported()) + ); } @Test - public void shouldNotSupportCursorRequest() { + public void should_support_cursor_request() { SQLQueryRequest fetchSizeRequest = SQLQueryRequestBuilder.request("SELECT 1") .jsonContent("{\"query\": \"SELECT 1\", \"fetch_size\": 5}") .build(); - assertFalse(fetchSizeRequest.isSupported()); SQLQueryRequest cursorRequest = + SQLQueryRequestBuilder.request(null) + .cursor("abcdefgh...") + .build(); + + assertAll( + () -> assertTrue(fetchSizeRequest.isSupported()), + () -> assertTrue(cursorRequest.isSupported()) + ); + } + + @Test + public void should_not_support_request_with_empty_cursor() { + SQLQueryRequest requestWithEmptyCursor = + SQLQueryRequestBuilder.request(null) + .cursor("") + .build(); + SQLQueryRequest requestWithNullCursor = + SQLQueryRequestBuilder.request(null) + .cursor(null) 
+ .build(); + assertAll( + () -> assertFalse(requestWithEmptyCursor.isSupported()), + () -> assertFalse(requestWithNullCursor.isSupported()) + ); + } + + @Test + public void should_not_support_request_with_unknown_field() { + SQLQueryRequest request = + SQLQueryRequestBuilder.request("SELECT 1") + .jsonContent("{\"pewpew\": 42}") + .build(); + assertFalse(request.isSupported()); + } + + @Test + public void should_not_support_request_with_cursor_and_something_else() { + SQLQueryRequest requestWithQuery = SQLQueryRequestBuilder.request("SELECT 1") - .jsonContent("{\"cursor\": \"abcdefgh...\"}") + .cursor("n:12356") + .build(); + SQLQueryRequest requestWithParams = + SQLQueryRequestBuilder.request(null) + .cursor("n:12356") + .params(Map.of("one", "two")) + .build(); + SQLQueryRequest requestWithParamsWithFormat = + SQLQueryRequestBuilder.request(null) + .cursor("n:12356") + .params(Map.of("format", "jdbc")) .build(); - assertFalse(cursorRequest.isSupported()); + SQLQueryRequest requestWithParamsWithFormatAnd = + SQLQueryRequestBuilder.request(null) + .cursor("n:12356") + .params(Map.of("format", "jdbc", "something", "else")) + .build(); + SQLQueryRequest requestWithFetchSize = + SQLQueryRequestBuilder.request(null) + .cursor("n:12356") + .jsonContent("{\"fetch_size\": 5}") + .build(); + SQLQueryRequest requestWithNoParams = + SQLQueryRequestBuilder.request(null) + .cursor("n:12356") + .params(Map.of()) + .build(); + SQLQueryRequest requestWithNoContent = + SQLQueryRequestBuilder.request(null) + .cursor("n:12356") + .jsonContent("{}") + .build(); + assertAll( + () -> assertFalse(requestWithQuery.isSupported()), + () -> assertFalse(requestWithParams.isSupported()), + () -> assertFalse(requestWithFetchSize.isSupported()), + () -> assertTrue(requestWithNoParams.isSupported()), + () -> assertTrue(requestWithParamsWithFormat.isSupported()), + () -> assertFalse(requestWithParamsWithFormatAnd.isSupported()), + () -> assertTrue(requestWithNoContent.isSupported()) + ); } @Test 
- public void shouldUseJDBCFormatByDefault() { + public void should_use_JDBC_format_by_default() { SQLQueryRequest request = SQLQueryRequestBuilder.request("SELECT 1").params(ImmutableMap.of()).build(); assertEquals(request.format(), Format.JDBC); } @Test - public void shouldSupportCSVFormatAndSanitize() { + public void should_support_CSV_format_and_sanitize() { SQLQueryRequest csvRequest = SQLQueryRequestBuilder.request("SELECT 1") .format("csv") .build(); - assertTrue(csvRequest.isSupported()); - assertEquals(csvRequest.format(), Format.CSV); - assertTrue(csvRequest.sanitize()); + assertAll( + () -> assertTrue(csvRequest.isSupported()), + () -> assertEquals(csvRequest.format(), Format.CSV), + () -> assertTrue(csvRequest.sanitize()) + ); } @Test - public void shouldSkipSanitizeIfSetFalse() { + public void should_skip_sanitize_if_set_false() { ImmutableMap.Builder builder = ImmutableMap.builder(); Map params = builder.put("format", "csv").put("sanitize", "false").build(); SQLQueryRequest csvRequest = SQLQueryRequestBuilder.request("SELECT 1").params(params).build(); - assertEquals(csvRequest.format(), Format.CSV); - assertFalse(csvRequest.sanitize()); + assertAll( + () -> assertEquals(csvRequest.format(), Format.CSV), + () -> assertFalse(csvRequest.sanitize()) + ); } @Test - public void shouldNotSupportOtherFormat() { + public void should_not_support_other_format() { SQLQueryRequest csvRequest = SQLQueryRequestBuilder.request("SELECT 1") .format("other") .build(); - assertFalse(csvRequest.isSupported()); - assertThrows(IllegalArgumentException.class, csvRequest::format, - "response in other format is not supported."); + + assertAll( + () -> assertFalse(csvRequest.isSupported()), + () -> assertEquals("response in other format is not supported.", + assertThrows(IllegalArgumentException.class, csvRequest::format).getMessage()) + ); } @Test - public void shouldSupportRawFormat() { + public void should_support_raw_format() { SQLQueryRequest csvRequest = 
SQLQueryRequestBuilder.request("SELECT 1") .format("raw") @@ -150,7 +258,8 @@ private static class SQLQueryRequestBuilder { private String query; private String path = "_plugins/_sql"; private String format; - private Map params; + private String cursor; + private Map params = new HashMap<>(); static SQLQueryRequestBuilder request(String query) { SQLQueryRequestBuilder builder = new SQLQueryRequestBuilder(); @@ -178,14 +287,17 @@ SQLQueryRequestBuilder params(Map params) { return this; } + SQLQueryRequestBuilder cursor(String cursor) { + this.cursor = cursor; + return this; + } + SQLQueryRequest build() { - if (jsonContent == null) { - jsonContent = "{\"query\": \"" + query + "\"}"; - } - if (params != null) { - return new SQLQueryRequest(new JSONObject(jsonContent), query, path, params); + if (format != null) { + params.put("format", format); } - return new SQLQueryRequest(new JSONObject(jsonContent), query, path, format); + return new SQLQueryRequest(jsonContent == null ? null : new JSONObject(jsonContent), + query, path, params, cursor); } } From 4948ac4437560ebc2dfa8d31b642d93ed16b31b4 Mon Sep 17 00:00:00 2001 From: Yury-Fridlyand Date: Fri, 31 Mar 2023 18:27:12 -0700 Subject: [PATCH 02/17] Make scroll timeout configurable. 
Signed-off-by: Yury-Fridlyand --- .../org/opensearch/sql/sql/PaginationIT.java | 31 +++++++++++++++++++ .../request/ContinuePageRequest.java | 18 +++++------ .../request/ContinuePageRequestBuilder.java | 18 +++++++++-- .../request/InitialPageRequestBuilder.java | 8 +++-- .../request/OpenSearchRequestBuilder.java | 17 +++++++--- .../request/OpenSearchScrollRequest.java | 21 ++++--------- .../opensearch/storage/OpenSearchIndex.java | 2 +- .../storage/OpenSearchStorageEngine.java | 2 +- .../client/OpenSearchNodeClientTest.java | 19 +++++++++--- .../client/OpenSearchRestClientTest.java | 27 ++++++++++++---- .../OpenSearchExecutionEngineTest.java | 5 +++ .../OpenSearchExecutionProtectorTest.java | 8 +++-- .../ContinuePageRequestBuilderTest.java | 13 ++++++-- .../request/ContinuePageRequestTest.java | 6 ++-- .../InitialPageRequestBuilderTest.java | 14 +++++++-- .../request/OpenSearchRequestBuilderTest.java | 4 ++- .../request/OpenSearchScrollRequestTest.java | 10 +++--- .../storage/OpenSearchIndexTest.java | 13 +++++++- .../storage/scan/OpenSearchIndexScanTest.java | 3 ++ .../scan/OpenSearchPagedIndexScanTest.java | 10 +++--- 20 files changed, 181 insertions(+), 68 deletions(-) diff --git a/integ-test/src/test/java/org/opensearch/sql/sql/PaginationIT.java b/integ-test/src/test/java/org/opensearch/sql/sql/PaginationIT.java index b9e32cb1cdd..a1d353cde8f 100644 --- a/integ-test/src/test/java/org/opensearch/sql/sql/PaginationIT.java +++ b/integ-test/src/test/java/org/opensearch/sql/sql/PaginationIT.java @@ -10,7 +10,10 @@ import java.io.IOException; import org.json.JSONObject; +import org.junit.Ignore; import org.junit.Test; +import org.opensearch.client.ResponseException; +import org.opensearch.sql.common.setting.Settings; import org.opensearch.sql.legacy.SQLIntegTestCase; import org.opensearch.sql.util.TestUtils; @@ -45,4 +48,32 @@ public void testLargeDataSetV2() throws IOException { assertEquals(4, response.getInt("size")); TestUtils.verifyIsV2Cursor(response); } + + 
@Ignore("Scroll may not expire after timeout") + // The scroll keep-alive parameter guarantees that the scroll context is kept for at least that time, + // but doesn't define how soon it expires after the timeout. + // With keep-alive = 1s, the scroll may be kept for 30 seconds or more; exact expiration can't be tested. + // The test is disabled to prevent it from waiting for a minute and delaying all CI. + public void testCursorTimeout() throws IOException, InterruptedException { + updateClusterSettings( + new ClusterSetting(PERSISTENT, Settings.Key.SQL_CURSOR_KEEP_ALIVE.getKeyValue(), "1s")); + + var query = "SELECT * from " + TEST_INDEX_CALCS; + var response = new JSONObject(executeFetchQuery(query, 4, "jdbc")); + assertTrue(response.has("cursor")); + var cursor = response.getString("cursor"); + Thread.sleep(2222L); // > 1s + + ResponseException exception = + expectThrows(ResponseException.class, () -> executeCursorQuery(cursor)); + response = new JSONObject(TestUtils.getResponseBody(exception.getResponse())); + assertEquals("Error occurred in OpenSearch engine: all shards failed", + response.getJSONObject("error").getString("reason")); + assertTrue(response.getJSONObject("error").getString("details") + .contains("SearchContextMissingException[No search context found for id")); + assertEquals("SearchPhaseExecutionException", + response.getJSONObject("error").getString("type")); + + wipeAllClusterSettings(); + } } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequest.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequest.java index 6c81b9aca24..1ad62076823 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequest.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequest.java @@ -5,16 +5,16 @@ package org.opensearch.sql.opensearch.request; -import static org.opensearch.sql.opensearch.request.OpenSearchScrollRequest.DEFAULT_SCROLL_TIMEOUT; - import 
java.util.function.Consumer; import java.util.function.Function; import lombok.EqualsAndHashCode; import lombok.Getter; +import lombok.RequiredArgsConstructor; import lombok.ToString; import org.opensearch.action.search.SearchRequest; import org.opensearch.action.search.SearchResponse; import org.opensearch.action.search.SearchScrollRequest; +import org.opensearch.common.unit.TimeValue; import org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; import org.opensearch.sql.opensearch.response.OpenSearchResponse; @@ -26,11 +26,12 @@ * First (initial) request is handled by {@link InitialPageRequestBuilder}. */ @EqualsAndHashCode +@RequiredArgsConstructor public class ContinuePageRequest implements OpenSearchRequest { - final String initialScrollId; - + private final String initialScrollId; + private final TimeValue scrollTimeout; // ScrollId that OpenSearch returns after search. - String responseScrollId; + private String responseScrollId; @EqualsAndHashCode.Exclude @ToString.Exclude @@ -40,16 +41,11 @@ public class ContinuePageRequest implements OpenSearchRequest { @EqualsAndHashCode.Exclude private boolean scrollFinished = false; - public ContinuePageRequest(String scrollId, OpenSearchExprValueFactory exprValueFactory) { - this.initialScrollId = scrollId; - this.exprValueFactory = exprValueFactory; - } - @Override public OpenSearchResponse search(Function searchAction, Function scrollAction) { SearchResponse openSearchResponse = scrollAction.apply(new SearchScrollRequest(initialScrollId) - .scroll(DEFAULT_SCROLL_TIMEOUT)); + .scroll(scrollTimeout)); // TODO if terminated_early - something went wrong, e.g. no scroll returned. 
var response = new OpenSearchResponse(openSearchResponse, exprValueFactory); diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java index 78288c12423..a0c19c1d0ab 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java @@ -6,23 +6,35 @@ package org.opensearch.sql.opensearch.request; import lombok.Getter; -import lombok.RequiredArgsConstructor; +import org.opensearch.common.unit.TimeValue; +import org.opensearch.sql.common.setting.Settings; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; /** * Builds a {@link ContinuePageRequest} to handle subsequent pagination/scroll/cursor requests. * Initial search requests is handled by {@link InitialPageRequestBuilder}. */ -@RequiredArgsConstructor public class ContinuePageRequestBuilder extends PagedRequestBuilder { @Getter private final OpenSearchRequest.IndexName indexName; private final String scrollId; + private final TimeValue scrollTimeout; private final OpenSearchExprValueFactory exprValueFactory; + /** Constructor. 
*/ + public ContinuePageRequestBuilder(OpenSearchRequest.IndexName indexName, + String scrollId, + Settings settings, + OpenSearchExprValueFactory exprValueFactory) { + this.indexName = indexName; + this.scrollId = scrollId; + this.scrollTimeout = settings.getSettingValue(Settings.Key.SQL_CURSOR_KEEP_ALIVE); + this.exprValueFactory = exprValueFactory; + } + @Override public OpenSearchRequest build() { - return new ContinuePageRequest(scrollId, exprValueFactory); + return new ContinuePageRequest(scrollId, scrollTimeout, exprValueFactory); } } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java index dee009ee974..8023a86006c 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java @@ -10,8 +10,9 @@ import java.util.Map; import java.util.Set; import lombok.Getter; +import org.opensearch.common.unit.TimeValue; import org.opensearch.search.builder.SearchSourceBuilder; -import org.opensearch.sql.data.type.ExprType; +import org.opensearch.sql.common.setting.Settings; import org.opensearch.sql.expression.ReferenceExpression; import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; @@ -27,6 +28,7 @@ public class InitialPageRequestBuilder extends PagedRequestBuilder { private final OpenSearchRequest.IndexName indexName; private final SearchSourceBuilder sourceBuilder; private final OpenSearchExprValueFactory exprValueFactory; + private final TimeValue scrollTimeout; /** * Constructor. @@ -37,9 +39,11 @@ public class InitialPageRequestBuilder extends PagedRequestBuilder { // TODO accept indexName as string (same way as `OpenSearchRequestBuilder` does)? 
public InitialPageRequestBuilder(OpenSearchRequest.IndexName indexName, int pageSize, + Settings settings, OpenSearchExprValueFactory exprValueFactory) { this.indexName = indexName; this.exprValueFactory = exprValueFactory; + this.scrollTimeout = settings.getSettingValue(Settings.Key.SQL_CURSOR_KEEP_ALIVE); this.sourceBuilder = new SearchSourceBuilder() .from(0) .size(pageSize) @@ -48,7 +52,7 @@ public InitialPageRequestBuilder(OpenSearchRequest.IndexName indexName, @Override public OpenSearchScrollRequest build() { - return new OpenSearchScrollRequest(indexName, sourceBuilder, exprValueFactory); + return new OpenSearchScrollRequest(indexName, scrollTimeout, sourceBuilder, exprValueFactory); } /** diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilder.java index 531710d5458..6d5a8cf0054 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilder.java @@ -76,6 +76,11 @@ public class OpenSearchRequestBuilder implements PushDownRequestBuilder { */ private int querySize; + /** + * Scroll context life time. 
+ */ + private final TimeValue scrollTimeout; + public OpenSearchRequestBuilder(String indexName, Integer maxResultWindow, Settings settings, @@ -93,12 +98,13 @@ public OpenSearchRequestBuilder(OpenSearchRequest.IndexName indexName, OpenSearchExprValueFactory exprValueFactory) { this.indexName = indexName; this.maxResultWindow = maxResultWindow; - this.sourceBuilder = new SearchSourceBuilder(); this.exprValueFactory = exprValueFactory; + this.scrollTimeout = settings.getSettingValue(Settings.Key.SQL_CURSOR_KEEP_ALIVE); this.querySize = settings.getSettingValue(Settings.Key.QUERY_SIZE_LIMIT); - sourceBuilder.from(0); - sourceBuilder.size(querySize); - sourceBuilder.timeout(DEFAULT_QUERY_TIMEOUT); + this.sourceBuilder = new SearchSourceBuilder() + .from(0) + .size(querySize) + .timeout(DEFAULT_QUERY_TIMEOUT); } /** @@ -112,7 +118,8 @@ public OpenSearchRequest build() { if (from + size > maxResultWindow) { sourceBuilder.size(maxResultWindow - from); - return new OpenSearchScrollRequest(indexName, sourceBuilder, exprValueFactory); + return new OpenSearchScrollRequest( + indexName, scrollTimeout, sourceBuilder, exprValueFactory); } else { return new OpenSearchQueryRequest(indexName, sourceBuilder, exprValueFactory); } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequest.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequest.java index 8dceee99ee8..2e723c949c8 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequest.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequest.java @@ -32,8 +32,8 @@ @ToString public class OpenSearchScrollRequest implements OpenSearchRequest { - /** Default scroll context timeout in minutes. */ - public static final TimeValue DEFAULT_SCROLL_TIMEOUT = TimeValue.timeValueMinutes(100L); + /** Scroll context timeout. 
*/ + private final TimeValue scrollTimeout; /** * {@link OpenSearchRequest.IndexName}. @@ -58,22 +58,13 @@ public class OpenSearchScrollRequest implements OpenSearchRequest { /** Search request source builder. */ private final SearchSourceBuilder sourceBuilder; - /** Constructor. */ - public OpenSearchScrollRequest(IndexName indexName, OpenSearchExprValueFactory exprValueFactory) { - this.indexName = indexName; - this.sourceBuilder = new SearchSourceBuilder(); - this.exprValueFactory = exprValueFactory; - } - - public OpenSearchScrollRequest(String indexName, OpenSearchExprValueFactory exprValueFactory) { - this(new IndexName(indexName), exprValueFactory); - } - /** Constructor. */ public OpenSearchScrollRequest(IndexName indexName, + TimeValue scrollTimeout, SearchSourceBuilder sourceBuilder, OpenSearchExprValueFactory exprValueFactory) { this.indexName = indexName; + this.scrollTimeout = scrollTimeout; this.sourceBuilder = sourceBuilder; this.exprValueFactory = exprValueFactory; } @@ -117,7 +108,7 @@ public void clean(Consumer cleanAction) { public SearchRequest searchRequest() { return new SearchRequest() .indices(indexName.getIndexNames()) - .scroll(DEFAULT_SCROLL_TIMEOUT) + .scroll(scrollTimeout) .source(sourceBuilder); } @@ -137,7 +128,7 @@ public boolean isScroll() { */ public SearchScrollRequest scrollRequest() { Objects.requireNonNull(scrollId, "Scroll id cannot be null"); - return new SearchScrollRequest().scroll(DEFAULT_SCROLL_TIMEOUT).scrollId(scrollId); + return new SearchScrollRequest().scroll(scrollTimeout).scrollId(scrollId); } /** diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchIndex.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchIndex.java index 288bb6006a6..110d3d640f0 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchIndex.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchIndex.java @@ -165,7 +165,7 @@ 
public TableScanBuilder createScanBuilder() { @Override public TableScanBuilder createPagedScanBuilder(int pageSize) { - var requestBuilder = new InitialPageRequestBuilder(indexName, pageSize, + var requestBuilder = new InitialPageRequestBuilder(indexName, pageSize, settings, new OpenSearchExprValueFactory(getFieldOpenSearchTypes())); var indexScan = new OpenSearchPagedIndexScan(client, requestBuilder); return new OpenSearchPagedIndexScanBuilder(indexScan); diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngine.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngine.java index a5f5f372ada..14535edb79a 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngine.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngine.java @@ -45,7 +45,7 @@ public TableScanOperator getTableScan(String indexName, String scrollId) { var index = new OpenSearchIndex(client, settings, indexName); var requestBuilder = new ContinuePageRequestBuilder( new OpenSearchRequest.IndexName(indexName), - scrollId, + scrollId, settings, new OpenSearchExprValueFactory(index.getFieldOpenSearchTypes())); return new OpenSearchPagedIndexScan(client, requestBuilder); } diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchNodeClientTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchNodeClientTest.java index 77872296031..dc9d7a5b5ed 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchNodeClientTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchNodeClientTest.java @@ -61,6 +61,7 @@ import org.opensearch.cluster.metadata.MappingMetadata; import org.opensearch.common.collect.ImmutableOpenMap; import org.opensearch.common.settings.Settings; +import org.opensearch.common.unit.TimeValue; import 
org.opensearch.common.util.concurrent.ThreadContext; import org.opensearch.common.xcontent.XContentType; import org.opensearch.core.xcontent.DeprecationHandler; @@ -69,6 +70,7 @@ import org.opensearch.index.IndexNotFoundException; import org.opensearch.search.SearchHit; import org.opensearch.search.SearchHits; +import org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.sql.data.model.ExprIntegerValue; import org.opensearch.sql.data.model.ExprTupleValue; import org.opensearch.sql.data.model.ExprValue; @@ -76,6 +78,7 @@ import org.opensearch.sql.opensearch.data.type.OpenSearchTextType; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; import org.opensearch.sql.opensearch.mapping.IndexMapping; +import org.opensearch.sql.opensearch.request.OpenSearchRequest; import org.opensearch.sql.opensearch.request.OpenSearchScrollRequest; import org.opensearch.sql.opensearch.response.OpenSearchResponse; @@ -322,7 +325,9 @@ void search() { when(scrollResponse.getHits()).thenReturn(SearchHits.empty()); // Verify response for first scroll request - OpenSearchScrollRequest request = new OpenSearchScrollRequest("test", factory); + OpenSearchScrollRequest request = new OpenSearchScrollRequest( + new OpenSearchRequest.IndexName("test"), TimeValue.timeValueMinutes(1), + new SearchSourceBuilder(), factory); OpenSearchResponse response1 = client.search(request); assertFalse(response1.isEmpty()); @@ -355,7 +360,9 @@ void cleanup() { when(requestBuilder.addScrollId(any())).thenReturn(requestBuilder); when(requestBuilder.get()).thenReturn(null); - OpenSearchScrollRequest request = new OpenSearchScrollRequest("test", factory); + OpenSearchScrollRequest request = new OpenSearchScrollRequest( + new OpenSearchRequest.IndexName("test"), TimeValue.timeValueMinutes(1), + new SearchSourceBuilder(), factory); request.setScrollId("scroll123"); // Enforce cleaning by setting a private field. 
FieldUtils.writeField(request, "needClean", true, true); @@ -370,7 +377,9 @@ void cleanup() { @Test void cleanup_without_scrollId() { - OpenSearchScrollRequest request = new OpenSearchScrollRequest("test", factory); + OpenSearchScrollRequest request = new OpenSearchScrollRequest( + new OpenSearchRequest.IndexName("test"), TimeValue.timeValueMinutes(1), + new SearchSourceBuilder(), factory); client.cleanup(request); verify(nodeClient, never()).prepareClearScroll(); } @@ -380,7 +389,9 @@ void cleanup_without_scrollId() { void cleanup_rethrows_exception() { when(nodeClient.prepareClearScroll()).thenThrow(new RuntimeException()); - OpenSearchScrollRequest request = new OpenSearchScrollRequest("test", factory); + OpenSearchScrollRequest request = new OpenSearchScrollRequest( + new OpenSearchRequest.IndexName("test"), TimeValue.timeValueMinutes(1), + new SearchSourceBuilder(), factory); request.setScrollId("scroll123"); // Enforce cleaning by setting a private field. FieldUtils.writeField(request, "needClean", true, true); diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchRestClientTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchRestClientTest.java index b8920e52a66..6abd17a6fbc 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchRestClientTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchRestClientTest.java @@ -55,12 +55,14 @@ import org.opensearch.cluster.metadata.MappingMetadata; import org.opensearch.common.collect.ImmutableOpenMap; import org.opensearch.common.settings.Settings; +import org.opensearch.common.unit.TimeValue; import org.opensearch.common.xcontent.XContentType; import org.opensearch.core.xcontent.DeprecationHandler; import org.opensearch.core.xcontent.NamedXContentRegistry; import org.opensearch.core.xcontent.XContentParser; import org.opensearch.search.SearchHit; import org.opensearch.search.SearchHits; +import 
org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.sql.data.model.ExprIntegerValue; import org.opensearch.sql.data.model.ExprTupleValue; import org.opensearch.sql.data.model.ExprValue; @@ -68,6 +70,7 @@ import org.opensearch.sql.opensearch.data.type.OpenSearchTextType; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; import org.opensearch.sql.opensearch.mapping.IndexMapping; +import org.opensearch.sql.opensearch.request.OpenSearchRequest; import org.opensearch.sql.opensearch.request.OpenSearchScrollRequest; import org.opensearch.sql.opensearch.response.OpenSearchResponse; @@ -303,7 +306,9 @@ void search() throws IOException { when(scrollResponse.getHits()).thenReturn(SearchHits.empty()); // Verify response for first scroll request - OpenSearchScrollRequest request = new OpenSearchScrollRequest("test", factory); + OpenSearchScrollRequest request = new OpenSearchScrollRequest( + new OpenSearchRequest.IndexName("test"), TimeValue.timeValueMinutes(1), + new SearchSourceBuilder(), factory); OpenSearchResponse response1 = client.search(request); assertFalse(response1.isEmpty()); @@ -323,7 +328,9 @@ void search_with_IOException() throws IOException { when(restClient.search(any(), any())).thenThrow(new IOException()); assertThrows( IllegalStateException.class, - () -> client.search(new OpenSearchScrollRequest("test", factory))); + () -> client.search(new OpenSearchScrollRequest( + new OpenSearchRequest.IndexName("test"), TimeValue.timeValueMinutes(1), + new SearchSourceBuilder(), factory))); } @Test @@ -343,7 +350,9 @@ void scroll_with_IOException() throws IOException { when(restClient.scroll(any(), any())).thenThrow(new IOException()); // First request run successfully - OpenSearchScrollRequest scrollRequest = new OpenSearchScrollRequest("test", factory); + OpenSearchScrollRequest scrollRequest = new OpenSearchScrollRequest( + new OpenSearchRequest.IndexName("test"), TimeValue.timeValueMinutes(1), + new 
SearchSourceBuilder(), factory); client.search(scrollRequest); assertThrows( IllegalStateException.class, () -> client.search(scrollRequest)); @@ -362,7 +371,9 @@ void schedule() { @Test @SneakyThrows void cleanup() { - OpenSearchScrollRequest request = new OpenSearchScrollRequest("test", factory); + OpenSearchScrollRequest request = new OpenSearchScrollRequest( + new OpenSearchRequest.IndexName("test"), TimeValue.timeValueMinutes(1), + new SearchSourceBuilder(), factory); // Enforce cleaning by setting a private field. FieldUtils.writeField(request, "needClean", true, true); request.setScrollId("scroll123"); @@ -373,7 +384,9 @@ void cleanup() { @Test void cleanup_without_scrollId() throws IOException { - OpenSearchScrollRequest request = new OpenSearchScrollRequest("test", factory); + OpenSearchScrollRequest request = new OpenSearchScrollRequest( + new OpenSearchRequest.IndexName("test"), TimeValue.timeValueMinutes(1), + new SearchSourceBuilder(), factory); client.cleanup(request); verify(restClient, never()).clearScroll(any(), any()); } @@ -383,7 +396,9 @@ void cleanup_without_scrollId() throws IOException { void cleanup_with_IOException() { when(restClient.clearScroll(any(), any())).thenThrow(new IOException()); - OpenSearchScrollRequest request = new OpenSearchScrollRequest("test", factory); + OpenSearchScrollRequest request = new OpenSearchScrollRequest( + new OpenSearchRequest.IndexName("test"), TimeValue.timeValueMinutes(1), + new SearchSourceBuilder(), factory); // Enforce cleaning by setting a private field. 
FieldUtils.writeField(request, "needClean", true, true); request.setScrollId("scroll123"); diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java index b6b0269625a..d762fbe2faa 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java @@ -18,6 +18,7 @@ import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; import static org.opensearch.sql.common.setting.Settings.Key.QUERY_SIZE_LIMIT; +import static org.opensearch.sql.common.setting.Settings.Key.SQL_CURSOR_KEEP_ALIVE; import static org.opensearch.sql.data.model.ExprValueUtils.tupleValue; import static org.opensearch.sql.executor.ExecutionEngine.QueryResponse; @@ -35,6 +36,7 @@ import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.common.unit.TimeValue; import org.opensearch.sql.common.response.ResponseListener; import org.opensearch.sql.common.setting.Settings; import org.opensearch.sql.data.model.ExprValue; @@ -171,6 +173,9 @@ void explain_successfully() { new PaginatedPlanCache(null)); Settings settings = mock(Settings.class); when(settings.getSettingValue(QUERY_SIZE_LIMIT)).thenReturn(100); + when(settings.getSettingValue(SQL_CURSOR_KEEP_ALIVE)) + .thenReturn(TimeValue.timeValueMinutes(1)); + PhysicalPlan plan = new OpenSearchIndexScan(mock(OpenSearchClient.class), new OpenSearchRequestBuilder("test", 10000, settings, mock(OpenSearchExprValueFactory.class))); diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtectorTest.java 
b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtectorTest.java index d0e486fae9c..cf684b9409c 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtectorTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtectorTest.java @@ -11,6 +11,8 @@ import static org.mockito.Mockito.mock; import static org.mockito.Mockito.when; import static org.opensearch.sql.ast.tree.Sort.SortOption.DEFAULT_ASC; +import static org.opensearch.sql.common.setting.Settings.Key.QUERY_SIZE_LIMIT; +import static org.opensearch.sql.common.setting.Settings.Key.SQL_CURSOR_KEEP_ALIVE; import static org.opensearch.sql.data.type.ExprCoreType.DOUBLE; import static org.opensearch.sql.data.type.ExprCoreType.INTEGER; import static org.opensearch.sql.data.type.ExprCoreType.STRING; @@ -36,6 +38,7 @@ import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; import org.opensearch.client.node.NodeClient; +import org.opensearch.common.unit.TimeValue; import org.opensearch.sql.ast.expression.DataType; import org.opensearch.sql.ast.expression.Literal; import org.opensearch.sql.ast.tree.RareTopN.CommandType; @@ -88,8 +91,9 @@ public void setup() { @Test public void testProtectIndexScan() { - when(settings.getSettingValue(Settings.Key.QUERY_SIZE_LIMIT)).thenReturn(200); - + when(settings.getSettingValue(QUERY_SIZE_LIMIT)).thenReturn(200); + when(settings.getSettingValue(SQL_CURSOR_KEEP_ALIVE)) + .thenReturn(TimeValue.timeValueMinutes(1)); String indexName = "test"; Integer maxResultWindow = 10000; NamedExpression include = named("age", ref("age", INTEGER)); diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilderTest.java index d549ed9200f..e449126d1ca 100644 --- 
a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilderTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilderTest.java @@ -6,6 +6,7 @@ package org.opensearch.sql.opensearch.request; import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.mockito.Mockito.when; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.DisplayNameGeneration; @@ -14,6 +15,8 @@ import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.common.unit.TimeValue; +import org.opensearch.sql.common.setting.Settings; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) @@ -23,6 +26,9 @@ public class ContinuePageRequestBuilderTest { @Mock private OpenSearchExprValueFactory exprValueFactory; + @Mock + private Settings settings; + private final OpenSearchRequest.IndexName indexName = new OpenSearchRequest.IndexName("test"); private final String scrollId = "scroll"; @@ -30,13 +36,16 @@ public class ContinuePageRequestBuilderTest { @BeforeEach void setup() { - requestBuilder = new ContinuePageRequestBuilder(indexName, scrollId, exprValueFactory); + when(settings.getSettingValue(Settings.Key.SQL_CURSOR_KEEP_ALIVE)) + .thenReturn(TimeValue.timeValueMinutes(1)); + requestBuilder = new ContinuePageRequestBuilder( + indexName, scrollId, settings, exprValueFactory); } @Test public void build() { assertEquals( - new ContinuePageRequest(scrollId, exprValueFactory), + new ContinuePageRequest(scrollId, TimeValue.timeValueMinutes(1), exprValueFactory), requestBuilder.build() ); } diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestTest.java index 32a15f5e8c6..e991fc5787d 
100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestTest.java @@ -33,6 +33,7 @@ import org.opensearch.action.search.SearchRequest; import org.opensearch.action.search.SearchResponse; import org.opensearch.action.search.SearchScrollRequest; +import org.opensearch.common.unit.TimeValue; import org.opensearch.search.SearchHit; import org.opensearch.search.SearchHits; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; @@ -66,7 +67,8 @@ public class ContinuePageRequestTest { private final String scroll = "scroll"; private final String nextScroll = "nextScroll"; - private final ContinuePageRequest request = new ContinuePageRequest(scroll, factory); + private final ContinuePageRequest request = new ContinuePageRequest( + scroll, TimeValue.timeValueMinutes(1), factory); @Test public void search_with_non_empty_response() { @@ -118,7 +120,7 @@ public void getters() { factory = mock(); assertAll( () -> assertThrows(Throwable.class, request::getSourceBuilder), - () -> assertSame(factory, new ContinuePageRequest("", factory).getExprValueFactory()) + () -> assertSame(factory, new ContinuePageRequest("", null, factory).getExprValueFactory()) ); } } diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilderTest.java index beebb6a0ac5..9d4c0b8dbe4 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilderTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilderTest.java @@ -10,6 +10,7 @@ import static org.junit.jupiter.api.Assertions.assertThrows; import static org.mockito.Mockito.mock; import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.when; import static 
org.opensearch.sql.data.type.ExprCoreType.INTEGER; import static org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder.DEFAULT_QUERY_TIMEOUT; @@ -22,7 +23,9 @@ import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.common.unit.TimeValue; import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.sql.common.setting.Settings; import org.opensearch.sql.data.type.ExprType; import org.opensearch.sql.expression.DSL; import org.opensearch.sql.expression.ReferenceExpression; @@ -36,6 +39,9 @@ public class InitialPageRequestBuilderTest { @Mock private OpenSearchExprValueFactory exprValueFactory; + @Mock + private Settings settings; + private final int pageSize = 42; private final OpenSearchRequest.IndexName indexName = new OpenSearchRequest.IndexName("test"); @@ -44,14 +50,16 @@ public class InitialPageRequestBuilderTest { @BeforeEach void setup() { + when(settings.getSettingValue(Settings.Key.SQL_CURSOR_KEEP_ALIVE)) + .thenReturn(TimeValue.timeValueMinutes(1)); requestBuilder = new InitialPageRequestBuilder( - indexName, pageSize, exprValueFactory); + indexName, pageSize, settings, exprValueFactory); } @Test public void build() { assertEquals( - new OpenSearchScrollRequest(indexName, + new OpenSearchScrollRequest(indexName, TimeValue.timeValueMinutes(1), new SearchSourceBuilder() .from(0) .size(pageSize) @@ -91,7 +99,7 @@ public void pushDownProject() { requestBuilder.pushDownProjects(references); assertEquals( - new OpenSearchScrollRequest(indexName, + new OpenSearchScrollRequest(indexName, TimeValue.timeValueMinutes(1), new SearchSourceBuilder() .from(0) .size(pageSize) diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilderTest.java index 636142207eb..49283e61b95 100644 --- 
a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilderTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilderTest.java @@ -65,6 +65,8 @@ public class OpenSearchRequestBuilderTest { @BeforeEach void setup() { when(settings.getSettingValue(Settings.Key.QUERY_SIZE_LIMIT)).thenReturn(200); + when(settings.getSettingValue(Settings.Key.SQL_CURSOR_KEEP_ALIVE)) + .thenReturn(TimeValue.timeValueMinutes(1)); requestBuilder = new OpenSearchRequestBuilder( "test", MAX_RESULT_WINDOW, settings, exprValueFactory); @@ -95,7 +97,7 @@ void build_scroll_request_with_correct_size() { assertEquals( new OpenSearchScrollRequest( - new OpenSearchRequest.IndexName("test"), + new OpenSearchRequest.IndexName("test"), TimeValue.timeValueMinutes(1), new SearchSourceBuilder() .from(offset) .size(MAX_RESULT_WINDOW - offset) diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequestTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequestTest.java index 6e45476306e..3ad6ad226db 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequestTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequestTest.java @@ -25,6 +25,7 @@ import org.opensearch.action.search.SearchRequest; import org.opensearch.action.search.SearchResponse; import org.opensearch.action.search.SearchScrollRequest; +import org.opensearch.common.unit.TimeValue; import org.opensearch.index.query.QueryBuilders; import org.opensearch.search.SearchHit; import org.opensearch.search.SearchHits; @@ -38,8 +39,9 @@ class OpenSearchScrollRequestTest { @Mock private OpenSearchExprValueFactory factory; - private final OpenSearchScrollRequest request = - new OpenSearchScrollRequest("test", factory); + private final OpenSearchScrollRequest request = new OpenSearchScrollRequest( + new 
OpenSearchRequest.IndexName("test"), TimeValue.timeValueMinutes(1), + new SearchSourceBuilder(), factory); @Test void searchRequest() { @@ -48,7 +50,7 @@ void searchRequest() { assertEquals( new SearchRequest() .indices("test") - .scroll(OpenSearchScrollRequest.DEFAULT_SCROLL_TIMEOUT) + .scroll(TimeValue.timeValueMinutes(1)) .source(new SearchSourceBuilder().query(QueryBuilders.termQuery("name", "John"))), request.searchRequest()); } @@ -69,7 +71,7 @@ void scrollRequest() { request.setScrollId("scroll123"); assertEquals( new SearchScrollRequest() - .scroll(OpenSearchScrollRequest.DEFAULT_SCROLL_TIMEOUT) + .scroll(TimeValue.timeValueMinutes(1)) .scrollId("scroll123"), request.scrollRequest()); } diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexTest.java index 6705c1ef022..7181bd5e565 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexTest.java @@ -14,6 +14,7 @@ import static org.junit.jupiter.api.Assertions.assertTrue; import static org.mockito.Mockito.doNothing; import static org.mockito.Mockito.lenient; +import static org.mockito.Mockito.mock; import static org.mockito.Mockito.when; import static org.opensearch.sql.data.type.ExprCoreType.DOUBLE; import static org.opensearch.sql.data.type.ExprCoreType.INTEGER; @@ -41,6 +42,7 @@ import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.common.unit.TimeValue; import org.opensearch.sql.ast.tree.Sort; import org.opensearch.sql.common.setting.Settings; import org.opensearch.sql.data.model.ExprBooleanValue; @@ -192,6 +194,8 @@ void checkCacheUsedForFieldMappings() { @Test void implementRelationOperatorOnly() { 
when(settings.getSettingValue(Settings.Key.QUERY_SIZE_LIMIT)).thenReturn(200); + when(settings.getSettingValue(Settings.Key.SQL_CURSOR_KEEP_ALIVE)) + .thenReturn(TimeValue.timeValueMinutes(1)); when(client.getIndexMaxResultWindows("test")).thenReturn(Map.of("test", 10000)); LogicalPlan plan = index.createScanBuilder(); @@ -205,17 +209,22 @@ void implementRelationOperatorOnly() { @Test void implementPagedRelationOperatorOnly() { when(client.getIndexMaxResultWindows("test")).thenReturn(Map.of("test", 10000)); + when(settings.getSettingValue(Settings.Key.SQL_CURSOR_KEEP_ALIVE)) + .thenReturn(TimeValue.timeValueMinutes(1)); LogicalPlan plan = index.createPagedScanBuilder(42); Integer maxResultWindow = index.getMaxResultWindow(); PagedRequestBuilder builder = new InitialPageRequestBuilder( - new OpenSearchRequest.IndexName(indexName), maxResultWindow, exprValueFactory); + new OpenSearchRequest.IndexName(indexName), + maxResultWindow, mock(), exprValueFactory); assertEquals(new OpenSearchPagedIndexScan(client, builder), index.implement(plan)); } @Test void implementRelationOperatorWithOptimization() { when(settings.getSettingValue(Settings.Key.QUERY_SIZE_LIMIT)).thenReturn(200); + when(settings.getSettingValue(Settings.Key.SQL_CURSOR_KEEP_ALIVE)) + .thenReturn(TimeValue.timeValueMinutes(1)); when(client.getIndexMaxResultWindows("test")).thenReturn(Map.of("test", 10000)); LogicalPlan plan = index.createScanBuilder(); @@ -231,6 +240,8 @@ void implementRelationOperatorWithOptimization() { @Test void implementOtherLogicalOperators() { when(settings.getSettingValue(Settings.Key.QUERY_SIZE_LIMIT)).thenReturn(200); + when(settings.getSettingValue(Settings.Key.SQL_CURSOR_KEEP_ALIVE)) + .thenReturn(TimeValue.timeValueMinutes(1)); when(client.getIndexMaxResultWindows("test")).thenReturn(Map.of("test", 10000)); NamedExpression include = named("age", ref("age", INTEGER)); diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanTest.java 
b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanTest.java index 8cc0d468843..c133897ca25 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanTest.java @@ -32,6 +32,7 @@ import org.mockito.junit.jupiter.MockitoExtension; import org.mockito.stubbing.Answer; import org.opensearch.common.bytes.BytesArray; +import org.opensearch.common.unit.TimeValue; import org.opensearch.index.query.QueryBuilder; import org.opensearch.index.query.QueryBuilders; import org.opensearch.search.SearchHit; @@ -67,6 +68,8 @@ class OpenSearchIndexScanTest { @BeforeEach void setup() { when(settings.getSettingValue(Settings.Key.QUERY_SIZE_LIMIT)).thenReturn(200); + when(settings.getSettingValue(Settings.Key.SQL_CURSOR_KEEP_ALIVE)) + .thenReturn(TimeValue.timeValueMinutes(1)); } @Test diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScanTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScanTest.java index 65c0ddffc2a..38888115c91 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScanTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScanTest.java @@ -53,7 +53,7 @@ public class OpenSearchPagedIndexScanTest { void query_empty_result() { mockResponse(client); InitialPageRequestBuilder builder = new InitialPageRequestBuilder( - new OpenSearchRequest.IndexName("test"), 3, exprValueFactory); + new OpenSearchRequest.IndexName("test"), 3, mock(), exprValueFactory); try (OpenSearchPagedIndexScan indexScan = new OpenSearchPagedIndexScan(client, builder)) { indexScan.open(); assertFalse(indexScan.hasNext()); @@ -69,7 +69,7 @@ void query_all_results_initial_scroll_request() { employee(3, "Allen", "IT")}); 
     PagedRequestBuilder builder = new InitialPageRequestBuilder(
-        new OpenSearchRequest.IndexName("test"), 3, exprValueFactory);
+        new OpenSearchRequest.IndexName("test"), 3, mock(), exprValueFactory);
     try (OpenSearchPagedIndexScan indexScan = new OpenSearchPagedIndexScan(client, builder)) {
       indexScan.open();
@@ -90,7 +90,7 @@ void query_all_results_initial_scroll_request() {
     verify(client).cleanup(any());
 
     builder = new ContinuePageRequestBuilder(
-        new OpenSearchRequest.IndexName("test"), "scroll", exprValueFactory);
+        new OpenSearchRequest.IndexName("test"), "scroll", mock(), exprValueFactory);
     try (OpenSearchPagedIndexScan indexScan = new OpenSearchPagedIndexScan(client, builder)) {
       indexScan.open();
@@ -107,7 +107,7 @@ void query_all_results_continuation_scroll_request() {
         employee(3, "Allen", "IT")});
     ContinuePageRequestBuilder builder = new ContinuePageRequestBuilder(
-        new OpenSearchRequest.IndexName("test"), "scroll", exprValueFactory);
+        new OpenSearchRequest.IndexName("test"), "scroll", mock(), exprValueFactory);
     try (OpenSearchPagedIndexScan indexScan = new OpenSearchPagedIndexScan(client, builder)) {
       indexScan.open();
@@ -128,7 +128,7 @@ void query_all_results_continuation_scroll_request() {
     verify(client).cleanup(any());
 
     builder = new ContinuePageRequestBuilder(
-        new OpenSearchRequest.IndexName("test"), "scroll", exprValueFactory);
+        new OpenSearchRequest.IndexName("test"), "scroll", mock(), exprValueFactory);
     try (OpenSearchPagedIndexScan indexScan = new OpenSearchPagedIndexScan(client, builder)) {
       indexScan.open();

From 37e7ebfa5476483da951b3506e1151ba9ff1090e Mon Sep 17 00:00:00 2001
From: Yury-Fridlyand
Date: Mon, 3 Apr 2023 16:04:05 -0700
Subject: [PATCH 03/17] Fix IT to set cursor keep alive parameter.

Signed-off-by: Yury-Fridlyand
---
 .../org/opensearch/sql/sql/StandalonePaginationIT.java | 7 +++++--
 1 file changed, 5 insertions(+), 2 deletions(-)

diff --git a/integ-test/src/test/java/org/opensearch/sql/sql/StandalonePaginationIT.java b/integ-test/src/test/java/org/opensearch/sql/sql/StandalonePaginationIT.java
index 8f666878216..16eb9d1ff86 100644
--- a/integ-test/src/test/java/org/opensearch/sql/sql/StandalonePaginationIT.java
+++ b/integ-test/src/test/java/org/opensearch/sql/sql/StandalonePaginationIT.java
@@ -25,6 +25,7 @@
 import org.opensearch.client.RestHighLevelClient;
 import org.opensearch.common.inject.Injector;
 import org.opensearch.common.inject.ModulesBuilder;
+import org.opensearch.common.unit.TimeValue;
 import org.opensearch.sql.common.response.ResponseListener;
 import org.opensearch.sql.common.setting.Settings;
 import org.opensearch.sql.data.type.ExprCoreType;
@@ -94,7 +95,8 @@ public void onResponse(ExecutionEngine.QueryResponse response) {
 
       @Override
       public void onFailure(Exception e) {
-        fail();
+        e.printStackTrace();
+        fail(e.getMessage());
       }
     };
@@ -150,8 +152,9 @@ public void test_explain_not_supported() {
   private Settings defaultSettings() {
     return new Settings() {
-      private final Map defaultSettings = new ImmutableMap.Builder()
+      private final Map defaultSettings = new ImmutableMap.Builder()
         .put(Key.QUERY_SIZE_LIMIT, 200)
+        .put(Key.SQL_CURSOR_KEEP_ALIVE, TimeValue.timeValueMinutes(1))
         .build();
 
       @Override

From 529df99aa77259e99f7ef76b45f99ee9dacf66ed Mon Sep 17 00:00:00 2001
From: Yury-Fridlyand
Date: Wed, 12 Apr 2023 17:38:49 -0700
Subject: [PATCH 04/17] Remove `QueryId.None`.
Signed-off-by: Yury-Fridlyand
---
 core/src/main/java/org/opensearch/sql/executor/QueryId.java  | 1 -
 .../sql/executor/execution/ContinuePaginatedPlanTest.java    | 4 ++--
 .../opensearch/sql/executor/execution/PaginatedPlanTest.java | 4 ++--
 3 files changed, 4 insertions(+), 5 deletions(-)

diff --git a/core/src/main/java/org/opensearch/sql/executor/QueryId.java b/core/src/main/java/org/opensearch/sql/executor/QueryId.java
index 43d6aed85eb..933cb5d82dc 100644
--- a/core/src/main/java/org/opensearch/sql/executor/QueryId.java
+++ b/core/src/main/java/org/opensearch/sql/executor/QueryId.java
@@ -16,7 +16,6 @@
  * Query id of {@link AbstractPlan}.
  */
 public class QueryId {
-  public static final QueryId None = new QueryId("");
   /**
    * Query id.
    */
diff --git a/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java b/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java
index 7ad2390e45d..a5abe38f24c 100644
--- a/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java
+++ b/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java
@@ -62,7 +62,7 @@ public void onFailure(Exception e) {
         fail();
       }
     };
-    var plan = new ContinuePaginatedPlan(QueryId.None, buildCursor(Map.of()),
+    var plan = new ContinuePaginatedPlan(QueryId.queryId(), buildCursor(Map.of()),
         queryService, paginatedPlanCache, listener);
     plan.execute();
   }
@@ -81,7 +81,7 @@ public void onFailure(Exception e) {
         assertNotNull(e);
       }
     };
-    var plan = new ContinuePaginatedPlan(QueryId.None, buildCursor(Map.of("pageSize", "abc")),
+    var plan = new ContinuePaginatedPlan(QueryId.queryId(), buildCursor(Map.of("pageSize", "abc")),
         queryService, paginatedPlanCache, listener);
     plan.execute();
   }
diff --git a/core/src/test/java/org/opensearch/sql/executor/execution/PaginatedPlanTest.java b/core/src/test/java/org/opensearch/sql/executor/execution/PaginatedPlanTest.java
index 16933b9b791..ac1a3fb7e83 100644
--- a/core/src/test/java/org/opensearch/sql/executor/execution/PaginatedPlanTest.java
+++ b/core/src/test/java/org/opensearch/sql/executor/execution/PaginatedPlanTest.java
@@ -59,7 +59,7 @@ public void onFailure(Exception e) {
         fail();
       }
     };
-    var plan = new PaginatedPlan(QueryId.None, mock(UnresolvedPlan.class), 10,
+    var plan = new PaginatedPlan(QueryId.queryId(), mock(UnresolvedPlan.class), 10,
         queryService, listener);
     plan.execute();
   }
@@ -78,7 +78,7 @@ public void onFailure(Exception e) {
         assertNotNull(e);
       }
     };
-    var plan = new PaginatedPlan(QueryId.None, mock(UnresolvedPlan.class), 10,
+    var plan = new PaginatedPlan(QueryId.queryId(), mock(UnresolvedPlan.class), 10,
         new QueryService(null, new DefaultExecutionEngine(), null, null), listener);
     plan.execute();
   }

From 64386f30d5ea3c21ed3882773bfb75cf7516b9f5 Mon Sep 17 00:00:00 2001
From: Yury-Fridlyand
Date: Wed, 12 Apr 2023 17:42:35 -0700
Subject: [PATCH 05/17] Rename according to PR feedback.

Signed-off-by: Yury-Fridlyand
---
 .../executor/execution/ContinuePaginatedPlan.java  | 10 +++++-----
 .../sql/executor/execution/QueryPlanFactory.java   |  8 ++++----
 ...PaginatedPlanCache.java => PlanSerializer.java} |  4 ++--
 .../execution/ContinuePaginatedPlanTest.java       | 12 ++++++------
 .../executor/execution/QueryPlanFactoryTest.java   | 14 +++++++-------
 ...dPlanCacheTest.java => PlanSerializerTest.java} | 10 +++++-----
 .../java/org/opensearch/sql/ppl/StandaloneIT.java  | 14 +++++++-------
 .../opensearch/sql/sql/StandalonePaginationIT.java |  8 ++++----
 .../org/opensearch/sql/util/StandaloneModule.java  | 14 +++++++-------
 .../java/org/opensearch/sql/util/TestUtils.java    |  2 +-
 .../executor/OpenSearchExecutionEngine.java        |  6 +++---
 .../executor/OpenSearchExecutionEngineTest.java    | 14 +++++++-------
 .../sql/plugin/config/OpenSearchPluginModule.java  | 14 +++++++-------
 .../org/opensearch/sql/ppl/PPLServiceTest.java     |  6 +++---
 .../org/opensearch/sql/sql/SQLServiceTest.java     |  6 +++---
 15 files changed, 71 insertions(+), 71 deletions(-)
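Patch 04 above replaces the shared `QueryId.None` sentinel with fresh ids from `QueryId.queryId()`. The point is that every paginated plan should be distinguishable; with a shared sentinel, all cursor continuations would carry the same id. A minimal sketch of the idea, using a hypothetical UUID-backed stand-in (the plugin's actual `QueryId` implementation may generate ids differently):

```java
import java.util.UUID;

// Hypothetical stand-in for org.opensearch.sql.executor.QueryId, illustrating
// why a shared `None` sentinel is risky: every caller holding the sentinel
// shares one id, so paginated requests become indistinguishable.
final class QueryId {
  private final String id;

  private QueryId(String id) {
    this.id = id;
  }

  // Each call returns a distinct id, mirroring QueryId.queryId() in the patch.
  static QueryId queryId() {
    return new QueryId(UUID.randomUUID().toString());
  }

  String getId() {
    return id;
  }
}
```

With this, two plans created back to back get different ids, which is exactly the property the tests rely on after the sentinel is removed.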
rename core/src/main/java/org/opensearch/sql/executor/pagination/{PaginatedPlanCache.java => PlanSerializer.java} (97%) rename core/src/test/java/org/opensearch/sql/executor/pagination/{PaginatedPlanCacheTest.java => PlanSerializerTest.java} (99%) diff --git a/core/src/main/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlan.java b/core/src/main/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlan.java index 03309359a1a..ffbf2976878 100644 --- a/core/src/main/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlan.java +++ b/core/src/main/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlan.java @@ -9,7 +9,7 @@ import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.executor.QueryId; import org.opensearch.sql.executor.QueryService; -import org.opensearch.sql.executor.pagination.PaginatedPlanCache; +import org.opensearch.sql.executor.pagination.PlanSerializer; import org.opensearch.sql.planner.physical.PhysicalPlan; /** @@ -21,7 +21,7 @@ public class ContinuePaginatedPlan extends AbstractPlan { private final String cursor; private final QueryService queryService; - private final PaginatedPlanCache paginatedPlanCache; + private final PlanSerializer planSerializer; private final ResponseListener queryResponseListener; @@ -30,12 +30,12 @@ public class ContinuePaginatedPlan extends AbstractPlan { * Create an abstract plan that can continue paginating a given cursor. 
*/ public ContinuePaginatedPlan(QueryId queryId, String cursor, QueryService queryService, - PaginatedPlanCache planCache, + PlanSerializer planCache, ResponseListener queryResponseListener) { super(queryId); this.cursor = cursor; - this.paginatedPlanCache = planCache; + this.planSerializer = planCache; this.queryService = queryService; this.queryResponseListener = queryResponseListener; } @@ -43,7 +43,7 @@ public ContinuePaginatedPlan(QueryId queryId, String cursor, QueryService queryS @Override public void execute() { try { - PhysicalPlan plan = paginatedPlanCache.convertToPlan(cursor); + PhysicalPlan plan = planSerializer.convertToPlan(cursor); queryService.executePlan(plan, queryResponseListener); } catch (Exception e) { queryResponseListener.onFailure(e); diff --git a/core/src/main/java/org/opensearch/sql/executor/execution/QueryPlanFactory.java b/core/src/main/java/org/opensearch/sql/executor/execution/QueryPlanFactory.java index cabbfbff8ea..bdd978cccec 100644 --- a/core/src/main/java/org/opensearch/sql/executor/execution/QueryPlanFactory.java +++ b/core/src/main/java/org/opensearch/sql/executor/execution/QueryPlanFactory.java @@ -22,7 +22,7 @@ import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.executor.QueryId; import org.opensearch.sql.executor.QueryService; -import org.opensearch.sql.executor.pagination.PaginatedPlanCache; +import org.opensearch.sql.executor.pagination.PlanSerializer; /** * QueryExecution Factory. @@ -39,7 +39,7 @@ public class QueryPlanFactory * Query Service. */ private final QueryService queryService; - private final PaginatedPlanCache paginatedPlanCache; + private final PlanSerializer planSerializer; /** * NO_CONSUMER_RESPONSE_LISTENER should never be called. 
It is only used as constructor @@ -80,7 +80,7 @@ public AbstractPlan createContinuePaginatedPlan(String cursor, boolean isExplain ResponseListener explainListener) { QueryId queryId = QueryId.queryId(); var plan = new ContinuePaginatedPlan(queryId, cursor, queryService, - paginatedPlanCache, queryResponseListener); + planSerializer, queryResponseListener); return isExplain ? new ExplainPlan(queryId, plan, explainListener) : plan; } @@ -94,7 +94,7 @@ public AbstractPlan visitQuery( context.getLeft().isPresent(), "[BUG] query listener must be not null"); if (node.getFetchSize() > 0) { - if (paginatedPlanCache.canConvertToCursor(node.getPlan())) { + if (planSerializer.canConvertToCursor(node.getPlan())) { return new PaginatedPlan(QueryId.queryId(), node.getPlan(), node.getFetchSize(), queryService, context.getLeft().get()); diff --git a/core/src/main/java/org/opensearch/sql/executor/pagination/PaginatedPlanCache.java b/core/src/main/java/org/opensearch/sql/executor/pagination/PlanSerializer.java similarity index 97% rename from core/src/main/java/org/opensearch/sql/executor/pagination/PaginatedPlanCache.java rename to core/src/main/java/org/opensearch/sql/executor/pagination/PlanSerializer.java index 89c008aa662..d9915e2b8dc 100644 --- a/core/src/main/java/org/opensearch/sql/executor/pagination/PaginatedPlanCache.java +++ b/core/src/main/java/org/opensearch/sql/executor/pagination/PlanSerializer.java @@ -28,7 +28,7 @@ * and deserialization. */ @RequiredArgsConstructor -public class PaginatedPlanCache { +public class PlanSerializer { public static final String CURSOR_PREFIX = "n:"; private final StorageEngine storageEngine; @@ -68,7 +68,7 @@ String compress(String str) throws IOException { } /** - * Decompresses a query plan that was compress with {@link PaginatedPlanCache#compress}. + * Decompresses a query plan that was compress with {@link PlanSerializer#compress}. 
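For context on the `compress`/`decompress` pair touched by the rename above: a cursor must survive a round trip through a JSON response, so the serialized plan is typically deflated and then Base64-encoded. The sketch below is not the patch's actual `PlanSerializer` implementation — the class name `CursorCodec` and the GZIP choice are illustrative assumptions — but it shows the round-trip shape such a codec needs.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

// Hypothetical sketch of cursor compression; GZIP is an assumption,
// not necessarily what PlanSerializer uses internally.
public class CursorCodec {
  // Gzip the serialized plan, then Base64-encode so the cursor is a
  // printable token that is safe to embed in a JSON response.
  static String compress(String str) {
    try (ByteArrayOutputStream out = new ByteArrayOutputStream()) {
      try (GZIPOutputStream gzip = new GZIPOutputStream(out)) {
        gzip.write(str.getBytes(StandardCharsets.UTF_8));
      }
      return Base64.getEncoder().encodeToString(out.toByteArray());
    } catch (IOException e) {
      throw new UncheckedIOException(e);
    }
  }

  // Reverse of compress: Base64-decode, then gunzip back to the plan string.
  static String decompress(String input) {
    byte[] bytes = Base64.getDecoder().decode(input);
    try (GZIPInputStream gzip = new GZIPInputStream(new ByteArrayInputStream(bytes))) {
      return new String(gzip.readAllBytes(), StandardCharsets.UTF_8);
    } catch (IOException e) {
      throw new UncheckedIOException(e);
    }
  }

  public static void main(String[] args) {
    String plan = "(Paginate,1,2,(Project,...))";
    String cursor = "n:" + compress(plan); // "n:" mirrors CURSOR_PREFIX
    System.out.println(decompress(cursor.substring(2)).equals(plan)); // true
  }
}
```

The `"n:"` prefix mirrors `PlanSerializer.CURSOR_PREFIX`, letting the request handler distinguish a v2 cursor from a legacy v1 one before attempting to decode it.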
* @param input compressed query plan * @return decompressed string */ diff --git a/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java b/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java index a5abe38f24c..e0289872b7a 100644 --- a/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java +++ b/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java @@ -14,7 +14,7 @@ import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; import static org.mockito.Mockito.withSettings; -import static org.opensearch.sql.executor.pagination.PaginatedPlanCacheTest.buildCursor; +import static org.opensearch.sql.executor.pagination.PlanSerializerTest.buildCursor; import java.util.Map; import org.junit.jupiter.api.BeforeAll; @@ -26,14 +26,14 @@ import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.executor.QueryId; import org.opensearch.sql.executor.QueryService; -import org.opensearch.sql.executor.pagination.PaginatedPlanCache; +import org.opensearch.sql.executor.pagination.PlanSerializer; import org.opensearch.sql.storage.StorageEngine; import org.opensearch.sql.storage.TableScanOperator; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) public class ContinuePaginatedPlanTest { - private static PaginatedPlanCache paginatedPlanCache; + private static PlanSerializer planSerializer; private static QueryService queryService; @@ -45,7 +45,7 @@ public static void setUp() { var storageEngine = mock(StorageEngine.class); when(storageEngine.getTableScan(anyString(), anyString())) .thenReturn(mock(TableScanOperator.class)); - paginatedPlanCache = new PaginatedPlanCache(storageEngine); + planSerializer = new PlanSerializer(storageEngine); queryService = new QueryService(null, new DefaultExecutionEngine(), null, null); } @@ -63,7 +63,7 @@ public void onFailure(Exception e) { } }; var plan = new 
ContinuePaginatedPlan(QueryId.queryId(), buildCursor(Map.of()), - queryService, paginatedPlanCache, listener); + queryService, planSerializer, listener); plan.execute(); } @@ -82,7 +82,7 @@ public void onFailure(Exception e) { } }; var plan = new ContinuePaginatedPlan(QueryId.queryId(), buildCursor(Map.of("pageSize", "abc")), - queryService, paginatedPlanCache, listener); + queryService, planSerializer, listener); plan.execute(); } diff --git a/core/src/test/java/org/opensearch/sql/executor/execution/QueryPlanFactoryTest.java b/core/src/test/java/org/opensearch/sql/executor/execution/QueryPlanFactoryTest.java index c06b1186cd1..5a4c7e9814b 100644 --- a/core/src/test/java/org/opensearch/sql/executor/execution/QueryPlanFactoryTest.java +++ b/core/src/test/java/org/opensearch/sql/executor/execution/QueryPlanFactoryTest.java @@ -29,7 +29,7 @@ import org.opensearch.sql.exception.UnsupportedCursorRequestException; import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.executor.QueryService; -import org.opensearch.sql.executor.pagination.PaginatedPlanCache; +import org.opensearch.sql.executor.pagination.PlanSerializer; @ExtendWith(MockitoExtension.class) class QueryPlanFactoryTest { @@ -50,12 +50,12 @@ class QueryPlanFactoryTest { private ExecutionEngine.QueryResponse queryResponse; @Mock - private PaginatedPlanCache paginatedPlanCache; + private PlanSerializer planSerializer; private QueryPlanFactory factory; @BeforeEach void init() { - factory = new QueryPlanFactory(queryService, paginatedPlanCache); + factory = new QueryPlanFactory(queryService, planSerializer); } @Test @@ -125,8 +125,8 @@ public void noConsumerResponseChannel() { @Test public void createQueryWithFetchSizeWhichCanBePaged() { - when(paginatedPlanCache.canConvertToCursor(plan)).thenReturn(true); - factory = new QueryPlanFactory(queryService, paginatedPlanCache); + when(planSerializer.canConvertToCursor(plan)).thenReturn(true); + factory = new QueryPlanFactory(queryService, 
planSerializer); Statement query = new Query(plan, 10); AbstractPlan queryExecution = factory.createContinuePaginatedPlan(query, Optional.of(queryListener), Optional.empty()); @@ -135,8 +135,8 @@ public void createQueryWithFetchSizeWhichCanBePaged() { @Test public void createQueryWithFetchSizeWhichCannotBePaged() { - when(paginatedPlanCache.canConvertToCursor(plan)).thenReturn(false); - factory = new QueryPlanFactory(queryService, paginatedPlanCache); + when(planSerializer.canConvertToCursor(plan)).thenReturn(false); + factory = new QueryPlanFactory(queryService, planSerializer); Statement query = new Query(plan, 10); assertThrows(UnsupportedCursorRequestException.class, () -> factory.createContinuePaginatedPlan(query, diff --git a/core/src/test/java/org/opensearch/sql/executor/pagination/PaginatedPlanCacheTest.java b/core/src/test/java/org/opensearch/sql/executor/pagination/PlanSerializerTest.java similarity index 99% rename from core/src/test/java/org/opensearch/sql/executor/pagination/PaginatedPlanCacheTest.java rename to core/src/test/java/org/opensearch/sql/executor/pagination/PlanSerializerTest.java index c3feb6e606d..7db431ed910 100644 --- a/core/src/test/java/org/opensearch/sql/executor/pagination/PaginatedPlanCacheTest.java +++ b/core/src/test/java/org/opensearch/sql/executor/pagination/PlanSerializerTest.java @@ -35,11 +35,11 @@ import org.opensearch.sql.storage.TableScanOperator; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) -public class PaginatedPlanCacheTest { +public class PlanSerializerTest { StorageEngine storageEngine; - PaginatedPlanCache planCache; + PlanSerializer planCache; // encoded query 'select * from cacls' o_O static final String testCursor = "(Paginate,1,2,(Project," @@ -275,7 +275,7 @@ void setUp() { storageEngine = mock(StorageEngine.class); when(storageEngine.getTableScan(anyString(), anyString())) .thenReturn(new MockedTableScanOperator()); - planCache = new PaginatedPlanCache(storageEngine); + planCache = 
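The two `QueryPlanFactoryTest` cases above exercise the dispatch that `visitQuery` performs: a query carrying a fetch size becomes a `PaginatedPlan` only when the plan can be serialized into a cursor; otherwise `UnsupportedCursorRequestException` is raised so the plugin can fall back to the v1 engine. A reduced, self-contained sketch of that decision (class and method names here are illustrative, not the actual factory API):

```java
// Hypothetical reduction of the visitQuery dispatch logic.
class UnsupportedCursorRequestException extends RuntimeException {}

public class PlanDispatch {
  // fetchSize > 0 requests pagination; canConvertToCursor stands in for
  // PlanSerializer#canConvertToCursor on the parsed plan.
  static String dispatch(int fetchSize, boolean canConvertToCursor) {
    if (fetchSize > 0) {
      if (canConvertToCursor) {
        return "PaginatedPlan";                     // v2 pagination, first page
      }
      throw new UnsupportedCursorRequestException(); // caller falls back to v1
    }
    return "QueryPlan";                              // regular one-shot query
  }

  public static void main(String[] args) {
    System.out.println(dispatch(10, true));  // PaginatedPlan
    System.out.println(dispatch(0, false));  // QueryPlan
  }
}
```

This matches the "Fallback to v1 engine for pagination (#245)" commit in the series: the exception is the signal, not an error surfaced to the user.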
new PlanSerializer(storageEngine); } @Test @@ -449,11 +449,11 @@ public String toCursor() { @SneakyThrows private static String compress(String input) { - return new PaginatedPlanCache(null).compress(input); + return new PlanSerializer(null).compress(input); } @SneakyThrows private static String decompress(String input) { - return new PaginatedPlanCache(null).decompress(input); + return new PlanSerializer(null).decompress(input); } } diff --git a/integ-test/src/test/java/org/opensearch/sql/ppl/StandaloneIT.java b/integ-test/src/test/java/org/opensearch/sql/ppl/StandaloneIT.java index ee568b7dbd4..8664c7c9484 100644 --- a/integ-test/src/test/java/org/opensearch/sql/ppl/StandaloneIT.java +++ b/integ-test/src/test/java/org/opensearch/sql/ppl/StandaloneIT.java @@ -32,7 +32,7 @@ import org.opensearch.sql.executor.QueryManager; import org.opensearch.sql.executor.QueryService; import org.opensearch.sql.executor.execution.QueryPlanFactory; -import org.opensearch.sql.executor.pagination.PaginatedPlanCache; +import org.opensearch.sql.executor.pagination.PlanSerializer; import org.opensearch.sql.expression.function.BuiltinFunctionRepository; import org.opensearch.sql.monitor.AlwaysHealthyMonitor; import org.opensearch.sql.monitor.ResourceMonitor; @@ -198,8 +198,8 @@ public StorageEngine storageEngine(OpenSearchClient client) { @Provides public ExecutionEngine executionEngine(OpenSearchClient client, ExecutionProtector protector, - PaginatedPlanCache paginatedPlanCache) { - return new OpenSearchExecutionEngine(client, protector, paginatedPlanCache); + PlanSerializer planSerializer) { + return new OpenSearchExecutionEngine(client, protector, planSerializer); } @Provides @@ -229,20 +229,20 @@ public SQLService sqlService(QueryManager queryManager, QueryPlanFactory queryPl } @Provides - public PaginatedPlanCache paginatedPlanCache(StorageEngine storageEngine) { - return new PaginatedPlanCache(storageEngine); + public PlanSerializer paginatedPlanCache(StorageEngine storageEngine) 
{ + return new PlanSerializer(storageEngine); } @Provides public QueryPlanFactory queryPlanFactory(ExecutionEngine executionEngine, - PaginatedPlanCache paginatedPlanCache) { + PlanSerializer planSerializer) { Analyzer analyzer = new Analyzer( new ExpressionAnalyzer(functionRepository), dataSourceService, functionRepository); Planner planner = new Planner(LogicalPlanOptimizer.create()); Planner paginationPlanner = new Planner(LogicalPlanOptimizer.paginationCreate()); QueryService queryService = new QueryService(analyzer, executionEngine, planner, paginationPlanner); - return new QueryPlanFactory(queryService, paginatedPlanCache); + return new QueryPlanFactory(queryService, planSerializer); } } diff --git a/integ-test/src/test/java/org/opensearch/sql/sql/StandalonePaginationIT.java b/integ-test/src/test/java/org/opensearch/sql/sql/StandalonePaginationIT.java index 16eb9d1ff86..9ed91c4e38e 100644 --- a/integ-test/src/test/java/org/opensearch/sql/sql/StandalonePaginationIT.java +++ b/integ-test/src/test/java/org/opensearch/sql/sql/StandalonePaginationIT.java @@ -32,7 +32,7 @@ import org.opensearch.sql.datasource.DataSourceService; import org.opensearch.sql.datasource.DataSourceServiceImpl; import org.opensearch.sql.executor.ExecutionEngine; -import org.opensearch.sql.executor.pagination.PaginatedPlanCache; +import org.opensearch.sql.executor.pagination.PlanSerializer; import org.opensearch.sql.executor.QueryService; import org.opensearch.sql.expression.DSL; import org.opensearch.sql.legacy.SQLIntegTestCase; @@ -56,7 +56,7 @@ public class StandalonePaginationIT extends SQLIntegTestCase { private QueryService queryService; - private PaginatedPlanCache paginatedPlanCache; + private PlanSerializer planSerializer; private OpenSearchClient client; @@ -79,7 +79,7 @@ public void init() { Injector injector = modules.createInjector(); queryService = injector.getInstance(QueryService.class); - paginatedPlanCache = injector.getInstance(PaginatedPlanCache.class); + planSerializer 
= injector.getInstance(PlanSerializer.class); } @Test @@ -124,7 +124,7 @@ public void onFailure(Exception e) { // act 2, asserts in secondResponder - PhysicalPlan plan = paginatedPlanCache.convertToPlan(firstResponder.getCursor().toString()); + PhysicalPlan plan = planSerializer.convertToPlan(firstResponder.getCursor().toString()); var secondResponder = new TestResponder(); queryService.executePlan(plan, secondResponder); diff --git a/integ-test/src/test/java/org/opensearch/sql/util/StandaloneModule.java b/integ-test/src/test/java/org/opensearch/sql/util/StandaloneModule.java index c7515b461f8..a5d8a19aeb2 100644 --- a/integ-test/src/test/java/org/opensearch/sql/util/StandaloneModule.java +++ b/integ-test/src/test/java/org/opensearch/sql/util/StandaloneModule.java @@ -15,7 +15,7 @@ import org.opensearch.sql.common.setting.Settings; import org.opensearch.sql.datasource.DataSourceService; import org.opensearch.sql.executor.ExecutionEngine; -import org.opensearch.sql.executor.pagination.PaginatedPlanCache; +import org.opensearch.sql.executor.pagination.PlanSerializer; import org.opensearch.sql.executor.QueryManager; import org.opensearch.sql.executor.QueryService; import org.opensearch.sql.executor.execution.QueryPlanFactory; @@ -68,8 +68,8 @@ public StorageEngine storageEngine(OpenSearchClient client) { @Provides public ExecutionEngine executionEngine(OpenSearchClient client, ExecutionProtector protector, - PaginatedPlanCache paginatedPlanCache) { - return new OpenSearchExecutionEngine(client, protector, paginatedPlanCache); + PlanSerializer planSerializer) { + return new OpenSearchExecutionEngine(client, protector, planSerializer); } @Provides @@ -99,16 +99,16 @@ public SQLService sqlService(QueryManager queryManager, QueryPlanFactory queryPl } @Provides - public PaginatedPlanCache paginatedPlanCache(StorageEngine storageEngine) { - return new PaginatedPlanCache(storageEngine); + public PlanSerializer paginatedPlanCache(StorageEngine storageEngine) { + return new 
PlanSerializer(storageEngine); } @Provides public QueryPlanFactory queryPlanFactory(ExecutionEngine executionEngine, - PaginatedPlanCache paginatedPlanCache, + PlanSerializer planSerializer, QueryService qs) { - return new QueryPlanFactory(qs, paginatedPlanCache); + return new QueryPlanFactory(qs, planSerializer); } @Provides diff --git a/integ-test/src/test/java/org/opensearch/sql/util/TestUtils.java b/integ-test/src/test/java/org/opensearch/sql/util/TestUtils.java index 80ce24ecacf..69f16491903 100644 --- a/integ-test/src/test/java/org/opensearch/sql/util/TestUtils.java +++ b/integ-test/src/test/java/org/opensearch/sql/util/TestUtils.java @@ -8,7 +8,7 @@ import static com.google.common.base.Strings.isNullOrEmpty; import static org.junit.Assert.assertTrue; -import static org.opensearch.sql.executor.pagination.PaginatedPlanCache.CURSOR_PREFIX; +import static org.opensearch.sql.executor.pagination.PlanSerializer.CURSOR_PREFIX; import java.io.BufferedReader; import java.io.File; diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngine.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngine.java index 103e15e6cdf..bfc29b02d21 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngine.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngine.java @@ -15,7 +15,7 @@ import org.opensearch.sql.executor.ExecutionContext; import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.executor.Explain; -import org.opensearch.sql.executor.pagination.PaginatedPlanCache; +import org.opensearch.sql.executor.pagination.PlanSerializer; import org.opensearch.sql.opensearch.client.OpenSearchClient; import org.opensearch.sql.opensearch.executor.protector.ExecutionProtector; import org.opensearch.sql.planner.physical.PhysicalPlan; @@ -28,7 +28,7 @@ public class OpenSearchExecutionEngine implements 
ExecutionEngine { private final OpenSearchClient client; private final ExecutionProtector executionProtector; - private final PaginatedPlanCache paginatedPlanCache; + private final PlanSerializer planSerializer; @Override public void execute(PhysicalPlan physicalPlan, ResponseListener listener) { @@ -52,7 +52,7 @@ public void execute(PhysicalPlan physicalPlan, ExecutionContext context, } QueryResponse response = new QueryResponse(physicalPlan.schema(), result, - plan.getTotalHits(), paginatedPlanCache.convertToCursor(plan)); + plan.getTotalHits(), planSerializer.convertToCursor(plan)); listener.onResponse(response); } catch (Exception e) { listener.onFailure(e); diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java index d762fbe2faa..32f812bfb63 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java @@ -43,7 +43,7 @@ import org.opensearch.sql.executor.ExecutionContext; import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.executor.ExecutionEngine.ExplainResponse; -import org.opensearch.sql.executor.pagination.PaginatedPlanCache; +import org.opensearch.sql.executor.pagination.PlanSerializer; import org.opensearch.sql.opensearch.client.OpenSearchClient; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; import org.opensearch.sql.opensearch.executor.protector.OpenSearchExecutionProtector; @@ -90,7 +90,7 @@ void execute_successfully() { when(protector.protect(plan)).thenReturn(plan); OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector, - new PaginatedPlanCache(null)); + new PlanSerializer(null)); List actual = new ArrayList<>(); executor.execute( plan, @@ -120,7 +120,7 @@ 
void execute_with_cursor() { when(protector.protect(plan)).thenReturn(plan); OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector, - new PaginatedPlanCache(null)); + new PlanSerializer(null)); List actual = new ArrayList<>(); executor.execute( plan, @@ -148,7 +148,7 @@ void execute_with_failure() { when(protector.protect(plan)).thenReturn(plan); OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector, - new PaginatedPlanCache(null)); + new PlanSerializer(null)); AtomicReference actual = new AtomicReference<>(); executor.execute( plan, @@ -170,7 +170,7 @@ public void onFailure(Exception e) { @Test void explain_successfully() { OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector, - new PaginatedPlanCache(null)); + new PlanSerializer(null)); Settings settings = mock(Settings.class); when(settings.getSettingValue(QUERY_SIZE_LIMIT)).thenReturn(100); when(settings.getSettingValue(SQL_CURSOR_KEEP_ALIVE)) @@ -199,7 +199,7 @@ public void onFailure(Exception e) { @Test void explain_with_failure() { OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector, - new PaginatedPlanCache(null)); + new PlanSerializer(null)); PhysicalPlan plan = mock(PhysicalPlan.class); when(plan.accept(any(), any())).thenThrow(IllegalStateException.class); @@ -229,7 +229,7 @@ void call_add_split_and_open_in_order() { when(executionContext.getSplit()).thenReturn(Optional.of(split)); OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector, - new PaginatedPlanCache(null)); + new PlanSerializer(null)); List actual = new ArrayList<>(); executor.execute( plan, diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/config/OpenSearchPluginModule.java b/plugin/src/main/java/org/opensearch/sql/plugin/config/OpenSearchPluginModule.java index b0c698a0cf2..7461b7df3b1 100644 --- 
a/plugin/src/main/java/org/opensearch/sql/plugin/config/OpenSearchPluginModule.java +++ b/plugin/src/main/java/org/opensearch/sql/plugin/config/OpenSearchPluginModule.java @@ -18,7 +18,7 @@ import org.opensearch.sql.executor.QueryManager; import org.opensearch.sql.executor.QueryService; import org.opensearch.sql.executor.execution.QueryPlanFactory; -import org.opensearch.sql.executor.pagination.PaginatedPlanCache; +import org.opensearch.sql.executor.pagination.PlanSerializer; import org.opensearch.sql.expression.function.BuiltinFunctionRepository; import org.opensearch.sql.monitor.ResourceMonitor; import org.opensearch.sql.opensearch.client.OpenSearchClient; @@ -60,8 +60,8 @@ public StorageEngine storageEngine(OpenSearchClient client, Settings settings) { @Provides public ExecutionEngine executionEngine(OpenSearchClient client, ExecutionProtector protector, - PaginatedPlanCache paginatedPlanCache) { - return new OpenSearchExecutionEngine(client, protector, paginatedPlanCache); + PlanSerializer planSerializer) { + return new OpenSearchExecutionEngine(client, protector, planSerializer); } @Provides @@ -75,8 +75,8 @@ public ExecutionProtector protector(ResourceMonitor resourceMonitor) { } @Provides - public PaginatedPlanCache paginatedPlanCache(StorageEngine storageEngine) { - return new PaginatedPlanCache(storageEngine); + public PlanSerializer paginatedPlanCache(StorageEngine storageEngine) { + return new PlanSerializer(storageEngine); } @Provides @@ -101,7 +101,7 @@ public SQLService sqlService(QueryManager queryManager, QueryPlanFactory queryPl @Provides public QueryPlanFactory queryPlanFactory(DataSourceService dataSourceService, ExecutionEngine executionEngine, - PaginatedPlanCache paginatedPlanCache) { + PlanSerializer planSerializer) { Analyzer analyzer = new Analyzer( new ExpressionAnalyzer(functionRepository), dataSourceService, functionRepository); @@ -109,6 +109,6 @@ public QueryPlanFactory queryPlanFactory(DataSourceService dataSourceService, Planner 
paginationPlanner = new Planner(LogicalPlanOptimizer.paginationCreate()); QueryService queryService = new QueryService( analyzer, executionEngine, planner, paginationPlanner); - return new QueryPlanFactory(queryService, paginatedPlanCache); + return new QueryPlanFactory(queryService, planSerializer); } } diff --git a/ppl/src/test/java/org/opensearch/sql/ppl/PPLServiceTest.java b/ppl/src/test/java/org/opensearch/sql/ppl/PPLServiceTest.java index ef0dfa0c147..117aca50bfc 100644 --- a/ppl/src/test/java/org/opensearch/sql/ppl/PPLServiceTest.java +++ b/ppl/src/test/java/org/opensearch/sql/ppl/PPLServiceTest.java @@ -27,7 +27,7 @@ import org.opensearch.sql.executor.QueryService; import org.opensearch.sql.executor.execution.QueryPlanFactory; import org.opensearch.sql.executor.pagination.Cursor; -import org.opensearch.sql.executor.pagination.PaginatedPlanCache; +import org.opensearch.sql.executor.pagination.PlanSerializer; import org.opensearch.sql.ppl.antlr.PPLSyntaxParser; import org.opensearch.sql.ppl.domain.PPLQueryRequest; @@ -49,7 +49,7 @@ public class PPLServiceTest { private ExecutionEngine.Schema schema; @Mock - private PaginatedPlanCache paginatedPlanCache; + private PlanSerializer planSerializer; /** * Setup the test context. 
@@ -59,7 +59,7 @@ public void setUp() { queryManager = DefaultQueryManager.defaultQueryManager(); pplService = new PPLService(new PPLSyntaxParser(), queryManager, - new QueryPlanFactory(queryService, paginatedPlanCache)); + new QueryPlanFactory(queryService, planSerializer)); } @After diff --git a/sql/src/test/java/org/opensearch/sql/sql/SQLServiceTest.java b/sql/src/test/java/org/opensearch/sql/sql/SQLServiceTest.java index 3f24b2bf44c..39c27c5e069 100644 --- a/sql/src/test/java/org/opensearch/sql/sql/SQLServiceTest.java +++ b/sql/src/test/java/org/opensearch/sql/sql/SQLServiceTest.java @@ -30,7 +30,7 @@ import org.opensearch.sql.executor.ExecutionEngine.ExplainResponseNode; import org.opensearch.sql.executor.QueryService; import org.opensearch.sql.executor.execution.QueryPlanFactory; -import org.opensearch.sql.executor.pagination.PaginatedPlanCache; +import org.opensearch.sql.executor.pagination.PlanSerializer; import org.opensearch.sql.sql.antlr.SQLSyntaxParser; import org.opensearch.sql.sql.domain.SQLQueryRequest; @@ -50,13 +50,13 @@ class SQLServiceTest { private QueryService queryService; @Mock - private PaginatedPlanCache paginatedPlanCache; + private PlanSerializer planSerializer; @BeforeEach public void setUp() { queryManager = DefaultQueryManager.defaultQueryManager(); sqlService = new SQLService(new SQLSyntaxParser(), queryManager, - new QueryPlanFactory(queryService, paginatedPlanCache)); + new QueryPlanFactory(queryService, planSerializer)); } @AfterEach From bc8c73fe591a8f992f4046b66acbf3592d073114 Mon Sep 17 00:00:00 2001 From: Yury-Fridlyand Date: Wed, 12 Apr 2023 17:54:57 -0700 Subject: [PATCH 06/17] Remove default implementations of `PushDownRequestBuilder`. 
Signed-off-by: Yury-Fridlyand --- .../request/ContinuePageRequestBuilder.java | 47 +++++++++++++++++++ .../request/InitialPageRequestBuilder.java | 33 +++++++++++++ .../request/PushDownRequestBuilder.java | 35 ++++---------- .../ContinuePageRequestBuilderTest.java | 24 ++++++++++ .../request/PushDownRequestBuilderTest.java | 44 ----------------- 5 files changed, 112 insertions(+), 71 deletions(-) delete mode 100644 opensearch/src/test/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilderTest.java diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java index a0c19c1d0ab..149e7a55414 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java @@ -5,10 +5,21 @@ package org.opensearch.sql.opensearch.request; +import java.util.List; +import java.util.Map; +import java.util.Set; import lombok.Getter; +import org.apache.commons.lang3.tuple.Pair; import org.opensearch.common.unit.TimeValue; +import org.opensearch.index.query.QueryBuilder; +import org.opensearch.search.aggregations.AggregationBuilder; +import org.opensearch.search.sort.SortBuilder; +import org.opensearch.sql.ast.expression.Literal; import org.opensearch.sql.common.setting.Settings; +import org.opensearch.sql.expression.ReferenceExpression; +import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; +import org.opensearch.sql.opensearch.response.agg.OpenSearchAggregationResponseParser; /** * Builds a {@link ContinuePageRequest} to handle subsequent pagination/scroll/cursor requests. 
@@ -37,4 +48,40 @@ public ContinuePageRequestBuilder(OpenSearchRequest.IndexName indexName, public OpenSearchRequest build() { return new ContinuePageRequest(scrollId, scrollTimeout, exprValueFactory); } + + @Override + public void pushDownFilter(QueryBuilder query) { + throw new UnsupportedOperationException("Cursor requests don't support any push down"); + } + + @Override + public void pushDownAggregation(Pair, + OpenSearchAggregationResponseParser> aggregationBuilder) { + throw new UnsupportedOperationException("Cursor requests don't support any push down"); + } + + @Override + public void pushDownSort(List> sortBuilders) { + throw new UnsupportedOperationException("Cursor requests don't support any push down"); + } + + @Override + public void pushDownLimit(Integer limit, Integer offset) { + throw new UnsupportedOperationException("Cursor requests don't support any push down"); + } + + @Override + public void pushDownHighlight(String field, Map arguments) { + throw new UnsupportedOperationException("Cursor requests don't support any push down"); + } + + @Override + public void pushDownProjects(Set projects) { + throw new UnsupportedOperationException("Cursor requests don't support any push down"); + } + + @Override + public void pushTypeMapping(Map typeMapping) { + throw new UnsupportedOperationException("Cursor requests don't support any push down"); + } } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java index 8023a86006c..a44a30bf8d5 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java @@ -7,15 +7,22 @@ import static org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder.DEFAULT_QUERY_TIMEOUT; +import java.util.List; import java.util.Map; import 
java.util.Set; import lombok.Getter; +import org.apache.commons.lang3.tuple.Pair; import org.opensearch.common.unit.TimeValue; +import org.opensearch.index.query.QueryBuilder; +import org.opensearch.search.aggregations.AggregationBuilder; import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.search.sort.SortBuilder; +import org.opensearch.sql.ast.expression.Literal; import org.opensearch.sql.common.setting.Settings; import org.opensearch.sql.expression.ReferenceExpression; import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; +import org.opensearch.sql.opensearch.response.agg.OpenSearchAggregationResponseParser; /** * This builder assists creating the initial OpenSearch paging (scrolling) request. @@ -55,6 +62,32 @@ public OpenSearchScrollRequest build() { return new OpenSearchScrollRequest(indexName, scrollTimeout, sourceBuilder, exprValueFactory); } + @Override + public void pushDownFilter(QueryBuilder query) { + throw new UnsupportedOperationException("Pagination does not support filter (WHERE clause)"); + } + + @Override + public void pushDownAggregation(Pair, + OpenSearchAggregationResponseParser> aggregationBuilder) { + throw new UnsupportedOperationException("Pagination does not support aggregations"); + } + + @Override + public void pushDownSort(List> sortBuilders) { + throw new UnsupportedOperationException("Pagination does not support sort (ORDER BY clause)"); + } + + @Override + public void pushDownLimit(Integer limit, Integer offset) { + throw new UnsupportedOperationException("Pagination does not support limit (LIMIT clause)"); + } + + @Override + public void pushDownHighlight(String field, Map arguments) { + throw new UnsupportedOperationException("Pagination does not support highlight function"); + } + /** * Push down project expression to OpenSearch. 
*/ diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilder.java index ab1805ce4e8..ce088359c21 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilder.java @@ -27,37 +27,18 @@ default boolean isBoolFilterQuery(QueryBuilder current) { return (current instanceof BoolQueryBuilder); } - private String throwUnsupported(String operation) { - return String.format("%s: push down %s in cursor requests is not supported", - getClass().getSimpleName(), operation); - } + void pushDownFilter(QueryBuilder query); - default void pushDownFilter(QueryBuilder query) { - throw new UnsupportedOperationException(throwUnsupported("filter")); - } + void pushDownAggregation(Pair<List<AggregationBuilder>, + OpenSearchAggregationResponseParser> aggregationBuilder); - default void pushDownAggregation( - Pair<List<AggregationBuilder>, OpenSearchAggregationResponseParser> aggregationBuilder) { - throw new UnsupportedOperationException(throwUnsupported("aggregation")); - } + void pushDownSort(List<SortBuilder<?>> sortBuilders); - default void pushDownSort(List<SortBuilder<?>> sortBuilders) { - throw new UnsupportedOperationException(throwUnsupported("sort")); - } + void pushDownLimit(Integer limit, Integer offset); - default void pushDownLimit(Integer limit, Integer offset) { - throw new UnsupportedOperationException(throwUnsupported("limit")); - } + void pushDownHighlight(String field, Map<String, Literal> arguments); - default void pushDownHighlight(String field, Map<String, Literal> arguments) { - throw new UnsupportedOperationException(throwUnsupported("highlight")); - } - - default void pushDownProjects(Set<ReferenceExpression> projects) { - throw new UnsupportedOperationException(throwUnsupported("projects")); - } + void pushDownProjects(Set<ReferenceExpression> projects); - default void pushTypeMapping(Map<String, OpenSearchDataType> typeMapping) { - throw new 
UnsupportedOperationException(throwUnsupported("type mapping")); - } + void pushTypeMapping(Map<String, OpenSearchDataType> typeMapping); } \ No newline at end of file diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilderTest.java index e449126d1ca..354d6e1d7a7 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilderTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilderTest.java @@ -5,9 +5,13 @@ package org.opensearch.sql.opensearch.request; +import static org.junit.jupiter.api.Assertions.assertAll; import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.mockito.Mockito.mock; import static org.mockito.Mockito.when; +import java.util.Map; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.DisplayNameGeneration; import org.junit.jupiter.api.DisplayNameGenerator; @@ -54,4 +58,24 @@ public void build() { public void getIndexName() { assertEquals(indexName, requestBuilder.getIndexName()); } + + @Test + public void pushDown_not_supported() { + assertAll( + () -> assertThrows(UnsupportedOperationException.class, + () -> requestBuilder.pushDownFilter(mock())), + () -> assertThrows(UnsupportedOperationException.class, + () -> requestBuilder.pushDownAggregation(mock())), + () -> assertThrows(UnsupportedOperationException.class, + () -> requestBuilder.pushDownSort(mock())), + () -> assertThrows(UnsupportedOperationException.class, + () -> requestBuilder.pushDownLimit(1, 2)), + () -> assertThrows(UnsupportedOperationException.class, + () -> requestBuilder.pushDownHighlight("", Map.of())), + () -> assertThrows(UnsupportedOperationException.class, + () -> requestBuilder.pushDownProjects(mock())), + () -> assertThrows(UnsupportedOperationException.class, + () -> 
requestBuilder.pushTypeMapping(mock())) + ); + } } diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilderTest.java deleted file mode 100644 index 8112de197ad..00000000000 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilderTest.java +++ /dev/null @@ -1,44 +0,0 @@ -/* - * Copyright OpenSearch Contributors - * SPDX-License-Identifier: Apache-2.0 - */ - - -package org.opensearch.sql.opensearch.request; - -import static org.junit.jupiter.api.Assertions.assertAll; -import static org.junit.jupiter.api.Assertions.assertThrows; -import static org.mockito.Mockito.CALLS_REAL_METHODS; -import static org.mockito.Mockito.mock; -import static org.mockito.Mockito.withSettings; - -import org.junit.jupiter.api.DisplayNameGeneration; -import org.junit.jupiter.api.DisplayNameGenerator; -import org.junit.jupiter.api.Test; - -@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) -public class PushDownRequestBuilderTest { - - @Test - public void throw_unsupported2() { - var builder = mock(PushDownRequestBuilder.class, - withSettings().defaultAnswer(CALLS_REAL_METHODS)); - - assertAll( - () -> assertThrows(UnsupportedOperationException.class, () -> - builder.pushDownFilter(null)), - () -> assertThrows(UnsupportedOperationException.class, () -> - builder.pushDownAggregation(null)), - () -> assertThrows(UnsupportedOperationException.class, () -> - builder.pushDownSort(null)), - () -> assertThrows(UnsupportedOperationException.class, () -> - builder.pushDownLimit(null, null)), - () -> assertThrows(UnsupportedOperationException.class, () -> - builder.pushDownHighlight(null, null)), - () -> assertThrows(UnsupportedOperationException.class, () -> - builder.pushDownProjects(null)), - () -> assertThrows(UnsupportedOperationException.class, () -> - builder.pushTypeMapping(null)) - ); - } -} From 
b9cb0d0deab19c7cd1ba51d3b65226cf434bbd71 Mon Sep 17 00:00:00 2001 From: Max Ksyunz Date: Thu, 13 Apr 2023 18:33:06 -0700 Subject: [PATCH 07/17] Merge paginated plan optimizer into the regular optimizer. (#1516) Merge paginated plan optimizer into the regular optimizer. --------- Signed-off-by: MaxKsyunz Co-authored-by: Yury-Fridlyand --- .../opensearch/sql/executor/QueryService.java | 12 +--- .../sql/planner/logical/LogicalPlanDSL.java | 4 ++ .../sql/planner/logical/LogicalRelation.java | 6 -- .../optimizer/LogicalPlanOptimizer.java | 24 ------- .../rule/CreatePagingTableScanBuilder.java | 59 +++++++++++----- .../planner/optimizer/rule/PushPageSize.java | 60 ---------------- .../sql/executor/QueryServiceTest.java | 6 +- .../execution/ContinuePaginatedPlanTest.java | 2 +- .../executor/execution/PaginatedPlanTest.java | 5 +- .../optimizer/LogicalPlanOptimizerTest.java | 68 +++++++++++++------ .../CreatePagingTableScanBuilderTest.java | 46 +++++++++++++ .../org/opensearch/sql/storage/TableTest.java | 2 +- .../org/opensearch/sql/ppl/StandaloneIT.java | 3 +- .../opensearch/sql/util/StandaloneModule.java | 3 +- .../plugin/config/OpenSearchPluginModule.java | 3 +- 15 files changed, 146 insertions(+), 157 deletions(-) delete mode 100644 core/src/main/java/org/opensearch/sql/planner/optimizer/rule/PushPageSize.java create mode 100644 core/src/test/java/org/opensearch/sql/planner/optimizer/rule/CreatePagingTableScanBuilderTest.java diff --git a/core/src/main/java/org/opensearch/sql/executor/QueryService.java b/core/src/main/java/org/opensearch/sql/executor/QueryService.java index 7870b147558..a4cd1982cd3 100644 --- a/core/src/main/java/org/opensearch/sql/executor/QueryService.java +++ b/core/src/main/java/org/opensearch/sql/executor/QueryService.java @@ -15,9 +15,7 @@ import org.opensearch.sql.common.response.ResponseListener; import org.opensearch.sql.planner.PlanContext; import org.opensearch.sql.planner.Planner; -import org.opensearch.sql.planner.logical.LogicalPaginate; 
import org.opensearch.sql.planner.logical.LogicalPlan; -import org.opensearch.sql.planner.optimizer.LogicalPlanOptimizer; import org.opensearch.sql.planner.physical.PhysicalPlan; /** @@ -30,15 +28,7 @@ public class QueryService { private final ExecutionEngine executionEngine; - /** - * There are two planners, one - to handle pagination requests (cursor/scroll) only and - * another one for everything else. - * @see OpenSearchPluginModule#queryPlanFactory (:plugin module) - * @see LogicalPlanOptimizer#paginationCreate - * @see QueryService - */ private final Planner planner; - private final Planner paginationPlanner; /** * Execute the {@link UnresolvedPlan}, using {@link ResponseListener} to get response. @@ -115,6 +105,6 @@ public LogicalPlan analyze(UnresolvedPlan plan) { * Translate {@link LogicalPlan} to {@link PhysicalPlan}. */ public PhysicalPlan plan(LogicalPlan plan) { - return plan instanceof LogicalPaginate ? paginationPlanner.plan(plan) : planner.plan(plan); + return planner.plan(plan); } } diff --git a/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPlanDSL.java b/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPlanDSL.java index a192966287f..c7e1ced92fc 100644 --- a/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPlanDSL.java +++ b/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPlanDSL.java @@ -54,6 +54,10 @@ public static LogicalPlan rename( return new LogicalRename(input, renameMap); } + public static LogicalPlan paginate(LogicalPlan input, int fetchSize) { + return new LogicalPaginate(fetchSize, List.of(input)); + } + public static LogicalPlan project(LogicalPlan input, NamedExpression... 
fields) { return new LogicalProject(input, Arrays.asList(fields), ImmutableList.of()); } diff --git a/core/src/main/java/org/opensearch/sql/planner/logical/LogicalRelation.java b/core/src/main/java/org/opensearch/sql/planner/logical/LogicalRelation.java index 0ece74690e7..a49c3d5cbe3 100644 --- a/core/src/main/java/org/opensearch/sql/planner/logical/LogicalRelation.java +++ b/core/src/main/java/org/opensearch/sql/planner/logical/LogicalRelation.java @@ -9,7 +9,6 @@ import com.google.common.collect.ImmutableList; import lombok.EqualsAndHashCode; import lombok.Getter; -import lombok.Setter; import lombok.ToString; import org.opensearch.sql.storage.Table; @@ -26,10 +25,6 @@ public class LogicalRelation extends LogicalPlan { @Getter private final Table table; - @Getter - @Setter - private Integer pageSize; - /** * Constructor of LogicalRelation. */ @@ -37,7 +32,6 @@ public LogicalRelation(String relationName, Table table) { super(ImmutableList.of()); this.relationName = relationName; this.table = table; - this.pageSize = null; } @Override diff --git a/core/src/main/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizer.java b/core/src/main/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizer.java index 13bcfabe74d..f2cd4faf17b 100644 --- a/core/src/main/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizer.java +++ b/core/src/main/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizer.java @@ -16,7 +16,6 @@ import org.opensearch.sql.planner.optimizer.rule.CreatePagingTableScanBuilder; import org.opensearch.sql.planner.optimizer.rule.MergeFilterAndFilter; import org.opensearch.sql.planner.optimizer.rule.PushFilterUnderSort; -import org.opensearch.sql.planner.optimizer.rule.PushPageSize; import org.opensearch.sql.planner.optimizer.rule.read.CreateTableScanBuilder; import org.opensearch.sql.planner.optimizer.rule.read.TableScanPushDown; import org.opensearch.sql.planner.optimizer.rule.write.CreateTableWriteBuilder; @@ -53,29 +52,6 @@ 
public static LogicalPlanOptimizer create() { * Phase 2: Transformations that rely on data source push down capability */ new CreateTableScanBuilder(), - TableScanPushDown.PUSH_DOWN_FILTER, - TableScanPushDown.PUSH_DOWN_AGGREGATION, - TableScanPushDown.PUSH_DOWN_SORT, - TableScanPushDown.PUSH_DOWN_LIMIT, - TableScanPushDown.PUSH_DOWN_HIGHLIGHT, - TableScanPushDown.PUSH_DOWN_PROJECT, - new CreateTableWriteBuilder())); - } - - /** - * Create {@link LogicalPlanOptimizer} with pre-defined rules. - */ - public static LogicalPlanOptimizer paginationCreate() { - return new LogicalPlanOptimizer(Arrays.asList( - /* - * Phase 1: Transformations that rely on relational algebra equivalence - */ - new MergeFilterAndFilter(), - new PushFilterUnderSort(), - /* - * Phase 2: Transformations that rely on data source push down capability - */ - new PushPageSize(), new CreatePagingTableScanBuilder(), TableScanPushDown.PUSH_DOWN_FILTER, TableScanPushDown.PUSH_DOWN_AGGREGATION, diff --git a/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/CreatePagingTableScanBuilder.java b/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/CreatePagingTableScanBuilder.java index 22079ed9cac..3785945374d 100644 --- a/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/CreatePagingTableScanBuilder.java +++ b/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/CreatePagingTableScanBuilder.java @@ -5,45 +5,68 @@ package org.opensearch.sql.planner.optimizer.rule; -import static org.opensearch.sql.planner.optimizer.pattern.Patterns.table; - -import com.facebook.presto.matching.Capture; import com.facebook.presto.matching.Captures; import com.facebook.presto.matching.Pattern; +import java.util.ArrayDeque; +import java.util.Deque; +import java.util.List; import lombok.Getter; import lombok.experimental.Accessors; +import org.opensearch.sql.planner.logical.LogicalPaginate; import org.opensearch.sql.planner.logical.LogicalPlan; import 
org.opensearch.sql.planner.logical.LogicalRelation; import org.opensearch.sql.planner.optimizer.Rule; -import org.opensearch.sql.storage.Table; -import org.opensearch.sql.storage.read.TableScanBuilder; /** * Rule to create a paged TableScanBuilder in pagination request. */ -public class CreatePagingTableScanBuilder implements Rule<LogicalRelation> { - /** Capture the table inside matched logical relation operator. */ - private final Capture<Table>
capture; - +public class CreatePagingTableScanBuilder implements Rule<LogicalPaginate> { + /** Parent node of the matched {@code LogicalRelation}. */ + private LogicalPlan relationParent = null; /** Pattern that matches logical relation operator. */ @Accessors(fluent = true) @Getter - private final Pattern<LogicalRelation> pattern; + private final Pattern<LogicalPaginate> pattern; /** * Constructor. */ public CreatePagingTableScanBuilder() { - this.capture = Capture.newCapture(); - this.pattern = Pattern.typeOf(LogicalRelation.class) - .with(table().capturedAs(capture)); + this.pattern = Pattern.typeOf(LogicalPaginate.class).matching(this::findLogicalRelation); + } + + /** + * Finds a {@link LogicalRelation} instance and saves a reference to its parent + * in the {@code relationParent} field. + * @param logicalPaginate An instance of LogicalPaginate + * @return true if a {@link LogicalRelation} node was found among the descendants of + * the given {@code logicalPaginate}, false otherwise. + */ + private boolean findLogicalRelation(LogicalPaginate logicalPaginate) { + Deque<LogicalPlan> plans = new ArrayDeque<>(); + plans.add(logicalPaginate); + do { + final var plan = plans.removeFirst(); + final var children = plan.getChild(); + if (children.stream().anyMatch(LogicalRelation.class::isInstance)) { + if (children.size() > 1) { + throw new UnsupportedOperationException( + "Unsupported plan: relation operator cannot have siblings"); + } + relationParent = plan; + return true; + } + plans.addAll(children); + } while (!plans.isEmpty()); + return false; } + @Override - public LogicalPlan apply(LogicalRelation plan, Captures captures) { - TableScanBuilder scanBuilder = captures.get(capture) - .createPagedScanBuilder(plan.getPageSize()); - // TODO: Remove this after Prometheus refactored to new table scan builder too - return (scanBuilder == null) ? 
plan : scanBuilder; + public LogicalPlan apply(LogicalPaginate plan, Captures captures) { + var logicalRelation = (LogicalRelation) relationParent.getChild().get(0); + var scan = logicalRelation.getTable().createPagedScanBuilder(plan.getPageSize()); + relationParent.replaceChildPlans(List.of(scan)); + + return plan; } } diff --git a/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/PushPageSize.java b/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/PushPageSize.java deleted file mode 100644 index 95cd23d6ca0..00000000000 --- a/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/PushPageSize.java +++ /dev/null @@ -1,60 +0,0 @@ -/* - * Copyright OpenSearch Contributors - * SPDX-License-Identifier: Apache-2.0 - */ - -package org.opensearch.sql.planner.optimizer.rule; - -import static org.opensearch.sql.planner.optimizer.pattern.Patterns.pagination; - -import com.facebook.presto.matching.Capture; -import com.facebook.presto.matching.Captures; -import com.facebook.presto.matching.Pattern; -import lombok.Getter; -import lombok.experimental.Accessors; -import org.opensearch.sql.planner.logical.LogicalPaginate; -import org.opensearch.sql.planner.logical.LogicalPlan; -import org.opensearch.sql.planner.logical.LogicalRelation; -import org.opensearch.sql.planner.optimizer.Rule; - -public class PushPageSize - implements Rule<LogicalPaginate> { - /** Capture the page size inside matched logical paginate operator. */ - private final Capture<Integer> capture; - - /** Pattern that matches logical paginate operator. */ - @Accessors(fluent = true) - @Getter - private final Pattern<LogicalPaginate> pattern; - - /** - * Constructor. - */ - public PushPageSize() { - this.capture = Capture.newCapture(); - this.pattern = Pattern.typeOf(LogicalPaginate.class) - .with(pagination().capturedAs(capture)); - } - - private LogicalRelation findLogicalRelation(LogicalPlan plan) { //TODO TBD multiple relations? 
- for (var subplan : plan.getChild()) { - if (subplan instanceof LogicalRelation) { - return (LogicalRelation) subplan; - } - var found = findLogicalRelation(subplan); - if (found != null) { - return found; - } - } - return null; - } - - @Override - public LogicalPlan apply(LogicalPaginate plan, Captures captures) { - var relation = findLogicalRelation(plan); - if (relation != null) { - relation.setPageSize(captures.get(capture)); - } - return plan; - } -} diff --git a/core/src/test/java/org/opensearch/sql/executor/QueryServiceTest.java b/core/src/test/java/org/opensearch/sql/executor/QueryServiceTest.java index e3e744d8ec4..525de79afca 100644 --- a/core/src/test/java/org/opensearch/sql/executor/QueryServiceTest.java +++ b/core/src/test/java/org/opensearch/sql/executor/QueryServiceTest.java @@ -46,9 +46,6 @@ class QueryServiceTest { @Mock private Planner planner; - @Mock - private Planner paginationPlanner; - @Mock private UnresolvedPlan ast; @@ -120,9 +117,8 @@ class Helper { public Helper() { lenient().when(analyzer.analyze(any(), any())).thenReturn(logicalPlan); lenient().when(planner.plan(any())).thenReturn(plan); - lenient().when(paginationPlanner.plan(any())).thenReturn(plan); - queryService = new QueryService(analyzer, executionEngine, planner, paginationPlanner); + queryService = new QueryService(analyzer, executionEngine, planner); } Helper executeSuccess() { diff --git a/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java b/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java index e0289872b7a..1e5cb0b214a 100644 --- a/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java +++ b/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java @@ -46,7 +46,7 @@ public static void setUp() { when(storageEngine.getTableScan(anyString(), anyString())) .thenReturn(mock(TableScanOperator.class)); planSerializer = new 
PlanSerializer(storageEngine); - queryService = new QueryService(null, new DefaultExecutionEngine(), null, null); + queryService = new QueryService(null, new DefaultExecutionEngine(), null); } @Test diff --git a/core/src/test/java/org/opensearch/sql/executor/execution/PaginatedPlanTest.java b/core/src/test/java/org/opensearch/sql/executor/execution/PaginatedPlanTest.java index ac1a3fb7e83..495dcbb050e 100644 --- a/core/src/test/java/org/opensearch/sql/executor/execution/PaginatedPlanTest.java +++ b/core/src/test/java/org/opensearch/sql/executor/execution/PaginatedPlanTest.java @@ -26,7 +26,6 @@ import org.opensearch.sql.executor.QueryService; import org.opensearch.sql.planner.Planner; import org.opensearch.sql.planner.logical.LogicalPaginate; -import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.physical.PhysicalPlan; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) @@ -43,7 +42,7 @@ public static void setUp() { when(analyzer.analyze(any(), any())).thenReturn(mock(LogicalPaginate.class)); var planner = mock(Planner.class); when(planner.plan(any())).thenReturn(mock(PhysicalPlan.class)); - queryService = new QueryService(analyzer, new DefaultExecutionEngine(), null, planner); + queryService = new QueryService(analyzer, new DefaultExecutionEngine(), planner); } @Test @@ -79,7 +78,7 @@ public void onFailure(Exception e) { } }; var plan = new PaginatedPlan(QueryId.queryId(), mock(UnresolvedPlan.class), 10, - new QueryService(null, new DefaultExecutionEngine(), null, null), listener); + new QueryService(null, new DefaultExecutionEngine(), null), listener); plan.execute(); } diff --git a/core/src/test/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizerTest.java b/core/src/test/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizerTest.java index 1ee9b9aa3b1..aae05f9da41 100644 --- a/core/src/test/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizerTest.java +++ 
b/core/src/test/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizerTest.java @@ -7,11 +7,12 @@ package org.opensearch.sql.planner.optimizer; import static org.junit.jupiter.api.Assertions.assertEquals; -import static org.junit.jupiter.api.Assertions.assertNull; import static org.junit.jupiter.api.Assertions.assertThrows; import static org.mockito.ArgumentMatchers.any; import static org.mockito.ArgumentMatchers.anyInt; import static org.mockito.Mockito.lenient; +import static org.mockito.Mockito.never; +import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; import static org.opensearch.sql.data.model.ExprValueUtils.integerValue; import static org.opensearch.sql.data.model.ExprValueUtils.longValue; @@ -21,6 +22,7 @@ import static org.opensearch.sql.planner.logical.LogicalPlanDSL.filter; import static org.opensearch.sql.planner.logical.LogicalPlanDSL.highlight; import static org.opensearch.sql.planner.logical.LogicalPlanDSL.limit; +import static org.opensearch.sql.planner.logical.LogicalPlanDSL.paginate; import static org.opensearch.sql.planner.logical.LogicalPlanDSL.project; import static org.opensearch.sql.planner.logical.LogicalPlanDSL.relation; import static org.opensearch.sql.planner.logical.LogicalPlanDSL.sort; @@ -47,6 +49,8 @@ import org.opensearch.sql.planner.logical.LogicalPaginate; import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.logical.LogicalRelation; +import org.opensearch.sql.planner.optimizer.rule.CreatePagingTableScanBuilder; +import org.opensearch.sql.planner.optimizer.rule.read.CreateTableScanBuilder; import org.opensearch.sql.planner.physical.PhysicalPlan; import org.opensearch.sql.storage.Table; import org.opensearch.sql.storage.read.TableScanBuilder; @@ -62,9 +66,13 @@ class LogicalPlanOptimizerTest { @Spy private TableScanBuilder tableScanBuilder; + @Spy + private TableScanBuilder pagedTableScanBuilder; + @BeforeEach void setUp() { 
lenient().when(table.createScanBuilder()).thenReturn(tableScanBuilder); + lenient().when(table.createPagedScanBuilder(anyInt())).thenReturn(pagedTableScanBuilder); } /** @@ -313,48 +321,64 @@ public PhysicalPlan implement(LogicalPlan plan) { @Test void paged_table_scan_builder_support_project_push_down_can_apply_its_rule() { - when(tableScanBuilder.pushDownProject(any())).thenReturn(true); - when(table.createPagedScanBuilder(anyInt())).thenReturn(tableScanBuilder); - var relation = new LogicalRelation("schema", table); - relation.setPageSize(anyInt()); + var relation = relation("schema", table); assertEquals( - tableScanBuilder, - LogicalPlanOptimizer.paginationCreate().optimize(project(relation)) - ); + paginate(project(pagedTableScanBuilder), 4), + LogicalPlanOptimizer.create().optimize(paginate(project(relation), 4))); } - @Test - void push_page_size() { - var relation = new LogicalRelation("schema", table); - var paginate = new LogicalPaginate(42, List.of(project(relation))); - assertNull(relation.getPageSize()); - LogicalPlanOptimizer.paginationCreate().optimize(paginate); - assertEquals(42, relation.getPageSize()); - } @Test void push_page_size_noop_if_no_relation() { var paginate = new LogicalPaginate(42, List.of(project(values()))); - LogicalPlanOptimizer.paginationCreate().optimize(paginate); + assertEquals(paginate, LogicalPlanOptimizer.create().optimize(paginate)); + } + + @Test + void pagination_optimizer_simple_query() { + var projectPlan = project(relation("schema", table), DSL.named(DSL.ref("intV", INTEGER))); + + var optimizer = new LogicalPlanOptimizer( + List.of(new CreateTableScanBuilder(), new CreatePagingTableScanBuilder())); + + { + optimizer.optimize(projectPlan); + verify(table).createScanBuilder(); + verify(table, never()).createPagedScanBuilder(anyInt()); + } + } + + @Test + void pagination_optimizer_paged_query() { + var relation = new LogicalRelation("schema", table); + var projectPlan = project(relation, DSL.named(DSL.ref("intV", 
INTEGER))); + var pagedPlan = new LogicalPaginate(10, List.of(projectPlan)); + + var optimizer = new LogicalPlanOptimizer( + List.of(new CreateTableScanBuilder(), new CreatePagingTableScanBuilder())); + var optimized = optimizer.optimize(pagedPlan); + verify(table).createPagedScanBuilder(anyInt()); } @Test void push_page_size_noop_if_no_sub_plans() { var paginate = new LogicalPaginate(42, List.of()); - LogicalPlanOptimizer.paginationCreate().optimize(paginate); + assertEquals(paginate, + LogicalPlanOptimizer.create().optimize(paginate)); } @Test void table_scan_builder_support_offset_push_down_can_apply_its_rule() { - when(table.createPagedScanBuilder(anyInt())).thenReturn(tableScanBuilder); + when(table.createPagedScanBuilder(anyInt())).thenReturn(pagedTableScanBuilder); - var optimized = LogicalPlanOptimizer.paginationCreate() - .optimize(new LogicalPaginate(42, List.of(project(relation("schema", table))))); + var relation = new LogicalRelation("schema", table); + var optimized = LogicalPlanOptimizer.create() + .optimize(new LogicalPaginate(42, List.of(project(relation)))); // `optimized` structure: LogicalPaginate -> LogicalProject -> TableScanBuilder // LogicalRelation replaced by a TableScanBuilder instance - assertEquals(tableScanBuilder, optimized.getChild().get(0).getChild().get(0)); + assertEquals(paginate(project(pagedTableScanBuilder), 42), optimized); } private LogicalPlan optimize(LogicalPlan plan) { diff --git a/core/src/test/java/org/opensearch/sql/planner/optimizer/rule/CreatePagingTableScanBuilderTest.java b/core/src/test/java/org/opensearch/sql/planner/optimizer/rule/CreatePagingTableScanBuilderTest.java new file mode 100644 index 00000000000..79c7b55c60b --- /dev/null +++ b/core/src/test/java/org/opensearch/sql/planner/optimizer/rule/CreatePagingTableScanBuilderTest.java @@ -0,0 +1,46 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.planner.optimizer.rule; + +import static 
com.facebook.presto.matching.DefaultMatcher.DEFAULT_MATCHER; +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.mockito.Mockito.when; +import static org.opensearch.sql.planner.logical.LogicalPlanDSL.paginate; +import static org.opensearch.sql.planner.logical.LogicalPlanDSL.relation; + +import java.util.List; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.sql.planner.logical.LogicalPlan; +import org.opensearch.sql.storage.Table; + +@ExtendWith(MockitoExtension.class) +class CreatePagingTableScanBuilderTest { + + @Mock + LogicalPlan multiRelationPaginate; + + @Mock + Table table; + + @BeforeEach + public void setUp() { + when(multiRelationPaginate.getChild()) + .thenReturn( + List.of(relation("t1", table), relation("t2", table))); + } + + @Test + void throws_when_multiple_children() { + final var pattern = new CreatePagingTableScanBuilder().pattern(); + final var plan = paginate(multiRelationPaginate, 42); + assertThrows(UnsupportedOperationException.class, + () -> DEFAULT_MATCHER.match(pattern, plan)); + } +} diff --git a/core/src/test/java/org/opensearch/sql/storage/TableTest.java b/core/src/test/java/org/opensearch/sql/storage/TableTest.java index 2a2b5550145..a96ee71af0b 100644 --- a/core/src/test/java/org/opensearch/sql/storage/TableTest.java +++ b/core/src/test/java/org/opensearch/sql/storage/TableTest.java @@ -20,6 +20,6 @@ public class TableTest { @Test public void createPagedScanBuilder_throws() { var table = mock(Table.class, withSettings().defaultAnswer(InvocationOnMock::callRealMethod)); - assertThrows(Throwable.class, () -> table.createPagedScanBuilder(0)); + assertThrows(Throwable.class, () -> table.createPagedScanBuilder(4)); } } diff --git a/integ-test/src/test/java/org/opensearch/sql/ppl/StandaloneIT.java 
b/integ-test/src/test/java/org/opensearch/sql/ppl/StandaloneIT.java index 8664c7c9484..74613ee5b16 100644 --- a/integ-test/src/test/java/org/opensearch/sql/ppl/StandaloneIT.java +++ b/integ-test/src/test/java/org/opensearch/sql/ppl/StandaloneIT.java @@ -240,8 +240,7 @@ public QueryPlanFactory queryPlanFactory(ExecutionEngine executionEngine, new Analyzer( new ExpressionAnalyzer(functionRepository), dataSourceService, functionRepository); Planner planner = new Planner(LogicalPlanOptimizer.create()); - Planner paginationPlanner = new Planner(LogicalPlanOptimizer.paginationCreate()); - QueryService queryService = new QueryService(analyzer, executionEngine, planner, paginationPlanner); + QueryService queryService = new QueryService(analyzer, executionEngine, planner); return new QueryPlanFactory(queryService, planSerializer); } } diff --git a/integ-test/src/test/java/org/opensearch/sql/util/StandaloneModule.java b/integ-test/src/test/java/org/opensearch/sql/util/StandaloneModule.java index a5d8a19aeb2..a86f2513771 100644 --- a/integ-test/src/test/java/org/opensearch/sql/util/StandaloneModule.java +++ b/integ-test/src/test/java/org/opensearch/sql/util/StandaloneModule.java @@ -117,7 +117,6 @@ public QueryService queryService(ExecutionEngine executionEngine) { new Analyzer( new ExpressionAnalyzer(functionRepository), dataSourceService, functionRepository); Planner planner = new Planner(LogicalPlanOptimizer.create()); - Planner paginationPlanner = new Planner(LogicalPlanOptimizer.paginationCreate()); - return new QueryService(analyzer, executionEngine, planner, paginationPlanner); + return new QueryService(analyzer, executionEngine, planner); } } diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/config/OpenSearchPluginModule.java b/plugin/src/main/java/org/opensearch/sql/plugin/config/OpenSearchPluginModule.java index 7461b7df3b1..b80cb3faab5 100644 --- a/plugin/src/main/java/org/opensearch/sql/plugin/config/OpenSearchPluginModule.java +++ 
b/plugin/src/main/java/org/opensearch/sql/plugin/config/OpenSearchPluginModule.java @@ -106,9 +106,8 @@ public QueryPlanFactory queryPlanFactory(DataSourceService dataSourceService, new Analyzer( new ExpressionAnalyzer(functionRepository), dataSourceService, functionRepository); Planner planner = new Planner(LogicalPlanOptimizer.create()); - Planner paginationPlanner = new Planner(LogicalPlanOptimizer.paginationCreate()); QueryService queryService = new QueryService( - analyzer, executionEngine, planner, paginationPlanner); + analyzer, executionEngine, planner); return new QueryPlanFactory(queryService, planSerializer); } } From 9a1a17cfbe5bf45a1a1776ee621bd81064b6244f Mon Sep 17 00:00:00 2001 From: Yury-Fridlyand Date: Fri, 14 Apr 2023 15:38:02 -0700 Subject: [PATCH 08/17] Complete rework on serialization and deserialization. (#1498) Signed-off-by: Yury-Fridlyand --- .../sql/exception/NoCursorException.java | 13 + .../sql/executor/pagination/Cursor.java | 16 +- .../executor/pagination/PlanSerializer.java | 176 +++---- .../sql/planner/SerializablePlan.java | 70 +++ .../planner/physical/PaginateOperator.java | 32 +- .../sql/planner/physical/PhysicalPlan.java | 23 +- .../sql/planner/physical/ProjectOperator.java | 45 +- .../opensearch/sql/storage/StorageEngine.java | 7 - .../execution/ContinuePaginatedPlanTest.java | 17 +- .../sql/executor/pagination/CursorTest.java | 4 +- .../pagination/PlanSerializerTest.java | 485 +++++------------- .../sql/planner/SerializablePlanTest.java | 39 ++ .../physical/PaginateOperatorTest.java | 17 +- .../planner/physical/PhysicalPlanTest.java | 15 - .../planner/physical/ProjectOperatorTest.java | 72 ++- .../sql/storage/StorageEngineTest.java | 7 - .../value/OpenSearchExprValueFactory.java | 3 +- .../protector/ResourceMonitorPlan.java | 8 +- .../request/ContinuePageRequestBuilder.java | 1 + .../request/InitialPageRequestBuilder.java | 4 +- .../setting/OpenSearchSettings.java | 4 +- .../storage/OpenSearchStorageEngine.java | 20 +- 
...OpenSearchIndexScanAggregationBuilder.java | 2 +- .../scan/OpenSearchIndexScanQueryBuilder.java | 2 +- .../scan/OpenSearchPagedIndexScan.java | 53 +- .../script/ExpressionScriptEngine.java | 2 +- .../aggregation/AggregationQueryBuilder.java | 2 +- .../dsl/AggregationBuilderHelper.java | 2 +- .../dsl/BucketAggregationBuilder.java | 2 +- .../dsl/MetricAggregationBuilder.java | 2 +- .../script/filter/FilterQueryBuilder.java | 2 +- .../DefaultExpressionSerializer.java | 2 +- .../serialization/ExpressionSerializer.java | 2 +- .../OpenSearchExecutionEngineTest.java | 19 +- .../executor/ResourceMonitorPlanTest.java | 10 +- .../storage/OpenSearchStorageEngineTest.java | 25 - .../scan/OpenSearchPagedIndexScanTest.java | 71 ++- .../script/ExpressionScriptEngineTest.java | 2 +- .../AggregationQueryBuilderTest.java | 2 +- .../dsl/BucketAggregationBuilderTest.java | 2 +- .../dsl/MetricAggregationBuilderTest.java | 2 +- .../script/filter/FilterQueryBuilderTest.java | 2 +- .../DefaultExpressionSerializerTest.java | 2 - .../org/opensearch/sql/plugin/SQLPlugin.java | 2 +- .../format/JdbcResponseFormatterTest.java | 2 +- 45 files changed, 607 insertions(+), 685 deletions(-) create mode 100644 core/src/main/java/org/opensearch/sql/exception/NoCursorException.java create mode 100644 core/src/main/java/org/opensearch/sql/planner/SerializablePlan.java create mode 100644 core/src/test/java/org/opensearch/sql/planner/SerializablePlanTest.java rename {core/src/main/java/org/opensearch/sql/expression => opensearch/src/main/java/org/opensearch/sql/opensearch/storage}/serialization/DefaultExpressionSerializer.java (95%) rename {core/src/main/java/org/opensearch/sql/expression => opensearch/src/main/java/org/opensearch/sql/opensearch/storage}/serialization/ExpressionSerializer.java (90%) rename {core/src/test/java/org/opensearch/sql/expression => opensearch/src/test/java/org/opensearch/sql/opensearch/storage}/serialization/DefaultExpressionSerializerTest.java (94%) diff --git 
a/core/src/main/java/org/opensearch/sql/exception/NoCursorException.java b/core/src/main/java/org/opensearch/sql/exception/NoCursorException.java new file mode 100644 index 00000000000..9383bece573 --- /dev/null +++ b/core/src/main/java/org/opensearch/sql/exception/NoCursorException.java @@ -0,0 +1,13 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.exception; + +/** + * This should be thrown on serialization of a PhysicalPlan tree if paging is finished. + * Processing of such an exception should result in responding with no cursor to the user. + */ +public class NoCursorException extends RuntimeException { +} diff --git a/core/src/main/java/org/opensearch/sql/executor/pagination/Cursor.java b/core/src/main/java/org/opensearch/sql/executor/pagination/Cursor.java index 0339bec9cad..bb320f5c670 100644 --- a/core/src/main/java/org/opensearch/sql/executor/pagination/Cursor.java +++ b/core/src/main/java/org/opensearch/sql/executor/pagination/Cursor.java @@ -7,23 +7,17 @@ import lombok.EqualsAndHashCode; import lombok.Getter; +import lombok.RequiredArgsConstructor; @EqualsAndHashCode +@RequiredArgsConstructor public class Cursor { - public static final Cursor None = new Cursor(); + public static final Cursor None = new Cursor(null); @Getter - private final byte[] raw; - - private Cursor() { - raw = new byte[] {}; - } - - public Cursor(byte[] raw) { - this.raw = raw; - } + private final String data; public String toString() { - return new String(raw); + return data; } } diff --git a/core/src/main/java/org/opensearch/sql/executor/pagination/PlanSerializer.java b/core/src/main/java/org/opensearch/sql/executor/pagination/PlanSerializer.java index d9915e2b8dc..d6d10ee89cf 100644 --- a/core/src/main/java/org/opensearch/sql/executor/pagination/PlanSerializer.java +++ b/core/src/main/java/org/opensearch/sql/executor/pagination/PlanSerializer.java @@ -9,19 +9,20 @@ import java.io.ByteArrayInputStream; import
java.io.ByteArrayOutputStream; import java.io.IOException; -import java.util.ArrayList; -import java.util.List; +import java.io.InputStream; +import java.io.NotSerializableException; +import java.io.ObjectInputStream; +import java.io.ObjectOutputStream; +import java.io.Serializable; +import java.util.zip.Deflater; import java.util.zip.GZIPInputStream; import java.util.zip.GZIPOutputStream; import lombok.RequiredArgsConstructor; import org.opensearch.sql.ast.tree.UnresolvedPlan; -import org.opensearch.sql.expression.NamedExpression; -import org.opensearch.sql.expression.serialization.DefaultExpressionSerializer; -import org.opensearch.sql.planner.physical.PaginateOperator; +import org.opensearch.sql.exception.NoCursorException; +import org.opensearch.sql.planner.SerializablePlan; import org.opensearch.sql.planner.physical.PhysicalPlan; -import org.opensearch.sql.planner.physical.ProjectOperator; import org.opensearch.sql.storage.StorageEngine; -import org.opensearch.sql.storage.TableScanOperator; /** * This class is entry point to paged requests. It is responsible to cursor serialization @@ -30,132 +31,101 @@ @RequiredArgsConstructor public class PlanSerializer { public static final String CURSOR_PREFIX = "n:"; - private final StorageEngine storageEngine; + + private final StorageEngine engine; public boolean canConvertToCursor(UnresolvedPlan plan) { return plan.accept(new CanPaginateVisitor(), null); } /** - * Converts a physical plan tree to a cursor. May cache plan related data somewhere. + * Converts a physical plan tree to a cursor. 
*/ - public Cursor convertToCursor(PhysicalPlan plan) throws IOException { - if (plan instanceof PaginateOperator) { - var cursor = plan.toCursor(); - if (cursor == null) { - return Cursor.None; - } - var raw = CURSOR_PREFIX + compress(cursor); - return new Cursor(raw.getBytes()); + public Cursor convertToCursor(PhysicalPlan plan) { + try { + return new Cursor(CURSOR_PREFIX + + serialize(((SerializablePlan) plan).getPlanForSerialization())); + // ClassCastException thrown when a plan in the tree doesn't implement SerializablePlan + } catch (NotSerializableException | ClassCastException | NoCursorException e) { + return Cursor.None; } - return Cursor.None; } /** - * Compress serialized query plan. - * @param str string representing a query plan - * @return str compressed with gzip. + * Serializes and compresses the object. + * @param object The object. + * @return Encoded binary data. */ - String compress(String str) throws IOException { - if (str == null || str.length() == 0) { - return ""; + protected String serialize(Serializable object) throws NotSerializableException { + try { + ByteArrayOutputStream output = new ByteArrayOutputStream(); + ObjectOutputStream objectOutput = new ObjectOutputStream(output); + objectOutput.writeObject(object); + objectOutput.flush(); + + ByteArrayOutputStream out = new ByteArrayOutputStream(); + // GZIP provides 35-45%, lzma from apache commons-compress has few % better compression + GZIPOutputStream gzip = new GZIPOutputStream(out) { { + this.def.setLevel(Deflater.BEST_COMPRESSION); + } }; + gzip.write(output.toByteArray()); + gzip.close(); + + return HashCode.fromBytes(out.toByteArray()).toString(); + } catch (NotSerializableException e) { + throw e; + } catch (IOException e) { + throw new IllegalStateException("Failed to serialize: " + object, e); } - ByteArrayOutputStream out = new ByteArrayOutputStream(); - GZIPOutputStream gzip = new GZIPOutputStream(out); - gzip.write(str.getBytes()); - gzip.close(); - return 
HashCode.fromBytes(out.toByteArray()).toString(); } /** - * Decompresses a query plan that was compress with {@link PlanSerializer#compress}. - * @param input compressed query plan - * @return decompressed string + * Decompresses and deserializes the binary data. + * @param code Encoded binary data. + * @return An object. */ - String decompress(String input) throws IOException { - if (input == null || input.length() == 0) { - return ""; + protected Serializable deserialize(String code) { + try { + GZIPInputStream gzip = new GZIPInputStream( + new ByteArrayInputStream(HashCode.fromString(code).asBytes())); + ObjectInputStream objectInput = new CursorDeserializationStream( + new ByteArrayInputStream(gzip.readAllBytes())); + return (Serializable) objectInput.readObject(); + } catch (Exception e) { + throw new IllegalStateException("Failed to deserialize object", e); } - GZIPInputStream gzip = new GZIPInputStream(new ByteArrayInputStream( - HashCode.fromString(input).asBytes())); - return new String(gzip.readAllBytes()); } /** - * Parse `NamedExpression`s from cursor. - * @param listToFill List to fill with data. - * @param cursor Cursor to parse. - * @return Remaining part of the cursor. + * Converts a cursor to a physical plan tree. */ - private String parseNamedExpressions(List listToFill, String cursor) { - var serializer = new DefaultExpressionSerializer(); - if (cursor.startsWith(")")) { //empty list - return cursor.substring(cursor.indexOf(',') + 1); - } - while (!cursor.startsWith("(")) { - listToFill.add((NamedExpression) - serializer.deserialize(cursor.substring(0, - Math.min(cursor.indexOf(','), cursor.indexOf(')'))))); - cursor = cursor.substring(cursor.indexOf(',') + 1); - } - return cursor; - } - - /** - * Converts a cursor to a physical plan tree. 
- */ public PhysicalPlan convertToPlan(String cursor) { if (!cursor.startsWith(CURSOR_PREFIX)) { throw new UnsupportedOperationException("Unsupported cursor"); } try { - cursor = cursor.substring(CURSOR_PREFIX.length()); - cursor = decompress(cursor); - - // TODO Parse with ANTLR or serialize as JSON/XML - if (!cursor.startsWith("(Paginate,")) { - throw new UnsupportedOperationException("Unsupported cursor"); - } - // TODO add checks for > 0 - cursor = cursor.substring(cursor.indexOf(',') + 1); - final int currentPageIndex = Integer.parseInt(cursor, 0, cursor.indexOf(','), 10); - - cursor = cursor.substring(cursor.indexOf(',') + 1); - final int pageSize = Integer.parseInt(cursor, 0, cursor.indexOf(','), 10); - - cursor = cursor.substring(cursor.indexOf(',') + 1); - if (!cursor.startsWith("(Project,")) { - throw new UnsupportedOperationException("Unsupported cursor"); - } - cursor = cursor.substring(cursor.indexOf(',') + 1); - if (!cursor.startsWith("(namedParseExpressions,")) { - throw new UnsupportedOperationException("Unsupported cursor"); - } - - cursor = cursor.substring(cursor.indexOf(',') + 1); - List namedParseExpressions = new ArrayList<>(); - cursor = parseNamedExpressions(namedParseExpressions, cursor); + return (PhysicalPlan) deserialize(cursor.substring(CURSOR_PREFIX.length())); + } catch (Exception e) { + throw new UnsupportedOperationException("Unsupported cursor", e); + } + } - List projectList = new ArrayList<>(); - if (!cursor.startsWith("(projectList,")) { - throw new UnsupportedOperationException("Unsupported cursor"); - } - cursor = cursor.substring(cursor.indexOf(',') + 1); - cursor = parseNamedExpressions(projectList, cursor); + /** + * This function is used in testing only, to get access to {@link CursorDeserializationStream}. 
+ */ + public CursorDeserializationStream getCursorDeserializationStream(InputStream in) + throws IOException { + return new CursorDeserializationStream(in); + } - if (!cursor.startsWith("(OpenSearchPagedIndexScan,")) { - throw new UnsupportedOperationException("Unsupported cursor"); - } - cursor = cursor.substring(cursor.indexOf(',') + 1); - var indexName = cursor.substring(0, cursor.indexOf(',')); - cursor = cursor.substring(cursor.indexOf(',') + 1); - var scrollId = cursor.substring(0, cursor.indexOf(')')); - TableScanOperator scan = storageEngine.getTableScan(indexName, scrollId); + public class CursorDeserializationStream extends ObjectInputStream { + public CursorDeserializationStream(InputStream in) throws IOException { + super(in); + } - return new PaginateOperator(new ProjectOperator(scan, projectList, namedParseExpressions), - pageSize, currentPageIndex); - } catch (Exception e) { - throw new UnsupportedOperationException("Unsupported cursor", e); + @Override + public Object resolveObject(Object obj) throws IOException { + return obj.equals("engine") ? engine : obj; } } } diff --git a/core/src/main/java/org/opensearch/sql/planner/SerializablePlan.java b/core/src/main/java/org/opensearch/sql/planner/SerializablePlan.java new file mode 100644 index 00000000000..220408b67d5 --- /dev/null +++ b/core/src/main/java/org/opensearch/sql/planner/SerializablePlan.java @@ -0,0 +1,70 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.planner; + +import java.io.Externalizable; +import java.io.IOException; +import java.io.ObjectInput; +import java.io.ObjectOutput; +import org.apache.commons.lang3.NotImplementedException; +import org.opensearch.sql.executor.pagination.PlanSerializer; + +/** + * All subtypes of PhysicalPlan which needs to be serialized (in cursor, for pagination feature) + * should follow one of the following options. + *
+ * <ul>
+ *   <li>Both:
+ *   <ul>
+ *     <li>Override both methods from {@link Externalizable}.</li>
+ *     <li>Define a public no-arg constructor.</li>
+ *   </ul>
+ *   </li>
+ *   <li>Overwrite {@link #getPlanForSerialization} to return
+ *   another instance of {@link SerializablePlan}.</li>
+ * </ul>
+ */
+public interface SerializablePlan extends Externalizable {
+
+  /**
+   * Argument is an instance of {@link PlanSerializer.CursorDeserializationStream}.
+   */
+  @Override
+  default void readExternal(ObjectInput in) throws IOException, ClassNotFoundException {
+    throw new NotImplementedException(String.format("`readExternal` is not implemented in %s",
+        getClass().getSimpleName()));
+  }
+
+  /**
+   * Each plan which has a child plan should serialize the child as follows:
+   * <pre>{@code
+   * out.writeObject(input.getPlanForSerialization());
+   * }</pre>
+   */
+  @Override
+  default void writeExternal(ObjectOutput out) throws IOException {
+    throw new NotImplementedException(String.format("`writeExternal` is not implemented in %s",
+        getClass().getSimpleName()));
+  }
+
+  /**
+   * Override to return the child or delegated plan, so the parent plan skips this one
+   * for serialization but still tries to serialize the grandchild plan.
+   * Imagine a plan structure like this
+   * <pre>
+   *    A         -> this
+   *    `- B      -> child
+   *      `- C    -> this
+   * </pre>
+ * In that case only plans A and C should be attempted to serialize. + * It is needed to skip a `ResourceMonitorPlan` instance only, actually. + * @return Next plan for serialization. + */ + default SerializablePlan getPlanForSerialization() { + return this; + } +} diff --git a/core/src/main/java/org/opensearch/sql/planner/physical/PaginateOperator.java b/core/src/main/java/org/opensearch/sql/planner/physical/PaginateOperator.java index 97901def0fe..7601f7006aa 100644 --- a/core/src/main/java/org/opensearch/sql/planner/physical/PaginateOperator.java +++ b/core/src/main/java/org/opensearch/sql/planner/physical/PaginateOperator.java @@ -11,13 +11,11 @@ import lombok.RequiredArgsConstructor; import org.opensearch.sql.data.model.ExprValue; import org.opensearch.sql.executor.ExecutionEngine; -import org.opensearch.sql.planner.physical.PhysicalPlan; -import org.opensearch.sql.planner.physical.PhysicalPlanNodeVisitor; -import org.opensearch.sql.planner.physical.ProjectOperator; +import org.opensearch.sql.planner.SerializablePlan; -@RequiredArgsConstructor @EqualsAndHashCode(callSuper = false) -public class PaginateOperator extends PhysicalPlan { +@RequiredArgsConstructor +public class PaginateOperator extends PhysicalPlan implements SerializablePlan { @Getter private final PhysicalPlan input; @@ -30,17 +28,17 @@ public class PaginateOperator extends PhysicalPlan { * See usage. */ @Getter - private final int pageIndex; + private int pageIndex = 0; - int numReturned = 0; + private int numReturned = 0; /** - * Page given physical plan, with pageSize elements per page, starting with the first page. + * Page given physical plan, with pageSize elements per page, starting with the given page. 
*/ - public PaginateOperator(PhysicalPlan input, int pageSize) { + public PaginateOperator(PhysicalPlan input, int pageSize, int pageIndex) { this.pageSize = pageSize; this.input = input; - this.pageIndex = 0; + this.pageIndex = pageIndex; } @Override @@ -68,17 +66,9 @@ public ExecutionEngine.Schema schema() { return input.schema(); } + /** No need to serialize a PaginateOperator, it actually does nothing - it is a wrapper. */ @Override - public String toCursor() { - // Save cursor to read the next page. - // Could process node.getChild() here with another visitor -- one that saves the - // parameters for other physical operators -- ProjectOperator, etc. - // cursor format: n:|" - String child = getChild().get(0).toCursor(); - - var nextPage = getPageIndex() + 1; - return child == null || child.isEmpty() - ? null : createSection("Paginate", Integer.toString(nextPage), - Integer.toString(getPageSize()), child); + public SerializablePlan getPlanForSerialization() { + return (SerializablePlan) input; } } diff --git a/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlan.java b/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlan.java index 312e4bfff9a..b4547a63b06 100644 --- a/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlan.java +++ b/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlan.java @@ -7,7 +7,6 @@ package org.opensearch.sql.planner.physical; import java.util.Iterator; -import java.util.List; import org.opensearch.sql.data.model.ExprValue; import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.planner.PlanNode; @@ -16,9 +15,8 @@ /** * Physical plan. */ -public abstract class PhysicalPlan implements PlanNode, - Iterator, - AutoCloseable { +public abstract class PhysicalPlan + implements PlanNode, Iterator, AutoCloseable { /** * Accept the {@link PhysicalPlanNodeVisitor}. 
* @@ -57,21 +55,4 @@ public ExecutionEngine.Schema schema() { public long getTotalHits() { return getChild().stream().mapToLong(PhysicalPlan::getTotalHits).max().orElse(0); } - - public String toCursor() { - throw new IllegalStateException(String.format("%s is not compatible with cursor feature", - this.getClass().getSimpleName())); - } - - /** - * Creates an S-expression that represents a plan node. - * @param plan Label for the plan. - * @param params List of serialized parameters. Including the child plans. - * @return A string that represents the plan called with those parameters. - */ - protected String createSection(String plan, String... params) { - return "(" + plan + "," - + String.join(",", params) - + ")"; - } } diff --git a/core/src/main/java/org/opensearch/sql/planner/physical/ProjectOperator.java b/core/src/main/java/org/opensearch/sql/planner/physical/ProjectOperator.java index c61b35e0cb6..1699c97c153 100644 --- a/core/src/main/java/org/opensearch/sql/planner/physical/ProjectOperator.java +++ b/core/src/main/java/org/opensearch/sql/planner/physical/ProjectOperator.java @@ -8,13 +8,16 @@ import com.google.common.collect.ImmutableMap; import com.google.common.collect.ImmutableMap.Builder; +import java.io.IOException; +import java.io.ObjectInput; +import java.io.ObjectOutput; import java.util.Collections; import java.util.List; import java.util.Optional; import java.util.stream.Collectors; +import lombok.AllArgsConstructor; import lombok.EqualsAndHashCode; import lombok.Getter; -import lombok.RequiredArgsConstructor; import lombok.ToString; import org.opensearch.sql.data.model.ExprTupleValue; import org.opensearch.sql.data.model.ExprValue; @@ -22,21 +25,21 @@ import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.expression.NamedExpression; import org.opensearch.sql.expression.parse.ParseExpression; -import org.opensearch.sql.expression.serialization.DefaultExpressionSerializer; +import 
org.opensearch.sql.planner.SerializablePlan; /** * Project the fields specified in {@link ProjectOperator#projectList} from input. */ @ToString @EqualsAndHashCode(callSuper = false) -@RequiredArgsConstructor -public class ProjectOperator extends PhysicalPlan { +@AllArgsConstructor +public class ProjectOperator extends PhysicalPlan implements SerializablePlan { @Getter - private final PhysicalPlan input; + private PhysicalPlan input; @Getter - private final List projectList; + private List projectList; @Getter - private final List namedParseExpressions; + private List namedParseExpressions; @Override public R accept(PhysicalPlanNodeVisitor visitor, C context) { @@ -96,17 +99,23 @@ public ExecutionEngine.Schema schema() { expr.getAlias(), expr.type())).collect(Collectors.toList())); } + /** Don't use, it is for deserialization needs only. */ + @Deprecated + public ProjectOperator() { + } + + @SuppressWarnings("unchecked") @Override - public String toCursor() { - String child = getChild().get(0).toCursor(); - if (child == null || child.isEmpty()) { - return null; - } - var serializer = new DefaultExpressionSerializer(); - String projects = createSection("projectList", - projectList.stream().map(serializer::serialize).toArray(String[]::new)); - String namedExpressions = createSection("namedParseExpressions", - namedParseExpressions.stream().map(serializer::serialize).toArray(String[]::new)); - return createSection("Project", namedExpressions, projects, child); + public void readExternal(ObjectInput in) throws IOException, ClassNotFoundException { + projectList = (List) in.readObject(); + // note: namedParseExpressions aren't serialized and deserialized + namedParseExpressions = List.of(); + input = (PhysicalPlan) in.readObject(); + } + + @Override + public void writeExternal(ObjectOutput out) throws IOException { + out.writeObject(projectList); + out.writeObject(((SerializablePlan) input).getPlanForSerialization()); } } diff --git 
a/core/src/main/java/org/opensearch/sql/storage/StorageEngine.java b/core/src/main/java/org/opensearch/sql/storage/StorageEngine.java index 18e9e92886c..ffcc0911dee 100644 --- a/core/src/main/java/org/opensearch/sql/storage/StorageEngine.java +++ b/core/src/main/java/org/opensearch/sql/storage/StorageEngine.java @@ -8,9 +8,7 @@ import java.util.Collection; import java.util.Collections; -import java.util.List; import org.opensearch.sql.DataSourceSchemaName; -import org.opensearch.sql.data.type.ExprType; import org.opensearch.sql.expression.function.FunctionResolver; /** @@ -31,9 +29,4 @@ public interface StorageEngine { default Collection getFunctions() { return Collections.emptyList(); } - - default TableScanOperator getTableScan(String indexName, String scrollId) { - String error = String.format("%s.getTableScan needs to be implemented", getClass()); - throw new UnsupportedOperationException(error); - } } diff --git a/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java b/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java index 1e5cb0b214a..3e08280acbe 100644 --- a/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java +++ b/core/src/test/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlanTest.java @@ -14,9 +14,7 @@ import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; import static org.mockito.Mockito.withSettings; -import static org.opensearch.sql.executor.pagination.PlanSerializerTest.buildCursor; -import java.util.Map; import org.junit.jupiter.api.BeforeAll; import org.junit.jupiter.api.DisplayNameGeneration; import org.junit.jupiter.api.DisplayNameGenerator; @@ -27,8 +25,8 @@ import org.opensearch.sql.executor.QueryId; import org.opensearch.sql.executor.QueryService; import org.opensearch.sql.executor.pagination.PlanSerializer; +import org.opensearch.sql.planner.physical.PhysicalPlan; import 
org.opensearch.sql.storage.StorageEngine; -import org.opensearch.sql.storage.TableScanOperator; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) public class ContinuePaginatedPlanTest { @@ -43,14 +41,14 @@ public class ContinuePaginatedPlanTest { @BeforeAll public static void setUp() { var storageEngine = mock(StorageEngine.class); - when(storageEngine.getTableScan(anyString(), anyString())) - .thenReturn(mock(TableScanOperator.class)); planSerializer = new PlanSerializer(storageEngine); queryService = new QueryService(null, new DefaultExecutionEngine(), null); } @Test public void can_execute_plan() { + var planSerializer = mock(PlanSerializer.class); + when(planSerializer.convertToPlan(anyString())).thenReturn(mock(PhysicalPlan.class)); var listener = new ResponseListener() { @Override public void onResponse(ExecutionEngine.QueryResponse response) { @@ -59,16 +57,15 @@ public void onResponse(ExecutionEngine.QueryResponse response) { @Override public void onFailure(Exception e) { - fail(); + fail(e); } }; - var plan = new ContinuePaginatedPlan(QueryId.queryId(), buildCursor(Map.of()), + var plan = new ContinuePaginatedPlan(QueryId.queryId(), "", queryService, planSerializer, listener); plan.execute(); } @Test - // Same as previous test, but with malformed cursor public void can_handle_error_while_executing_plan() { var listener = new ResponseListener() { @Override @@ -81,8 +78,8 @@ public void onFailure(Exception e) { assertNotNull(e); } }; - var plan = new ContinuePaginatedPlan(QueryId.queryId(), buildCursor(Map.of("pageSize", "abc")), - queryService, planSerializer, listener); + var plan = new ContinuePaginatedPlan(QueryId.queryId(), "", queryService, + planSerializer, listener); plan.execute(); } diff --git a/core/src/test/java/org/opensearch/sql/executor/pagination/CursorTest.java b/core/src/test/java/org/opensearch/sql/executor/pagination/CursorTest.java index ff5e0d37a72..e3e2c8cf333 100644 --- 
a/core/src/test/java/org/opensearch/sql/executor/pagination/CursorTest.java +++ b/core/src/test/java/org/opensearch/sql/executor/pagination/CursorTest.java @@ -16,12 +16,12 @@ class CursorTest { @Test void empty_array_is_none() { - Assertions.assertEquals(Cursor.None, new Cursor(new byte[]{})); + Assertions.assertEquals(Cursor.None, new Cursor(null)); } @Test void toString_is_array_value() { String cursorTxt = "This is a test"; - Assertions.assertEquals(cursorTxt, new Cursor(cursorTxt.getBytes()).toString()); + Assertions.assertEquals(cursorTxt, new Cursor(cursorTxt).toString()); } } diff --git a/core/src/test/java/org/opensearch/sql/executor/pagination/PlanSerializerTest.java b/core/src/test/java/org/opensearch/sql/executor/pagination/PlanSerializerTest.java index 7db431ed910..b1e97920c89 100644 --- a/core/src/test/java/org/opensearch/sql/executor/pagination/PlanSerializerTest.java +++ b/core/src/test/java/org/opensearch/sql/executor/pagination/PlanSerializerTest.java @@ -7,32 +7,35 @@ import static org.junit.jupiter.api.Assertions.assertAll; import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertNotSame; +import static org.junit.jupiter.api.Assertions.assertSame; import static org.junit.jupiter.api.Assertions.assertThrows; import static org.junit.jupiter.api.Assertions.assertTrue; -import static org.mockito.ArgumentMatchers.anyString; import static org.mockito.Mockito.mock; -import static org.mockito.Mockito.when; -import java.util.Map; -import java.util.stream.Stream; -import java.util.zip.GZIPOutputStream; +import java.io.ByteArrayInputStream; +import java.io.ByteArrayOutputStream; +import java.io.IOException; +import java.io.ObjectInput; +import java.io.ObjectOutput; +import java.io.ObjectOutputStream; +import java.io.Serializable; +import java.util.List; import lombok.SneakyThrows; -import org.apache.commons.lang3.reflect.FieldUtils; import org.junit.jupiter.api.Assertions; import 
org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.DisplayNameGeneration; import org.junit.jupiter.api.DisplayNameGenerator; import org.junit.jupiter.api.Test; import org.junit.jupiter.params.ParameterizedTest; -import org.junit.jupiter.params.provider.Arguments; -import org.junit.jupiter.params.provider.MethodSource; import org.junit.jupiter.params.provider.ValueSource; -import org.mockito.Mockito; import org.opensearch.sql.ast.dsl.AstDSL; import org.opensearch.sql.data.model.ExprValue; -import org.opensearch.sql.planner.physical.PaginateOperator; +import org.opensearch.sql.exception.NoCursorException; +import org.opensearch.sql.planner.SerializablePlan; +import org.opensearch.sql.planner.physical.PhysicalPlan; +import org.opensearch.sql.planner.physical.PhysicalPlanNodeVisitor; import org.opensearch.sql.storage.StorageEngine; -import org.opensearch.sql.storage.TableScanOperator; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) public class PlanSerializerTest { @@ -41,240 +44,9 @@ public class PlanSerializerTest { PlanSerializer planCache; - // encoded query 'select * from cacls' o_O - static final String testCursor = "(Paginate,1,2,(Project," - + "(namedParseExpressions,),(projectList,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5" - + "OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVk" - + "dAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZ" - + "y5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH" - + "4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHl" - + "wZS9FeHByVHlwZTt4cHQABWJvb2wzc3IAGmphdmEudXRpbC5BcnJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAFh" - + "dAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwdXIAE1tMamF2YS5sYW5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEAf" - + "gAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC5kYXRhLnR5cGUuRXhwckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2YS" - + 
"5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAAHQk9PTEVBTnEAfgAI,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZX" - + "hwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAA" - + "JZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgAB" - + "eHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA" - + "0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3" - + "FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABGludDBzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYg" - + "G0gIAAVsAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAA" - + "eHAAAAABcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAA" - + "HhyAA5qYXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAdJTlRFR0VScQB+AAg=,rO0ABXNyAC1vcmcub3BlbnNlY" - + "XJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy" - + "9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAA" - + "EbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274" - + "AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZ" - + "W5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABXRpbWUxc3IAGmphdmEudXRpbC5BcnJheXMkQXJyYX" - + "lMaXN02aQ8vs2IBtICAAFbAAFhdAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwdXIAE1tMamF2YS5sYW5nLlN0cmluZzu" - + "t0lbn6R17RwIAAHhwAAAAAXEAfgAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC5kYXRhLnR5cGUuRXhwckNvcmVUeXBl" - + "AAAAAAAAAAASAAB4cgAOamF2YS5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAAJVElNRVNUQU1QcQB+AAg=,rO0ABXNy" - + "AC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzd" - + "AASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0" - + "V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5" - + 
"jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5" - + "cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABWJvb2wyc3IAGmphdmEudXRpb" - + "C5BcnJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAFhdAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwdXIAE1tMamF2YS" - + "5sYW5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEAfgAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC5kYXRhLnR5cGU" - + "uRXhwckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2YS5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAAHQk9PTEVBTnEA" - + "fgAI,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIA" - + "A0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9le" - + "HByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW" - + "9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9" - + "MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABGludDJzcgAa" - + "amF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYgG0gIAAVsAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1c" - + "gATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAAeHAAAAABcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLm" - + "RhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAAHhyAA5qYXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAd" - + "JTlRFR0VScQB+AAg=,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb27" - + "4hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2Vh" - + "cmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxb" - + "C5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTG" - + "phdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQ" - + "ABGludDFzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYgG0gIAAVsAAWF0ABNbTGphdmEvbGFuZy9P" - + "YmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAAeHAAAAABcQB+AAh+cgApb3JnLm9wZW5zZ" - + 
"WFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAAHhyAA5qYXZhLmxhbmcuRW51bQAAAAAAAA" - + "AAEgAAeHB0AAdJTlRFR0VScQB+AAg=,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZE" - + "V4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9" - + "yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVu" - + "c2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwAB" - + "XBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeH" - + "ByVHlwZTt4cHQABHN0cjNzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYgG0gIAAVsAAWF0ABNbTGp" - + "hdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAAeHAAAAABcQB+AAh+cgAp" - + "b3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAAHhyAA5qYXZhLmxhbmcuR" - + "W51bQAAAAAAAAAAEgAAeHB0AAZTVFJJTkdxAH4ACA==,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc" - + "2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZW" - + "dhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3I" - + "AMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0" - + "dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2Rhd" - + "GEvdHlwZS9FeHByVHlwZTt4cHQABGludDNzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYgG0gIAAV" - + "sAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAAeHAAAAA" - + "BcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAAHhyAA5q" - + "YXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAdJTlRFR0VScQB+AAg=,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5z" - + "cWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpb" - + "mc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZX" - + 
"EAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWv" - + "MkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFy" - + "Y2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABHN0cjFzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZp" - + "Dy+zYgG0gIAAVsAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHX" - + "tHAgAAeHAAAAABcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAA" - + "AABIAAHhyAA5qYXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAZTVFJJTkdxAH4ACA==,rO0ABXNyAC1vcmcub3B" - + "lbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEv" - + "bGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb" - + "247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3" - + "Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3J" - + "nL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABHN0cjJzcgAaamF2YS51dGlsLkFycmF5cyRB" - + "cnJheUxpc3TZpDy+zYgG0gIAAVsAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3Rya" - + "W5nO63SVufpHXtHAgAAeHAAAAABcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZV" - + "R5cGUAAAAAAAAAABIAAHhyAA5qYXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAZTVFJJTkdxAH4ACA==,rO0ABX" - + "NyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWF" - + "zdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9u" - + "L0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZ" - + "W5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABH" - + "R5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABXRpbWUwc3IAGmphdmEudXR" - + "pbC5BcnJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAFhdAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwdXIAE1tMamF2" - + 
"YS5sYW5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEAfgAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC5kYXRhLnR5c" - + "GUuRXhwckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2YS5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAAJVElNRVNUQU" - + "1QcQB+AAg=,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q" - + "2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3N" - + "xbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHBy" - + "ZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvd" - + "XRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQACWRhdG" - + "V0aW1lMHNyABpqYXZhLnV0aWwuQXJyYXlzJEFycmF5TGlzdNmkPL7NiAbSAgABWwABYXQAE1tMamF2YS9sYW5nL09" - + "iamVjdDt4cHVyABNbTGphdmEubGFuZy5TdHJpbmc7rdJW5+kde0cCAAB4cAAAAAFxAH4ACH5yAClvcmcub3BlbnNl" - + "YXJjaC5zcWwuZGF0YS50eXBlLkV4cHJDb3JlVHlwZQAAAAAAAAAAEgAAeHIADmphdmEubGFuZy5FbnVtAAAAAAAAA" - + "AASAAB4cHQACVRJTUVTVEFNUHEAfgAI,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZ" - + "EV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG" - + "9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGV" - + "uc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwA" - + "BXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9Fe" - + "HByVHlwZTt4cHQABG51bTFzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYgG0gIAAVsAAWF0ABNbTG" - + "phdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAAeHAAAAABcQB+AAh+cgA" - + "pb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAAHhyAA5qYXZhLmxhbmcu" - + "RW51bQAAAAAAAAAAEgAAeHB0AAZET1VCTEVxAH4ACA==,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVz" - + "c2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZ" - + 
"WdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3" - + "IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF" - + "0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2Rh" - + "dGEvdHlwZS9FeHByVHlwZTt4cHQABG51bTBzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYgG0gIAA" - + "VsAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAAeHAAAA" - + "ABcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAAHhyAA5" - + "qYXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAZET1VCTEVxAH4ACA==,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5" - + "zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJp" - + "bmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZ" - + "XEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxW" - + "vMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWF" - + "yY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQACWRhdGV0aW1lMXNyABpqYXZhLnV0aWwuQXJyYXlzJEFycmF5" - + "TGlzdNmkPL7NiAbSAgABWwABYXQAE1tMamF2YS9sYW5nL09iamVjdDt4cHVyABNbTGphdmEubGFuZy5TdHJpbmc7r" - + "dJW5+kde0cCAAB4cAAAAAFxAH4ACH5yAClvcmcub3BlbnNlYXJjaC5zcWwuZGF0YS50eXBlLkV4cHJDb3JlVHlwZQ" - + "AAAAAAAAAAEgAAeHIADmphdmEubGFuZy5FbnVtAAAAAAAAAAASAAB4cHQACVRJTUVTVEFNUHEAfgAI,rO0ABXNyAC" - + "1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAA" - + "STGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4" - + "cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZ" - + "UV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cG" - + "V0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABG51bTRzcgAaamF2YS51dGlsLkF" - + 
"ycmF5cyRBcnJheUxpc3TZpDy+zYgG0gIAAVsAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxh" - + "bmcuU3RyaW5nO63SVufpHXtHAgAAeHAAAAABcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5Fe" - + "HByQ29yZVR5cGUAAAAAAAAAABIAAHhyAA5qYXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAZET1VCTEVxAH4ACA" - + "==,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0" - + "wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHB" - + "yZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9u" - + "LlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9Ma" - + "XN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABWJvb2wxc3IAGm" - + "phdmEudXRpbC5BcnJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAFhdAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwdXI" - + "AE1tMamF2YS5sYW5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEAfgAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC5k" - + "YXRhLnR5cGUuRXhwckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2YS5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAAHQ" - + "k9PTEVBTnEAfgAI,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274h" - + "hKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2Vhcm" - + "NoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5" - + "leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGph" - + "dmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQAA" - + "2tleXNyABpqYXZhLnV0aWwuQXJyYXlzJEFycmF5TGlzdNmkPL7NiAbSAgABWwABYXQAE1tMamF2YS9sYW5nL09iam" - + "VjdDt4cHVyABNbTGphdmEubGFuZy5TdHJpbmc7rdJW5+kde0cCAAB4cAAAAAFxAH4ACH5yAClvcmcub3BlbnNlYXJ" - + "jaC5zcWwuZGF0YS50eXBlLkV4cHJDb3JlVHlwZQAAAAAAAAAAEgAAeHIADmphdmEubGFuZy5FbnVtAAAAAAAAAAAS" - + "AAB4cHQABlNUUklOR3EAfgAI,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJl" - + 
"c3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vc" - + "GVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2Vhcm" - + "NoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGh" - + "zdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlw" - + "ZTt4cHQABG51bTNzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYgG0gIAAVsAAWF0ABNbTGphdmEvb" - + "GFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAAeHAAAAABcQB+AAh+cgApb3JnLm" - + "9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAAHhyAA5qYXZhLmxhbmcuRW51bQA" - + "AAAAAAAAAEgAAeHB0AAZET1VCTEVxAH4ACA==,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5" - + "OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVk" - + "dAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZ" - + "y5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH" - + "4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHl" - + "wZS9FeHByVHlwZTt4cHQABWJvb2wwc3IAGmphdmEudXRpbC5BcnJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAFh" - + "dAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwdXIAE1tMamF2YS5sYW5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEAf" - + "gAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC5kYXRhLnR5cGUuRXhwckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2YS" - + "5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAAHQk9PTEVBTnEAfgAI,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZX" - + "hwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAA" - + "JZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgAB" - + "eHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA" - + "0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3" - + 
"FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABG51bTJzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheUxpc3TZpDy+zYg" - + "G0gIAAVsAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63SVufpHXtHAgAA" - + "eHAAAAABcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUAAAAAAAAAABIAA" - + "HhyAA5qYXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAZET1VCTEVxAH4ACA==,rO0ABXNyAC1vcmcub3BlbnNlY" - + "XJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy" - + "9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAA" - + "EbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274" - + "AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZ" - + "W5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABHN0cjBzcgAaamF2YS51dGlsLkFycmF5cyRBcnJheU" - + "xpc3TZpDy+zYgG0gIAAVsAAWF0ABNbTGphdmEvbGFuZy9PYmplY3Q7eHB1cgATW0xqYXZhLmxhbmcuU3RyaW5nO63" - + "SVufpHXtHAgAAeHAAAAABcQB+AAh+cgApb3JnLm9wZW5zZWFyY2guc3FsLmRhdGEudHlwZS5FeHByQ29yZVR5cGUA" - + "AAAAAAAAABIAAHhyAA5qYXZhLmxhbmcuRW51bQAAAAAAAAAAEgAAeHB0AAZTVFJJTkdxAH4ACA==,rO0ABXNyAC1v" - + "cmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAAST" - + "GphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cH" - + "Jlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV" - + "4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0" - + "ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABWRhdGUzc3IAGmphdmEudXRpbC5Bc" - + "nJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAFhdAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwdXIAE1tMamF2YS5sYW" - + "5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEAfgAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC5kYXRhLnR5cGUuRXh" - + "wckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2YS5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAAJVElNRVNUQU1QcQB+" - + 
"AAg=,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIA" - + "A0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9le" - + "HByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW" - + "9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9" - + "MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQABWRhdGUyc3IA" - + "GmphdmEudXRpbC5BcnJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAFhdAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwd" - + "XIAE1tMamF2YS5sYW5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEAfgAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC" - + "5kYXRhLnR5cGUuRXhwckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2YS5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAA" - + "JVElNRVNUQU1QcQB+AAg=,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3N" - + "pb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVu" - + "c2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoL" - + "nNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdA" - + "AQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt" - + "4cHQABWRhdGUxc3IAGmphdmEudXRpbC5BcnJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAFhdAATW0xqYXZhL2xh" - + "bmcvT2JqZWN0O3hwdXIAE1tMamF2YS5sYW5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEAfgAIfnIAKW9yZy5vc" - + "GVuc2VhcmNoLnNxbC5kYXRhLnR5cGUuRXhwckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2YS5sYW5nLkVudW0AAA" - + "AAAAAAABIAAHhwdAAJVElNRVNUQU1QcQB+AAg=,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zcWwuZXhwcmVzc2lvbi" - + "5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbmc7TAAJZGVsZWdhdGV" - + "kdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXEAfgABeHBwc3IAMW9y" - + "Zy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvMkAIAA0wABGF0dHJxA" - + 
"H4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY2gvc3FsL2RhdGEvdH" - + "lwZS9FeHByVHlwZTt4cHQABWRhdGUwc3IAGmphdmEudXRpbC5BcnJheXMkQXJyYXlMaXN02aQ8vs2IBtICAAFbAAF" - + "hdAATW0xqYXZhL2xhbmcvT2JqZWN0O3hwdXIAE1tMamF2YS5sYW5nLlN0cmluZzut0lbn6R17RwIAAHhwAAAAAXEA" - + "fgAIfnIAKW9yZy5vcGVuc2VhcmNoLnNxbC5kYXRhLnR5cGUuRXhwckNvcmVUeXBlAAAAAAAAAAASAAB4cgAOamF2Y" - + "S5sYW5nLkVudW0AAAAAAAAAABIAAHhwdAAJVElNRVNUQU1QcQB+AAg=,rO0ABXNyAC1vcmcub3BlbnNlYXJjaC5zc" - + "WwuZXhwcmVzc2lvbi5OYW1lZEV4cHJlc3Npb274hhKW/q2YQQIAA0wABWFsaWFzdAASTGphdmEvbGFuZy9TdHJpbm" - + "c7TAAJZGVsZWdhdGVkdAAqTG9yZy9vcGVuc2VhcmNoL3NxbC9leHByZXNzaW9uL0V4cHJlc3Npb247TAAEbmFtZXE" - + "AfgABeHBwc3IAMW9yZy5vcGVuc2VhcmNoLnNxbC5leHByZXNzaW9uLlJlZmVyZW5jZUV4cHJlc3Npb274AO0rxWvM" - + "kAIAA0wABGF0dHJxAH4AAUwABXBhdGhzdAAQTGphdmEvdXRpbC9MaXN0O0wABHR5cGV0ACdMb3JnL29wZW5zZWFyY" - + "2gvc3FsL2RhdGEvdHlwZS9FeHByVHlwZTt4cHQAA3p6enNyABpqYXZhLnV0aWwuQXJyYXlzJEFycmF5TGlzdNmkPL" - + "7NiAbSAgABWwABYXQAE1tMamF2YS9sYW5nL09iamVjdDt4cHVyABNbTGphdmEubGFuZy5TdHJpbmc7rdJW5+kde0c" - + "CAAB4cAAAAAFxAH4ACH5yAClvcmcub3BlbnNlYXJjaC5zcWwuZGF0YS50eXBlLkV4cHJDb3JlVHlwZQAAAAAAAAAA" - + "EgAAeHIADmphdmEubGFuZy5FbnVtAAAAAAAAAAASAAB4cHQABlNUUklOR3EAfgAI),(OpenSearchPagedIndexSc" - + "an,calcs,FGluY2x1ZGVfY29udGV4dF91dWlkDXF1ZXJ5QW5kRmV0Y2gBFndYQmJZcHpxU3dtc1hUVkhhYU1uLVEA" - + "AAAAAAAADRY4RzRudHZqbFI0dTBFdkJNZEpCaDd3)))"; - - private static final String testIndexName = "dummyIndex"; - private static final String testScroll = "dummyScroll"; - @BeforeEach void setUp() { storageEngine = mock(StorageEngine.class); - when(storageEngine.getTableScan(anyString(), anyString())) - .thenReturn(new MockedTableScanOperator()); planCache = new PlanSerializer(storageEngine); } @@ -296,10 +68,12 @@ void canConvertToCursor_project_some_fields_relation() { } @ParameterizedTest - @ValueSource(strings = {"pewpew", "asdkfhashdfjkgakgfwuigfaijkb", testCursor}) - void compress_decompress(String input) { - var compressed = compress(input); - 
assertEquals(input, decompress(compressed)); + @ValueSource(strings = {"pewpew", "asdkfhashdfjkgakgfwuigfaijkb", "ajdhfgajklghadfjkhgjkadhgad" + + "kadfhgadhjgfjklahdgqheygvskjfbvgsdklgfuirehiluANUIfgauighbahfuasdlhfnhaughsdlfhaughaggf" + + "and_some_other_funny_stuff_which_could_be_generated_while_sleeping_on_the_keyboard"}) + void serialize_deserialize_str(String input) { + var compressed = serialize(input); + assertEquals(input, deserialize(compressed)); if (input.length() > 200) { // Compression of short strings isn't profitable, because encoding into string and gzip // headers add more bytes than the input string has. @@ -307,153 +81,176 @@ void compress_decompress(String input) { } } + public static class SerializableTestClass implements Serializable { + public int field; + + @Override + public boolean equals(Object obj) { + return field == ((SerializableTestClass) obj).field; + } + } + + // Can't serialize private classes because they are not accessible + private class NotSerializableTestClass implements Serializable { + public int field; + + @Override + public boolean equals(Object obj) { + return field == ((NotSerializableTestClass) obj).field; + } + } + @Test - // should never happen actually, at least for compress - void compress_decompress_null_or_empty_string() { - assertAll( - () -> assertTrue(compress(null).isEmpty()), - () -> assertTrue(compress("").isEmpty()), - () -> assertTrue(decompress(null).isEmpty()), - () -> assertTrue(decompress("").isEmpty()) - ); + void serialize_deserialize_obj() { + var obj = new SerializableTestClass(); + obj.field = 42; + assertEquals(obj, deserialize(serialize(obj))); + assertNotSame(obj, deserialize(serialize(obj))); } @Test - // test added for coverage only - void compress_throws() { - var mock = Mockito.mockConstructionWithAnswer(GZIPOutputStream.class, invocation -> null); - assertThrows(Throwable.class, () -> compress("\\_(`v`)_/")); - mock.close(); + void serialize_throws() { + assertThrows(Throwable.class, () -> 
serialize(new NotSerializableTestClass())); + var testObj = new TestOperator(); + testObj.throwIoOnWrite = true; + assertThrows(Throwable.class, () -> serialize(testObj)); } @Test - void decompress_throws() { + void deserialize_throws() { assertAll( // from gzip - damaged header - () -> assertThrows(Throwable.class, () -> decompress("00")), + () -> assertThrows(Throwable.class, () -> deserialize("00")), // from HashCode::fromString - () -> assertThrows(Throwable.class, () -> decompress("000")) + () -> assertThrows(Throwable.class, () -> deserialize("000")) ); } @Test @SneakyThrows - void convert_deconvert_cursor() { - var cursor = buildCursor(Map.of()); - var plan = planCache.convertToPlan(cursor); - // `PaginateOperator::toCursor` shifts cursor to the next page. To have this test consistent - // we have to enforce it staying on the same page. This allows us to get same cursor strings. - var pageNum = (int)FieldUtils.readField(plan, "pageIndex", true); - FieldUtils.writeField(plan, "pageIndex", pageNum - 1, true); - var convertedCursor = planCache.convertToCursor(plan).toString(); - // Then we have to restore page num into the plan, otherwise comparison would fail due to this. 
- FieldUtils.writeField(plan, "pageIndex", pageNum, true); - var convertedPlan = planCache.convertToPlan(convertedCursor); - assertEquals(cursor, convertedCursor); - // TODO compare plans + void convertToCursor_returns_no_cursor_if_cant_serialize() { + var plan = new TestOperator(42); + plan.throwNoCursorOnWrite = true; + assertAll( + () -> assertThrows(NoCursorException.class, () -> serialize(plan)), + () -> assertEquals(Cursor.None, planCache.convertToCursor(plan)) + ); } @Test @SneakyThrows - void convertToCursor_cant_convert() { - var plan = mock(MockedTableScanOperator.class); + void convertToCursor_returns_no_cursor_if_plan_is_not_paginate() { + var plan = mock(PhysicalPlan.class); assertEquals(Cursor.None, planCache.convertToCursor(plan)); - when(plan.toCursor()).thenReturn(""); - assertEquals(Cursor.None, planCache.convertToCursor( - new PaginateOperator(plan, 1, 2))); } @Test - void converted_plan_is_executable() { - // planCache.convertToPlan(buildCursor(Map.of())); - var plan = planCache.convertToPlan("n:" + compress(testCursor)); - // TODO + void convertToPlan_throws_cursor_has_no_prefix() { + assertThrows(UnsupportedOperationException.class, () -> + planCache.convertToPlan("abc")); } - @ParameterizedTest - @MethodSource("generateIncorrectCursors") - void throws_on_parsing_damaged_cursor(String cursor) { - assertThrows(Throwable.class, () -> planCache.convertToPlan(cursor)); + @Test + void convertToPlan_throws_if_failed_to_deserialize() { + assertThrows(UnsupportedOperationException.class, () -> + planCache.convertToPlan("n:" + serialize(mock(Serializable.class)))); + } + + @Test + @SneakyThrows + void serialize_and_deserialize() { + var plan = new TestOperator(42); + var roundTripPlan = planCache.deserialize(planCache.serialize(plan)); + assertEquals(roundTripPlan, plan); + assertNotSame(roundTripPlan, plan); } - private static Stream generateIncorrectCursors() { - return Stream.of( - compress(testCursor), // a valid cursor, but without "n:" prefix - 
"n:" + testCursor, // a valid, but uncompressed cursor - buildCursor(Map.of("prefix", "g:")), // incorrect prefix - buildCursor(Map.of("header: paginate", "ORDER BY")), // incorrect header - buildCursor(Map.of("pageIndex", "")), // incorrect page # - buildCursor(Map.of("pageIndex", "abc")), // incorrect page # - buildCursor(Map.of("pageSize", "abc")), // incorrect page size - buildCursor(Map.of("pageSize", "null")), // incorrect page size - buildCursor(Map.of("pageSize", "10 ")), // incorrect page size - buildCursor(Map.of("header: project", "")), // incorrect header - buildCursor(Map.of("header: namedParseExpressions", "ololo")), // incorrect header - buildCursor(Map.of("namedParseExpressions", "pewpew")), // incorrect (unparsable) npes - buildCursor(Map.of("namedParseExpressions", "rO0ABXA=,")), // incorrect npes (extra comma) - buildCursor(Map.of("header: projectList", "")), // incorrect header - buildCursor(Map.of("projectList", "\0\0\0\0")), // incorrect project - buildCursor(Map.of("header: OpenSearchPagedIndexScan", "42")) // incorrect header - ).map(Arguments::of); + @Test + void convertToCursor_and_convertToPlan() { + var plan = new TestOperator(100500); + var roundTripPlan = (SerializablePlan) + planCache.convertToPlan(planCache.convertToCursor(plan).toString()); + assertEquals(plan, roundTripPlan); + assertNotSame(plan, roundTripPlan); } + @Test + @SneakyThrows + void resolveObject() { + ByteArrayOutputStream output = new ByteArrayOutputStream(); + ObjectOutputStream objectOutput = new ObjectOutputStream(output); + objectOutput.writeObject("Hello, world!"); + objectOutput.flush(); - /** - * Function puts default valid values into generated cursor string. - * Values could be redefined. - * @param values A map of non-default values to use. - * @return A compressed cursor string. 
- */ - public static String buildCursor(Map values) { - String prefix = values.getOrDefault("prefix", "n:"); - String headerPaginate = values.getOrDefault("header: paginate", "Paginate"); - String pageIndex = values.getOrDefault("pageIndex", "1"); - String pageSize = values.getOrDefault("pageSize", "2"); - String headerProject = values.getOrDefault("header: project", "Project"); - String headerNpes = values.getOrDefault("header: namedParseExpressions", - "namedParseExpressions"); - String namedParseExpressions = values.getOrDefault("namedParseExpressions", ""); - String headerProjectList = values.getOrDefault("header: projectList", "projectList"); - String projectList = values.getOrDefault("projectList", "rO0ABXA="); // serialized `null` - String headerOspis = values.getOrDefault("header: OpenSearchPagedIndexScan", - "OpenSearchPagedIndexScan"); - String indexName = values.getOrDefault("indexName", testIndexName); - String scrollId = values.getOrDefault("scrollId", testScroll); - var cursor = String.format("(%s,%s,%s,(%s,(%s,%s),(%s,%s),(%s,%s,%s)))", headerPaginate, - pageIndex, pageSize, headerProject, headerNpes, namedParseExpressions, headerProjectList, - projectList, headerOspis, indexName, scrollId); - return prefix + compress(cursor); + var cds = planCache.getCursorDeserializationStream( + new ByteArrayInputStream(output.toByteArray())); + assertEquals(storageEngine, cds.resolveObject("engine")); + var object = new Object(); + assertSame(object, cds.resolveObject(object)); } - private static class MockedTableScanOperator extends TableScanOperator { + // Helpers and auxiliary classes section below + + public static class TestOperator extends PhysicalPlan implements SerializablePlan { + private int field; + private boolean throwNoCursorOnWrite = false; + private boolean throwIoOnWrite = false; + + public TestOperator() { + } + + public TestOperator(int value) { + field = value; + } + @Override - public boolean hasNext() { - return false; + public void 
readExternal(ObjectInput in) throws IOException, ClassNotFoundException { + field = in.readInt(); } @Override - public ExprValue next() { + public void writeExternal(ObjectOutput out) throws IOException { + if (throwNoCursorOnWrite) { + throw new NoCursorException(); + } + if (throwIoOnWrite) { + throw new IOException(); + } + out.writeInt(field); + } + + @Override + public boolean equals(Object o) { + return field == ((TestOperator) o).field; + } + + @Override + public R accept(PhysicalPlanNodeVisitor visitor, C context) { return null; } @Override - public String explain() { + public boolean hasNext() { + return false; + } + + @Override + public ExprValue next() { return null; } @Override - public String toCursor() { - return createSection("OpenSearchPagedIndexScan", testIndexName, testScroll); + public List getChild() { + return null; } } @SneakyThrows - private static String compress(String input) { - return new PlanSerializer(null).compress(input); + private String serialize(Serializable input) { + return new PlanSerializer(null).serialize(input); } - @SneakyThrows - private static String decompress(String input) { - return new PlanSerializer(null).decompress(input); + private Serializable deserialize(String input) { + return new PlanSerializer(null).deserialize(input); } } diff --git a/core/src/test/java/org/opensearch/sql/planner/SerializablePlanTest.java b/core/src/test/java/org/opensearch/sql/planner/SerializablePlanTest.java new file mode 100644 index 00000000000..e40ce5031b8 --- /dev/null +++ b/core/src/test/java/org/opensearch/sql/planner/SerializablePlanTest.java @@ -0,0 +1,39 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.planner; + +import static org.junit.jupiter.api.Assertions.assertSame; +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.mockito.Answers.CALLS_REAL_METHODS; + +import org.junit.jupiter.api.DisplayNameGeneration; +import 
org.junit.jupiter.api.DisplayNameGenerator; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; + +@ExtendWith(MockitoExtension.class) +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) +public class SerializablePlanTest { + @Mock(answer = CALLS_REAL_METHODS) + SerializablePlan plan; + + @Test + void writeExternal_throws() { + assertThrows(Throwable.class, () -> plan.writeExternal(null)); + } + + @Test + void readExternal_throws() { + assertThrows(Throwable.class, () -> plan.readExternal(null)); + } + + @Test + void getPlanForSerialization_defaults_to_self() { + assertSame(plan, plan.getPlanForSerialization()); + } +} diff --git a/core/src/test/java/org/opensearch/sql/planner/physical/PaginateOperatorTest.java b/core/src/test/java/org/opensearch/sql/planner/physical/PaginateOperatorTest.java index 3e0efc3b50b..2405700f10b 100644 --- a/core/src/test/java/org/opensearch/sql/planner/physical/PaginateOperatorTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/physical/PaginateOperatorTest.java @@ -6,7 +6,6 @@ package org.opensearch.sql.planner.physical; -import static org.junit.jupiter.api.Assertions.assertAll; import static org.junit.jupiter.api.Assertions.assertEquals; import static org.junit.jupiter.api.Assertions.assertFalse; import static org.junit.jupiter.api.Assertions.assertNull; @@ -29,6 +28,7 @@ import org.junit.jupiter.api.Test; import org.opensearch.sql.data.model.ExprIntegerValue; import org.opensearch.sql.expression.DSL; +import org.opensearch.sql.planner.SerializablePlan; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) public class PaginateOperatorTest extends PhysicalPlanTestBase { @@ -90,15 +90,10 @@ public void schema_assert() { } @Test - public void toCursor() { - var plan = mock(PhysicalPlan.class); - when(plan.toCursor()).thenReturn("Great plan, Walter, reliable as a swiss watch!", "", 
null); - var po = new PaginateOperator(plan, 2); - assertAll( - () -> assertEquals("(Paginate,1,2,Great plan, Walter, reliable as a swiss watch!)", - po.toCursor()), - () -> assertNull(po.toCursor()), - () -> assertNull(po.toCursor()) - ); + // PaginateOperator implements SerializablePlan, but is not serialized itself + public void serializable_but_not_serialized() { + var plan = mock(PhysicalPlan.class, withSettings().extraInterfaces(SerializablePlan.class)); + var paginate = new PaginateOperator(plan, 1, 1); + assertSame(plan, paginate.getPlanForSerialization()); } } diff --git a/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanTest.java b/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanTest.java index 5e70f2b9d01..2c67994d2ec 100644 --- a/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanTest.java @@ -78,19 +78,4 @@ void get_total_hits_uses_default_value() { when(plan.getTotalHits()).then(CALLS_REAL_METHODS); assertEquals(0, plan.getTotalHits()); } - - @Test - void toCursor() { - var plan = mock(PhysicalPlan.class); - when(plan.toCursor()).then(CALLS_REAL_METHODS); - assertTrue(assertThrows(IllegalStateException.class, plan::toCursor) - .getMessage().contains("is not compatible with cursor feature")); - } - - @Test - void createSection() { - var plan = mock(PhysicalPlan.class); - when(plan.createSection(anyString(), any())).then(CALLS_REAL_METHODS); - assertEquals("(plan,one,two)", plan.createSection("plan", "one", "two")); - } } diff --git a/core/src/test/java/org/opensearch/sql/planner/physical/ProjectOperatorTest.java b/core/src/test/java/org/opensearch/sql/planner/physical/ProjectOperatorTest.java index 6042eba6dcc..989cdf7471e 100644 --- a/core/src/test/java/org/opensearch/sql/planner/physical/ProjectOperatorTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/physical/ProjectOperatorTest.java @@ -11,9 +11,7 @@ import 
static org.hamcrest.Matchers.contains; import static org.hamcrest.Matchers.hasItems; import static org.hamcrest.Matchers.iterableWithSize; -import static org.junit.jupiter.api.Assertions.assertAll; import static org.junit.jupiter.api.Assertions.assertEquals; -import static org.junit.jupiter.api.Assertions.assertNull; import static org.mockito.Mockito.when; import static org.opensearch.sql.data.model.ExprValueUtils.LITERAL_MISSING; import static org.opensearch.sql.data.model.ExprValueUtils.stringValue; @@ -23,7 +21,16 @@ import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; +import java.io.ByteArrayInputStream; +import java.io.ByteArrayOutputStream; +import java.io.IOException; +import java.io.ObjectInput; +import java.io.ObjectInputStream; +import java.io.ObjectOutput; +import java.io.ObjectOutputStream; import java.util.List; +import lombok.EqualsAndHashCode; +import lombok.SneakyThrows; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; @@ -33,7 +40,7 @@ import org.opensearch.sql.data.model.ExprValueUtils; import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.expression.DSL; -import org.opensearch.sql.expression.serialization.DefaultExpressionSerializer; +import org.opensearch.sql.planner.SerializablePlan; @ExtendWith(MockitoExtension.class) class ProjectOperatorTest extends PhysicalPlanTestBase { @@ -212,18 +219,51 @@ public void project_parse_missing_will_fallback() { } @Test - public void toCursor() { - when(inputPlan.toCursor()).thenReturn("inputPlan", "", null); - var project = DSL.named("response", DSL.ref("response", INTEGER)); - var npe = DSL.named("action", DSL.ref("action", STRING)); - var po = project(inputPlan, List.of(project), List.of(npe)); - var serializer = new DefaultExpressionSerializer(); - var expected = String.format("(Project,(namedParseExpressions,%s),(projectList,%s),%s)", - serializer.serialize(npe), 
serializer.serialize(project), "inputPlan"); - assertAll( - () -> assertEquals(expected, po.toCursor()), - () -> assertNull(po.toCursor()), - () -> assertNull(po.toCursor()) - ); + @SneakyThrows + public void serializable() { + var projects = List.of(DSL.named("action", DSL.ref("action", STRING))); + var project = new ProjectOperator(new TestOperator(), projects, List.of()); + + ByteArrayOutputStream output = new ByteArrayOutputStream(); + ObjectOutputStream objectOutput = new ObjectOutputStream(output); + objectOutput.writeObject(project); + objectOutput.flush(); + + ObjectInputStream objectInput = new ObjectInputStream( + new ByteArrayInputStream(output.toByteArray())); + var roundTripPlan = (ProjectOperator) objectInput.readObject(); + assertEquals(project, roundTripPlan); + } + + @EqualsAndHashCode + public static class TestOperator extends PhysicalPlan implements SerializablePlan { + + @Override + public <R, C> R accept(PhysicalPlanNodeVisitor<R, C> visitor, C context) { + return null; + } + + @Override + public boolean hasNext() { + return false; + } + + @Override + public ExprValue next() { + return null; + } + + @Override + public List<PhysicalPlan> getChild() { + return null; + } + + @Override + public void readExternal(ObjectInput in) throws IOException, ClassNotFoundException { + } + + @Override + public void writeExternal(ObjectOutput out) throws IOException { + } } } diff --git a/core/src/test/java/org/opensearch/sql/storage/StorageEngineTest.java b/core/src/test/java/org/opensearch/sql/storage/StorageEngineTest.java index 9c96459d061..67014b76bdc 100644 --- a/core/src/test/java/org/opensearch/sql/storage/StorageEngineTest.java +++ b/core/src/test/java/org/opensearch/sql/storage/StorageEngineTest.java @@ -18,11 +18,4 @@ void testFunctionsMethod() { StorageEngine k = (dataSourceSchemaName, tableName) -> null; Assertions.assertEquals(Collections.emptyList(), k.getFunctions()); } - - @Test - void getTableScan() { - StorageEngine k = (dataSourceSchemaName, tableName) -> null; - 
Assertions.assertThrows(UnsupportedOperationException.class, - () -> k.getTableScan("indexName", "scrollId")); - } } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/data/value/OpenSearchExprValueFactory.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/data/value/OpenSearchExprValueFactory.java index 034f9227eea..b06e2b9e089 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/data/value/OpenSearchExprValueFactory.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/data/value/OpenSearchExprValueFactory.java @@ -17,6 +17,7 @@ import com.fasterxml.jackson.core.JsonProcessingException; import com.fasterxml.jackson.databind.ObjectMapper; import com.google.common.collect.ImmutableMap; +import java.io.Serializable; import java.time.Instant; import java.time.format.DateTimeParseException; import java.util.ArrayList; @@ -54,7 +55,7 @@ /** * Construct ExprValue from OpenSearch response. */ -public class OpenSearchExprValueFactory { +public class OpenSearchExprValueFactory implements Serializable { /** * The Mapping of Field and ExprType. 
*/ diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/ResourceMonitorPlan.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/ResourceMonitorPlan.java index 3d880d82b9f..78283307510 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/ResourceMonitorPlan.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/ResourceMonitorPlan.java @@ -12,6 +12,7 @@ import lombok.ToString; import org.opensearch.sql.data.model.ExprValue; import org.opensearch.sql.monitor.ResourceMonitor; +import org.opensearch.sql.planner.SerializablePlan; import org.opensearch.sql.planner.physical.PhysicalPlan; import org.opensearch.sql.planner.physical.PhysicalPlanNodeVisitor; @@ -21,7 +22,7 @@ @ToString @RequiredArgsConstructor @EqualsAndHashCode -public class ResourceMonitorPlan extends PhysicalPlan { +public class ResourceMonitorPlan extends PhysicalPlan implements SerializablePlan { /** * How many method calls to delegate's next() to perform resource check once. 
@@ -88,8 +89,9 @@ public long getTotalHits() { return delegate.getTotalHits(); } + @Override - public String toCursor() { - return delegate.toCursor(); + public SerializablePlan getPlanForSerialization() { + return (SerializablePlan) delegate; } } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java index 149e7a55414..4ffbbee9b70 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java @@ -29,6 +29,7 @@ public class ContinuePageRequestBuilder extends PagedRequestBuilder { @Getter private final OpenSearchRequest.IndexName indexName; + @Getter private final String scrollId; private final TimeValue scrollTimeout; private final OpenSearchExprValueFactory exprValueFactory; diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java index a44a30bf8d5..bef734ce476 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java @@ -39,8 +39,8 @@ public class InitialPageRequestBuilder extends PagedRequestBuilder { /** * Constructor. - * - * @param indexName index being scanned + * @param indexName index being scanned + * @param pageSize page size * @param exprValueFactory value factory */ // TODO accept indexName as string (same way as `OpenSearchRequestBuilder` does)? 
diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/setting/OpenSearchSettings.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/setting/OpenSearchSettings.java index ae5174d678f..accd3560417 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/setting/OpenSearchSettings.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/setting/OpenSearchSettings.java @@ -99,8 +99,8 @@ public class OpenSearchSettings extends Settings { Setting.Property.Dynamic); /** - * Construct ElasticsearchSetting. - * The ElasticsearchSetting must be singleton. + * Construct OpenSearchSetting. + * The OpenSearchSetting must be singleton. */ @SuppressWarnings("unchecked") public OpenSearchSettings(ClusterSettings clusterSettings) { diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngine.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngine.java index 14535edb79a..c915fa549bd 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngine.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngine.java @@ -8,26 +8,23 @@ import static org.opensearch.sql.utils.SystemIndexUtils.isSystemIndex; +import lombok.Getter; import lombok.RequiredArgsConstructor; import org.opensearch.sql.DataSourceSchemaName; import org.opensearch.sql.common.setting.Settings; import org.opensearch.sql.opensearch.client.OpenSearchClient; -import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; -import org.opensearch.sql.opensearch.request.ContinuePageRequestBuilder; -import org.opensearch.sql.opensearch.request.OpenSearchRequest; -import org.opensearch.sql.opensearch.storage.scan.OpenSearchPagedIndexScan; import org.opensearch.sql.opensearch.storage.system.OpenSearchSystemIndex; import org.opensearch.sql.storage.StorageEngine; import org.opensearch.sql.storage.Table; -import 
org.opensearch.sql.storage.TableScanOperator; /** OpenSearch storage engine implementation. */ @RequiredArgsConstructor public class OpenSearchStorageEngine implements StorageEngine { /** OpenSearch client connection. */ + @Getter private final OpenSearchClient client; - + @Getter private final Settings settings; @Override @@ -38,15 +35,4 @@ public Table getTable(DataSourceSchemaName dataSourceSchemaName, String name) { return new OpenSearchIndex(client, settings, name); } } - - @Override - public TableScanOperator getTableScan(String indexName, String scrollId) { - // TODO call `getTable` here? - var index = new OpenSearchIndex(client, settings, indexName); - var requestBuilder = new ContinuePageRequestBuilder( - new OpenSearchRequest.IndexName(indexName), - scrollId, settings, - new OpenSearchExprValueFactory(index.getFieldOpenSearchTypes())); - return new OpenSearchPagedIndexScan(client, requestBuilder); - } } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanAggregationBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanAggregationBuilder.java index 4571961e5fe..74be670dcc1 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanAggregationBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanAggregationBuilder.java @@ -15,9 +15,9 @@ import org.opensearch.sql.expression.NamedExpression; import org.opensearch.sql.expression.ReferenceExpression; import org.opensearch.sql.expression.aggregation.NamedAggregator; -import org.opensearch.sql.expression.serialization.DefaultExpressionSerializer; import org.opensearch.sql.opensearch.response.agg.OpenSearchAggregationResponseParser; import org.opensearch.sql.opensearch.storage.script.aggregation.AggregationQueryBuilder; +import org.opensearch.sql.opensearch.storage.serialization.DefaultExpressionSerializer; import 
org.opensearch.sql.planner.logical.LogicalAggregation; import org.opensearch.sql.planner.logical.LogicalSort; import org.opensearch.sql.storage.TableScanOperator; diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanQueryBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanQueryBuilder.java index f2e5139d01d..7e6c169a88e 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanQueryBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanQueryBuilder.java @@ -20,9 +20,9 @@ import org.opensearch.sql.expression.ExpressionNodeVisitor; import org.opensearch.sql.expression.NamedExpression; import org.opensearch.sql.expression.ReferenceExpression; -import org.opensearch.sql.expression.serialization.DefaultExpressionSerializer; import org.opensearch.sql.opensearch.storage.script.filter.FilterQueryBuilder; import org.opensearch.sql.opensearch.storage.script.sort.SortQueryBuilder; +import org.opensearch.sql.opensearch.storage.serialization.DefaultExpressionSerializer; import org.opensearch.sql.planner.logical.LogicalFilter; import org.opensearch.sql.planner.logical.LogicalHighlight; import org.opensearch.sql.planner.logical.LogicalLimit; diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScan.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScan.java index e9d3fd52d39..3667a3ffdfc 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScan.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScan.java @@ -5,31 +5,42 @@ package org.opensearch.sql.opensearch.storage.scan; +import java.io.IOException; +import java.io.ObjectInput; +import java.io.ObjectOutput; import java.util.Collections; import 
java.util.Iterator; import lombok.EqualsAndHashCode; +import lombok.Getter; import lombok.ToString; import org.apache.commons.lang3.NotImplementedException; import org.opensearch.sql.data.model.ExprValue; +import org.opensearch.sql.exception.NoCursorException; +import org.opensearch.sql.executor.pagination.PlanSerializer; import org.opensearch.sql.opensearch.client.OpenSearchClient; +import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; +import org.opensearch.sql.opensearch.request.ContinuePageRequestBuilder; import org.opensearch.sql.opensearch.request.OpenSearchRequest; import org.opensearch.sql.opensearch.request.PagedRequestBuilder; import org.opensearch.sql.opensearch.response.OpenSearchResponse; +import org.opensearch.sql.opensearch.storage.OpenSearchIndex; +import org.opensearch.sql.opensearch.storage.OpenSearchStorageEngine; +import org.opensearch.sql.planner.SerializablePlan; import org.opensearch.sql.storage.TableScanOperator; @EqualsAndHashCode(onlyExplicitlyIncluded = true, callSuper = false) @ToString(onlyExplicitlyIncluded = true) -public class OpenSearchPagedIndexScan extends TableScanOperator { - private final OpenSearchClient client; - private final PagedRequestBuilder requestBuilder; +public class OpenSearchPagedIndexScan extends TableScanOperator implements SerializablePlan { + private OpenSearchClient client; + @Getter + private PagedRequestBuilder requestBuilder; @EqualsAndHashCode.Include @ToString.Include private OpenSearchRequest request; private Iterator<ExprValue> iterator; private long totalHits = 0; - public OpenSearchPagedIndexScan(OpenSearchClient client, - PagedRequestBuilder requestBuilder) { + public OpenSearchPagedIndexScan(OpenSearchClient client, PagedRequestBuilder requestBuilder) { this.client = client; this.requestBuilder = requestBuilder; } @@ -73,12 +84,32 @@ public long getTotalHits() { return totalHits; } + /** Don't use; it is for deserialization needs only. 
*/ + @Deprecated + public OpenSearchPagedIndexScan() { + } + @Override - public String toCursor() { - // TODO this assumes exactly one index is scanned. - var indexName = requestBuilder.getIndexName().getIndexNames()[0]; - var cursor = request.toCursor(); - return cursor == null || cursor.isEmpty() - ? "" : createSection("OpenSearchPagedIndexScan", indexName, cursor); + public void readExternal(ObjectInput in) throws IOException, ClassNotFoundException { + var engine = (OpenSearchStorageEngine) ((PlanSerializer.CursorDeserializationStream) in) + .resolveObject("engine"); + var indexName = (String) in.readUTF(); + var scrollId = (String) in.readUTF(); + client = engine.getClient(); + var index = new OpenSearchIndex(client, engine.getSettings(), indexName); + requestBuilder = new ContinuePageRequestBuilder( + new OpenSearchRequest.IndexName(indexName), + scrollId, engine.getSettings(), + new OpenSearchExprValueFactory(index.getFieldOpenSearchTypes())); + } + + @Override + public void writeExternal(ObjectOutput out) throws IOException { + if (request.toCursor() == null || request.toCursor().isEmpty()) { + throw new NoCursorException(); + } + + out.writeUTF(requestBuilder.getIndexName().toString()); + out.writeUTF(request.toCursor()); } } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/ExpressionScriptEngine.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/ExpressionScriptEngine.java index 9e8b47f6b05..855aae645d2 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/ExpressionScriptEngine.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/ExpressionScriptEngine.java @@ -16,9 +16,9 @@ import org.opensearch.script.ScriptContext; import org.opensearch.script.ScriptEngine; import org.opensearch.sql.expression.Expression; -import org.opensearch.sql.expression.serialization.ExpressionSerializer; import 
org.opensearch.sql.opensearch.storage.script.aggregation.ExpressionAggregationScriptFactory; import org.opensearch.sql.opensearch.storage.script.filter.ExpressionFilterScriptFactory; +import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; /** * Custom expression script engine that supports using core engine expression code in DSL diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/AggregationQueryBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/AggregationQueryBuilder.java index bc9741dee51..8b1cb08cfac 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/AggregationQueryBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/AggregationQueryBuilder.java @@ -29,7 +29,6 @@ import org.opensearch.sql.expression.NamedExpression; import org.opensearch.sql.expression.ReferenceExpression; import org.opensearch.sql.expression.aggregation.NamedAggregator; -import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; import org.opensearch.sql.opensearch.response.agg.CompositeAggregationParser; import org.opensearch.sql.opensearch.response.agg.MetricParser; @@ -37,6 +36,7 @@ import org.opensearch.sql.opensearch.response.agg.OpenSearchAggregationResponseParser; import org.opensearch.sql.opensearch.storage.script.aggregation.dsl.BucketAggregationBuilder; import org.opensearch.sql.opensearch.storage.script.aggregation.dsl.MetricAggregationBuilder; +import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; /** * Build the AggregationBuilder from the list of {@link NamedAggregator} diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/AggregationBuilderHelper.java 
b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/AggregationBuilderHelper.java index 83dd9276326..156b565976a 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/AggregationBuilderHelper.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/AggregationBuilderHelper.java @@ -17,8 +17,8 @@ import org.opensearch.sql.expression.FunctionExpression; import org.opensearch.sql.expression.LiteralExpression; import org.opensearch.sql.expression.ReferenceExpression; -import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.opensearch.data.type.OpenSearchTextType; +import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; /** * Abstract Aggregation Builder. diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/BucketAggregationBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/BucketAggregationBuilder.java index 215be3b3565..1a6a82be966 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/BucketAggregationBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/BucketAggregationBuilder.java @@ -23,8 +23,8 @@ import org.opensearch.search.sort.SortOrder; import org.opensearch.sql.ast.expression.SpanUnit; import org.opensearch.sql.expression.NamedExpression; -import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.expression.span.SpanExpression; +import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; /** * Bucket Aggregation Builder. 
diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/MetricAggregationBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/MetricAggregationBuilder.java index db8d1fdf1eb..5e7d34abce0 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/MetricAggregationBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/MetricAggregationBuilder.java @@ -25,13 +25,13 @@ import org.opensearch.sql.expression.LiteralExpression; import org.opensearch.sql.expression.ReferenceExpression; import org.opensearch.sql.expression.aggregation.NamedAggregator; -import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.opensearch.response.agg.FilterParser; import org.opensearch.sql.opensearch.response.agg.MetricParser; import org.opensearch.sql.opensearch.response.agg.SingleValueParser; import org.opensearch.sql.opensearch.response.agg.StatsParser; import org.opensearch.sql.opensearch.response.agg.TopHitsParser; import org.opensearch.sql.opensearch.storage.script.filter.FilterQueryBuilder; +import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; /** * Build the Metric Aggregation and List of {@link MetricParser} from {@link NamedAggregator}. 
diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/filter/FilterQueryBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/filter/FilterQueryBuilder.java index a82869ec038..5f36954d4a7 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/filter/FilterQueryBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/filter/FilterQueryBuilder.java @@ -24,7 +24,6 @@ import org.opensearch.sql.expression.FunctionExpression; import org.opensearch.sql.expression.function.BuiltinFunctionName; import org.opensearch.sql.expression.function.FunctionName; -import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.opensearch.storage.script.filter.lucene.LikeQuery; import org.opensearch.sql.opensearch.storage.script.filter.lucene.LuceneQuery; import org.opensearch.sql.opensearch.storage.script.filter.lucene.RangeQuery; @@ -39,6 +38,7 @@ import org.opensearch.sql.opensearch.storage.script.filter.lucene.relevance.QueryStringQuery; import org.opensearch.sql.opensearch.storage.script.filter.lucene.relevance.SimpleQueryStringQuery; import org.opensearch.sql.opensearch.storage.script.filter.lucene.relevance.WildcardQuery; +import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; @RequiredArgsConstructor public class FilterQueryBuilder extends ExpressionNodeVisitor<QueryBuilder, Object> { diff --git a/core/src/main/java/org/opensearch/sql/expression/serialization/DefaultExpressionSerializer.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/serialization/DefaultExpressionSerializer.java similarity index 95% rename from core/src/main/java/org/opensearch/sql/expression/serialization/DefaultExpressionSerializer.java rename to opensearch/src/main/java/org/opensearch/sql/opensearch/storage/serialization/DefaultExpressionSerializer.java index 33c22b2ea5d..dc67da9de5d 100644 --- 
a/core/src/main/java/org/opensearch/sql/expression/serialization/DefaultExpressionSerializer.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/serialization/DefaultExpressionSerializer.java @@ -4,7 +4,7 @@ */ -package org.opensearch.sql.expression.serialization; +package org.opensearch.sql.opensearch.storage.serialization; import java.io.ByteArrayInputStream; import java.io.ByteArrayOutputStream; diff --git a/core/src/main/java/org/opensearch/sql/expression/serialization/ExpressionSerializer.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/serialization/ExpressionSerializer.java similarity index 90% rename from core/src/main/java/org/opensearch/sql/expression/serialization/ExpressionSerializer.java rename to opensearch/src/main/java/org/opensearch/sql/opensearch/storage/serialization/ExpressionSerializer.java index f96921e29c3..b7caeb30f81 100644 --- a/core/src/main/java/org/opensearch/sql/expression/serialization/ExpressionSerializer.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/serialization/ExpressionSerializer.java @@ -4,7 +4,7 @@ */ -package org.opensearch.sql.expression.serialization; +package org.opensearch.sql.opensearch.storage.serialization; import org.opensearch.sql.expression.Expression; diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java index 32f812bfb63..1f13470a439 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java @@ -22,6 +22,9 @@ import static org.opensearch.sql.data.model.ExprValueUtils.tupleValue; import static org.opensearch.sql.executor.ExecutionEngine.QueryResponse; +import java.io.IOException; +import java.io.ObjectInput; +import 
java.io.ObjectOutput; import java.util.ArrayList; import java.util.Arrays; import java.util.Iterator; @@ -49,6 +52,7 @@ import org.opensearch.sql.opensearch.executor.protector.OpenSearchExecutionProtector; import org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder; import org.opensearch.sql.opensearch.storage.scan.OpenSearchIndexScan; +import org.opensearch.sql.planner.SerializablePlan; import org.opensearch.sql.planner.physical.PaginateOperator; import org.opensearch.sql.planner.physical.PhysicalPlan; import org.opensearch.sql.storage.TableScanOperator; @@ -293,20 +297,23 @@ public ExprValue next() { public ExecutionEngine.Schema schema() { return input.schema(); } - - @Override - public String toCursor() { - return "FakePaginatePlan"; - } } @RequiredArgsConstructor - private static class FakePhysicalPlan extends TableScanOperator { + private static class FakePhysicalPlan extends TableScanOperator implements SerializablePlan { private final Iterator<ExprValue> it; private boolean hasOpen; private boolean hasClosed; private boolean hasSplit; + @Override + public void readExternal(ObjectInput in) throws IOException, ClassNotFoundException { + } + + @Override + public void writeExternal(ObjectOutput out) throws IOException { + } + @Override public void open() { super.open(); diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/ResourceMonitorPlanTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/ResourceMonitorPlanTest.java index 7b1353f4a97..9ff7c093201 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/ResourceMonitorPlanTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/ResourceMonitorPlanTest.java @@ -8,9 +8,11 @@ import static org.junit.jupiter.api.Assertions.assertEquals; import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.mockito.Mockito.mock; import static org.mockito.Mockito.times; import static org.mockito.Mockito.verify; 
import static org.mockito.Mockito.when; +import static org.mockito.Mockito.withSettings; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; @@ -19,6 +21,7 @@ import org.mockito.junit.jupiter.MockitoExtension; import org.opensearch.sql.monitor.ResourceMonitor; import org.opensearch.sql.opensearch.executor.protector.ResourceMonitorPlan; +import org.opensearch.sql.planner.SerializablePlan; import org.opensearch.sql.planner.physical.PhysicalPlan; import org.opensearch.sql.planner.physical.PhysicalPlanNodeVisitor; @@ -115,8 +118,9 @@ void getTotalHitsSuccess() { } @Test - void toCursorSuccess() { - monitorPlan.toCursor(); - verify(plan, times(1)).toCursor(); + void getPlanForSerialization() { + plan = mock(PhysicalPlan.class, withSettings().extraInterfaces(SerializablePlan.class)); + monitorPlan = new ResourceMonitorPlan(plan, resourceMonitor); + assertEquals(plan, monitorPlan.getPlanForSerialization()); } } diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngineTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngineTest.java index 6a8727e0fbc..1089e7e2520 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngineTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchStorageEngineTest.java @@ -7,17 +7,11 @@ package org.opensearch.sql.opensearch.storage; import static org.junit.jupiter.api.Assertions.assertAll; -import static org.junit.jupiter.api.Assertions.assertFalse; import static org.junit.jupiter.api.Assertions.assertNotNull; import static org.junit.jupiter.api.Assertions.assertTrue; -import static org.mockito.ArgumentMatchers.any; -import static org.mockito.ArgumentMatchers.anyString; -import static org.mockito.Mockito.mock; -import static org.mockito.Mockito.when; import static org.opensearch.sql.analysis.DataSourceSchemaIdentifierNameResolver.DEFAULT_DATASOURCE_NAME; import 
static org.opensearch.sql.utils.SystemIndexUtils.TABLE_INFO; -import java.util.Map; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; @@ -25,8 +19,6 @@ import org.opensearch.sql.DataSourceSchemaName; import org.opensearch.sql.common.setting.Settings; import org.opensearch.sql.opensearch.client.OpenSearchClient; -import org.opensearch.sql.opensearch.response.OpenSearchResponse; -import org.opensearch.sql.opensearch.storage.scan.OpenSearchPagedIndexScan; import org.opensearch.sql.opensearch.storage.system.OpenSearchSystemIndex; import org.opensearch.sql.storage.Table; @@ -60,21 +52,4 @@ public void getSystemTable() { () -> assertTrue(table instanceof OpenSearchSystemIndex) ); } - - @Test - public void getTableScan() { - when(client.getIndexMappings(anyString())).thenReturn(Map.of()); - OpenSearchResponse response = mock(); - when(response.isEmpty()).thenReturn(true); - when(client.search(any())).thenReturn(response); - OpenSearchStorageEngine engine = new OpenSearchStorageEngine(client, settings); - var scan = engine.getTableScan("test", "test"); - assertAll( - () -> assertTrue(scan instanceof OpenSearchPagedIndexScan), - () -> { - scan.open(); - assertFalse(scan.hasNext()); - } - ); - } } diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScanTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScanTest.java index 38888115c91..cd941540126 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScanTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchPagedIndexScanTest.java @@ -22,6 +22,12 @@ import static org.opensearch.sql.opensearch.storage.scan.OpenSearchIndexScanTest.mockResponse; import com.google.common.collect.ImmutableMap; +import java.io.ByteArrayInputStream; +import java.io.ByteArrayOutputStream; +import 
java.io.ObjectInputStream; +import java.io.ObjectOutputStream; +import java.util.Map; +import lombok.SneakyThrows; import org.junit.jupiter.api.DisplayNameGeneration; import org.junit.jupiter.api.DisplayNameGenerator; import org.junit.jupiter.api.Test; @@ -29,6 +35,8 @@ import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; import org.opensearch.sql.data.model.ExprValue; +import org.opensearch.sql.exception.NoCursorException; +import org.opensearch.sql.executor.pagination.PlanSerializer; import org.opensearch.sql.opensearch.client.OpenSearchClient; import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; @@ -37,6 +45,7 @@ import org.opensearch.sql.opensearch.request.OpenSearchRequest; import org.opensearch.sql.opensearch.request.PagedRequestBuilder; import org.opensearch.sql.opensearch.response.OpenSearchResponse; +import org.opensearch.sql.opensearch.storage.OpenSearchStorageEngine; @ExtendWith(MockitoExtension.class) @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) @@ -144,21 +153,63 @@ void explain_not_implemented() { } @Test - void toCursor() { + @SneakyThrows + void serialization() { PagedRequestBuilder builder = mock(); OpenSearchRequest request = mock(); OpenSearchResponse response = mock(); + when(request.toCursor()).thenReturn("cu-cursor"); when(builder.build()).thenReturn(request); - when(builder.getIndexName()).thenReturn(new OpenSearchRequest.IndexName("index")); - when(client.search(request)).thenReturn(response); - when(response.isEmpty()).thenReturn(true); - when(request.toCursor()).thenReturn("cu-cursor", "", null); + var indexName = new OpenSearchRequest.IndexName("index"); + when(builder.getIndexName()).thenReturn(indexName); + when(client.search(any())).thenReturn(response); OpenSearchPagedIndexScan indexScan = new OpenSearchPagedIndexScan(client, builder); indexScan.open(); - assertAll( - () -> 
assertEquals("(OpenSearchPagedIndexScan,index,cu-cursor)", indexScan.toCursor()), - () -> assertEquals("", indexScan.toCursor()), - () -> assertEquals("", indexScan.toCursor()) - ); + + ByteArrayOutputStream output = new ByteArrayOutputStream(); + ObjectOutputStream objectOutput = new ObjectOutputStream(output); + objectOutput.writeObject(indexScan); + objectOutput.flush(); + + when(client.getIndexMappings(any())).thenReturn(Map.of()); + OpenSearchStorageEngine engine = mock(); + when(engine.getClient()).thenReturn(client); + when(engine.getSettings()).thenReturn(mock()); + ObjectInputStream objectInput = new PlanSerializer(engine) + .getCursorDeserializationStream(new ByteArrayInputStream(output.toByteArray())); + var roundTripScan = (OpenSearchPagedIndexScan) objectInput.readObject(); + roundTripScan.open(); + + // indexScan's request could be an OpenSearchScrollRequest or a ContinuePageRequest, but + // roundTripScan's request is always a ContinuePageRequest. + // Thus, we can't compare those scans directly: + //assertEquals(indexScan, roundTripScan); + // But we can validate that the index name and scroll ID were serialized and deserialized correctly + assertEquals(indexName, roundTripScan.getRequestBuilder().getIndexName()); + assertTrue(roundTripScan.getRequestBuilder() instanceof ContinuePageRequestBuilder); + assertEquals("cu-cursor", + ((ContinuePageRequestBuilder) roundTripScan.getRequestBuilder()).getScrollId()); + } + + @Test + @SneakyThrows + void dont_serialize_if_no_cursor() { + PagedRequestBuilder builder = mock(); + OpenSearchRequest request = mock(); + OpenSearchResponse response = mock(); + when(builder.build()).thenReturn(request); + when(client.search(any())).thenReturn(response); + OpenSearchPagedIndexScan indexScan = new OpenSearchPagedIndexScan(client, builder); + indexScan.open(); + + when(request.toCursor()).thenReturn(null, ""); + for (int i = 0; i < 2; i++) { + assertThrows(NoCursorException.class, () -> { + ByteArrayOutputStream output = new 
ByteArrayOutputStream(); + ObjectOutputStream objectOutput = new ObjectOutputStream(output); + objectOutput.writeObject(indexScan); + objectOutput.flush(); + }); + } } } diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/ExpressionScriptEngineTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/ExpressionScriptEngineTest.java index a88d81c0201..3d497c2f5b7 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/ExpressionScriptEngineTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/ExpressionScriptEngineTest.java @@ -27,8 +27,8 @@ import org.opensearch.script.ScriptEngine; import org.opensearch.sql.expression.DSL; import org.opensearch.sql.expression.Expression; -import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.opensearch.storage.script.filter.ExpressionFilterScriptFactory; +import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) @ExtendWith(MockitoExtension.class) diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/AggregationQueryBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/AggregationQueryBuilderTest.java index 474aba14206..e771e01bce6 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/AggregationQueryBuilderTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/AggregationQueryBuilderTest.java @@ -51,9 +51,9 @@ import org.opensearch.sql.expression.aggregation.AvgAggregator; import org.opensearch.sql.expression.aggregation.CountAggregator; import org.opensearch.sql.expression.aggregation.NamedAggregator; -import org.opensearch.sql.expression.serialization.ExpressionSerializer; import 
org.opensearch.sql.opensearch.data.type.OpenSearchDataType; import org.opensearch.sql.opensearch.data.type.OpenSearchTextType; +import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) @ExtendWith(MockitoExtension.class) diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/BucketAggregationBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/BucketAggregationBuilderTest.java index eaeacd09ef0..f93c69de280 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/BucketAggregationBuilderTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/BucketAggregationBuilderTest.java @@ -46,9 +46,9 @@ import org.opensearch.sql.expression.DSL; import org.opensearch.sql.expression.NamedExpression; import org.opensearch.sql.expression.parse.ParseExpression; -import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; import org.opensearch.sql.opensearch.data.type.OpenSearchTextType; +import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) @ExtendWith(MockitoExtension.class) diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/MetricAggregationBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/MetricAggregationBuilderTest.java index d8e81026b68..94f152f9132 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/MetricAggregationBuilderTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/aggregation/dsl/MetricAggregationBuilderTest.java @@ -43,7 +43,7 @@ import 
org.opensearch.sql.expression.aggregation.SumAggregator; import org.opensearch.sql.expression.aggregation.TakeAggregator; import org.opensearch.sql.expression.function.FunctionName; -import org.opensearch.sql.expression.serialization.ExpressionSerializer; +import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) @ExtendWith(MockitoExtension.class) diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/filter/FilterQueryBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/filter/FilterQueryBuilderTest.java index 3b7865aa463..96245909a48 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/filter/FilterQueryBuilderTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/script/filter/FilterQueryBuilderTest.java @@ -53,9 +53,9 @@ import org.opensearch.sql.expression.FunctionExpression; import org.opensearch.sql.expression.LiteralExpression; import org.opensearch.sql.expression.ReferenceExpression; -import org.opensearch.sql.expression.serialization.ExpressionSerializer; import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; import org.opensearch.sql.opensearch.data.type.OpenSearchTextType; +import org.opensearch.sql.opensearch.storage.serialization.ExpressionSerializer; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) @ExtendWith(MockitoExtension.class) diff --git a/core/src/test/java/org/opensearch/sql/expression/serialization/DefaultExpressionSerializerTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/serialization/DefaultExpressionSerializerTest.java similarity index 94% rename from core/src/test/java/org/opensearch/sql/expression/serialization/DefaultExpressionSerializerTest.java rename to opensearch/src/test/java/org/opensearch/sql/opensearch/storage/serialization/DefaultExpressionSerializerTest.java 
index 53a89d5421a..72a319dbfe6 100644 --- a/core/src/test/java/org/opensearch/sql/expression/serialization/DefaultExpressionSerializerTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/serialization/DefaultExpressionSerializerTest.java @@ -21,8 +21,6 @@ import org.opensearch.sql.expression.Expression; import org.opensearch.sql.expression.ExpressionNodeVisitor; import org.opensearch.sql.expression.env.Environment; -import org.opensearch.sql.expression.serialization.DefaultExpressionSerializer; -import org.opensearch.sql.expression.serialization.ExpressionSerializer; @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) class DefaultExpressionSerializerTest { diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/SQLPlugin.java b/plugin/src/main/java/org/opensearch/sql/plugin/SQLPlugin.java index 1439ed0e25f..3d733233be5 100644 --- a/plugin/src/main/java/org/opensearch/sql/plugin/SQLPlugin.java +++ b/plugin/src/main/java/org/opensearch/sql/plugin/SQLPlugin.java @@ -51,7 +51,6 @@ import org.opensearch.sql.datasource.DataSourceService; import org.opensearch.sql.datasource.DataSourceServiceImpl; import org.opensearch.sql.datasource.DataSourceUserAuthorizationHelper; -import org.opensearch.sql.expression.serialization.DefaultExpressionSerializer; import org.opensearch.sql.legacy.esdomain.LocalClusterState; import org.opensearch.sql.legacy.executor.AsyncRestExecutor; import org.opensearch.sql.legacy.metrics.Metrics; @@ -62,6 +61,7 @@ import org.opensearch.sql.opensearch.setting.OpenSearchSettings; import org.opensearch.sql.opensearch.storage.OpenSearchDataSourceFactory; import org.opensearch.sql.opensearch.storage.script.ExpressionScriptEngine; +import org.opensearch.sql.opensearch.storage.serialization.DefaultExpressionSerializer; import org.opensearch.sql.plugin.config.OpenSearchPluginModule; import org.opensearch.sql.plugin.datasource.DataSourceSettings; import 
org.opensearch.sql.plugin.datasource.DataSourceUserAuthorizationHelperImpl; diff --git a/protocol/src/test/java/org/opensearch/sql/protocol/response/format/JdbcResponseFormatterTest.java b/protocol/src/test/java/org/opensearch/sql/protocol/response/format/JdbcResponseFormatterTest.java index b5cb5984a17..047e297c266 100644 --- a/protocol/src/test/java/org/opensearch/sql/protocol/response/format/JdbcResponseFormatterTest.java +++ b/protocol/src/test/java/org/opensearch/sql/protocol/response/format/JdbcResponseFormatterTest.java @@ -97,7 +97,7 @@ void format_response_with_cursor() { .put("address", "Seattle") .put("age", 20) .build())), - new Cursor("test_cursor".getBytes()), 42); + new Cursor("test_cursor"), 42); assertJsonEquals( "{" From e42bcd4576ec7ab689c38b15d5682a75980d413b Mon Sep 17 00:00:00 2001 From: Yury-Fridlyand Date: Wed, 12 Apr 2023 17:25:09 -0700 Subject: [PATCH 09/17] Resolve merge conflicts and fix tests. Signed-off-by: Yury-Fridlyand --- common/build.gradle | 1 - config/checkstyle/google_checks.xml | 3 +- .../org/opensearch/sql/analysis/Analyzer.java | 15 + .../sql/analysis/ExpressionAnalyzer.java | 106 ++- .../ExpressionReferenceOptimizer.java | 14 +- .../sql/analysis/NestedAnalyzer.java | 111 +++ .../sql/analysis/TypeEnvironment.java | 25 +- .../sql/ast/AbstractNodeVisitor.java | 5 + .../org/opensearch/sql/ast/dsl/AstDSL.java | 7 +- .../sql/ast/expression/ScoreFunction.java | 36 + .../sql/datasource/DataSourceService.java | 14 +- .../org/opensearch/sql/executor/Explain.java | 7 + .../org/opensearch/sql/expression/DSL.java | 18 +- .../sql/expression/ReferenceExpression.java | 8 +- .../function/BuiltinFunctionName.java | 8 + .../function/OpenSearchFunctions.java | 54 +- .../sql/planner/DefaultImplementor.java | 7 + .../sql/planner/logical/LogicalNested.java | 49 ++ .../sql/planner/logical/LogicalPlanDSL.java | 8 + .../logical/LogicalPlanNodeVisitor.java | 4 + .../optimizer/LogicalPlanOptimizer.java | 1 + .../planner/optimizer/pattern/Patterns.java 
| 8 + .../rule/read/TableScanPushDown.java | 6 + .../sql/planner/physical/NestedOperator.java | 291 ++++++++ .../sql/planner/physical/PhysicalPlanDSL.java | 8 + .../physical/PhysicalPlanNodeVisitor.java | 4 + .../datasource/DataSourceTableScan.java | 2 +- .../sql/storage/DataSourceFactory.java | 1 + .../org/opensearch/sql/storage/Table.java | 7 + .../sql/storage/read/TableScanBuilder.java | 12 + .../opensearch/sql/analysis/AnalyzerTest.java | 395 +++++++++++ .../sql/analysis/AnalyzerTestBase.java | 11 +- .../sql/analysis/ExpressionAnalyzerTest.java | 175 ++++- .../org/opensearch/sql/config/TestConfig.java | 5 + .../opensearch/sql/executor/ExplainTest.java | 71 +- .../expression/ReferenceExpressionTest.java | 36 +- .../function/OpenSearchFunctionsTest.java | 20 +- .../sql/planner/DefaultImplementorTest.java | 112 +-- .../logical/LogicalPlanNodeVisitorTest.java | 16 +- .../optimizer/LogicalPlanOptimizerTest.java | 23 + .../planner/physical/NestedOperatorTest.java | 347 +++++++++ .../physical/PhysicalPlanNodeVisitorTest.java | 10 + .../planner/physical/ProjectOperatorTest.java | 2 +- .../datasource/DataSourceTableScanTest.java | 2 +- datasources/build.gradle | 82 +++ datasources/lombok.config | 3 + .../datasources}/auth/AuthenticationType.java | 2 +- .../DataSourceUserAuthorizationHelper.java | 8 +- ...DataSourceUserAuthorizationHelperImpl.java | 11 +- .../sql/datasources}/encryptor/Encryptor.java | 2 +- .../datasources}/encryptor/EncryptorImpl.java | 10 +- .../DataSourceNotFoundException.java | 18 + .../datasources/exceptions/ErrorMessage.java | 78 ++ .../CreateDataSourceActionRequest.java | 2 +- .../CreateDataSourceActionResponse.java | 2 +- .../DeleteDataSourceActionRequest.java | 51 ++ .../DeleteDataSourceActionResponse.java | 33 + .../transport/GetDataSourceActionRequest.java | 49 ++ .../GetDataSourceActionResponse.java | 33 + .../UpdateDataSourceActionRequest.java | 47 ++ .../UpdateDataSourceActionResponse.java | 33 + .../rest/RestDataSourceQueryAction.java | 
249 +++++++ .../service/DataSourceLoaderCache.java | 20 + .../service/DataSourceLoaderCacheImpl.java | 50 ++ .../service}/DataSourceMetadataStorage.java | 3 +- .../service}/DataSourceServiceImpl.java | 89 ++- .../settings}/DataSourceSettings.java | 2 +- .../OpenSearchDataSourceMetadataStorage.java | 130 +++- .../TransportCreateDataSourceAction.java | 39 +- .../TransportDeleteDataSourceAction.java | 59 ++ .../TransportGetDataSourceAction.java | 95 +++ .../TransportUpdateDataSourceAction.java | 59 ++ .../sql/datasources}/utils/Scheduler.java | 2 +- .../utils/XContentParserUtils.java | 9 +- .../resources/datasources-index-mapping.yml | 0 .../resources/datasources-index-settings.yml | 0 .../auth/AuthenticationTypeTest.java | 2 +- ...SourceUserAuthorizationHelperImplTest.java | 46 +- .../encryptor/EncryptorImplTest.java | 87 +++ .../DataSourceLoaderCacheImplTest.java | 85 +++ .../service}/DataSourceServiceImplTest.java | 136 +++- ...enSearchDataSourceMetadataStorageTest.java | 670 ++++++++++++++++++ .../TransportCreateDataSourceActionTest.java | 86 +++ .../TransportDeleteDataSourceActionTest.java | 78 ++ .../TransportGetDataSourceActionTest.java | 137 ++++ .../TransportUpdateDataSourceActionTest.java | 87 +++ .../sql/datasources}/utils/SchedulerTest.java | 26 +- .../utils/XContentParserUtilsTest.java | 101 +++ docs/user/beyond/partiql.rst | 10 +- docs/user/dql/basics.rst | 40 ++ docs/user/dql/functions.rst | 65 ++ docs/user/dql/metadata.rst | 3 +- doctest/test_data/nested_objects.json | 4 + doctest/test_docs.py | 4 +- doctest/test_mapping/nested_objects.json | 47 ++ .../sql/datasource/DataSourceAPIsIT.java | 181 ++++- .../sql/legacy/CsvFormatResponseIT.java | 8 +- .../opensearch/sql/legacy/MethodQueryIT.java | 6 +- .../sql/legacy/ObjectFieldSelectIT.java | 3 +- .../sql/legacy/PrettyFormatResponseIT.java | 7 +- .../sql/legacy/SQLIntegTestCase.java | 15 +- .../opensearch/sql/legacy/TestsConstants.java | 4 + .../org/opensearch/sql/ppl/StandaloneIT.java | 18 +- 
.../org/opensearch/sql/sql/IdentifierIT.java | 65 ++ .../java/org/opensearch/sql/sql/MatchIT.java | 14 + .../java/org/opensearch/sql/sql/NestedIT.java | 260 +++++++ .../org/opensearch/sql/sql/ScoreQueryIT.java | 142 ++++ .../sql/sql/StandalonePaginationIT.java | 2 +- .../indexDefinitions/multi_nested.json | 42 ++ .../test/resources/multi_nested_objects.json | 10 + .../nested_objects_without_arrays.json | 10 + .../src/test/resources/nested_with_nulls.json | 24 + .../value/OpenSearchExprValueFactory.java | 28 +- .../OpenSearchExecutionProtector.java | 10 + .../request/ContinuePageRequest.java | 3 +- .../request/ContinuePageRequestBuilder.java | 10 + .../request/InitialPageRequestBuilder.java | 10 + .../request/OpenSearchQueryRequest.java | 12 +- .../request/OpenSearchRequestBuilder.java | 92 ++- .../request/OpenSearchScrollRequest.java | 9 +- .../request/PushDownRequestBuilder.java | 4 + .../response/OpenSearchResponse.java | 68 +- .../opensearch/storage/OpenSearchIndex.java | 29 +- .../storage/scan/OpenSearchIndexScan.java | 25 +- .../scan/OpenSearchIndexScanBuilder.java | 6 + .../scan/OpenSearchIndexScanQueryBuilder.java | 32 +- .../storage/script/sort/SortQueryBuilder.java | 3 + .../client/OpenSearchNodeClientTest.java | 3 +- .../client/OpenSearchRestClientTest.java | 3 +- .../value/OpenSearchExprValueFactoryTest.java | 19 + .../OpenSearchExecutionEngineTest.java | 5 +- .../OpenSearchExecutionProtectorTest.java | 37 +- .../ContinuePageRequestBuilderTest.java | 7 +- .../InitialPageRequestBuilderTest.java | 7 +- .../request/OpenSearchQueryRequestTest.java | 61 ++ .../request/OpenSearchRequestBuilderTest.java | 114 ++- .../request/OpenSearchScrollRequestTest.java | 86 +++ .../response/OpenSearchResponseTest.java | 174 ++++- .../storage/OpenSearchIndexTest.java | 35 +- .../OpenSearchIndexScanOptimizationTest.java | 184 +++++ .../storage/scan/OpenSearchIndexScanTest.java | 36 +- plugin/build.gradle | 4 +- .../org/opensearch/sql/plugin/SQLPlugin.java | 71 +- 
.../rest/RestDataSourceQueryAction.java | 131 ---- .../transport/TransportPPLQueryAction.java | 2 +- ...enSearchDataSourceMetadataStorageTest.java | 220 ------ prometheus/build.gradle | 1 + .../storage/PrometheusStorageFactory.java | 40 +- .../storage/PrometheusStorageFactoryTest.java | 19 + settings.gradle | 1 + sql/src/main/antlr/OpenSearchSQLLexer.g4 | 8 +- sql/src/main/antlr/OpenSearchSQLParser.g4 | 13 + .../sql/sql/parser/AstExpressionBuilder.java | 33 +- .../sql/sql/antlr/SQLSyntaxParserTest.java | 12 + .../sql/sql/parser/AstBuilderTest.java | 9 - .../sql/parser/AstExpressionBuilderTest.java | 51 ++ 156 files changed, 6756 insertions(+), 831 deletions(-) create mode 100644 core/src/main/java/org/opensearch/sql/analysis/NestedAnalyzer.java create mode 100644 core/src/main/java/org/opensearch/sql/ast/expression/ScoreFunction.java create mode 100644 core/src/main/java/org/opensearch/sql/planner/logical/LogicalNested.java create mode 100644 core/src/main/java/org/opensearch/sql/planner/physical/NestedOperator.java create mode 100644 core/src/test/java/org/opensearch/sql/planner/physical/NestedOperatorTest.java create mode 100644 datasources/build.gradle create mode 100644 datasources/lombok.config rename {core/src/main/java/org/opensearch/sql/datasource/model => datasources/src/main/java/org/opensearch/sql/datasources}/auth/AuthenticationType.java (94%) rename {core/src/main/java/org/opensearch/sql/datasource => datasources/src/main/java/org/opensearch/sql/datasources/auth}/DataSourceUserAuthorizationHelper.java (79%) rename {plugin/src/main/java/org/opensearch/sql/plugin/datasource => datasources/src/main/java/org/opensearch/sql/datasources/auth}/DataSourceUserAuthorizationHelperImpl.java (80%) rename {common/src/main/java/org/opensearch/sql/common => datasources/src/main/java/org/opensearch/sql/datasources}/encryptor/Encryptor.java (90%) rename {common/src/main/java/org/opensearch/sql/common => 
datasources/src/main/java/org/opensearch/sql/datasources}/encryptor/EncryptorImpl.java (87%) create mode 100644 datasources/src/main/java/org/opensearch/sql/datasources/exceptions/DataSourceNotFoundException.java create mode 100644 datasources/src/main/java/org/opensearch/sql/datasources/exceptions/ErrorMessage.java rename {plugin/src/main/java/org/opensearch/sql/plugin/model => datasources/src/main/java/org/opensearch/sql/datasources/model/transport}/CreateDataSourceActionRequest.java (96%) rename {plugin/src/main/java/org/opensearch/sql/plugin/model => datasources/src/main/java/org/opensearch/sql/datasources/model/transport}/CreateDataSourceActionResponse.java (92%) create mode 100644 datasources/src/main/java/org/opensearch/sql/datasources/model/transport/DeleteDataSourceActionRequest.java create mode 100644 datasources/src/main/java/org/opensearch/sql/datasources/model/transport/DeleteDataSourceActionResponse.java create mode 100644 datasources/src/main/java/org/opensearch/sql/datasources/model/transport/GetDataSourceActionRequest.java create mode 100644 datasources/src/main/java/org/opensearch/sql/datasources/model/transport/GetDataSourceActionResponse.java create mode 100644 datasources/src/main/java/org/opensearch/sql/datasources/model/transport/UpdateDataSourceActionRequest.java create mode 100644 datasources/src/main/java/org/opensearch/sql/datasources/model/transport/UpdateDataSourceActionResponse.java create mode 100644 datasources/src/main/java/org/opensearch/sql/datasources/rest/RestDataSourceQueryAction.java create mode 100644 datasources/src/main/java/org/opensearch/sql/datasources/service/DataSourceLoaderCache.java create mode 100644 datasources/src/main/java/org/opensearch/sql/datasources/service/DataSourceLoaderCacheImpl.java rename {core/src/main/java/org/opensearch/sql/datasource => datasources/src/main/java/org/opensearch/sql/datasources/service}/DataSourceMetadataStorage.java (95%) rename {core/src/main/java/org/opensearch/sql/datasource => 
datasources/src/main/java/org/opensearch/sql/datasources/service}/DataSourceServiceImpl.java (60%) rename {plugin/src/main/java/org/opensearch/sql/plugin/datasource => datasources/src/main/java/org/opensearch/sql/datasources/settings}/DataSourceSettings.java (92%) rename {plugin/src/main/java/org/opensearch/sql/plugin/datasource => datasources/src/main/java/org/opensearch/sql/datasources/storage}/OpenSearchDataSourceMetadataStorage.java (63%) rename {plugin/src/main/java/org/opensearch/sql/plugin/transport/datasource => datasources/src/main/java/org/opensearch/sql/datasources/transport}/TransportCreateDataSourceAction.java (57%) create mode 100644 datasources/src/main/java/org/opensearch/sql/datasources/transport/TransportDeleteDataSourceAction.java create mode 100644 datasources/src/main/java/org/opensearch/sql/datasources/transport/TransportGetDataSourceAction.java create mode 100644 datasources/src/main/java/org/opensearch/sql/datasources/transport/TransportUpdateDataSourceAction.java rename {plugin/src/main/java/org/opensearch/sql/plugin => datasources/src/main/java/org/opensearch/sql/datasources}/utils/Scheduler.java (95%) rename {plugin/src/main/java/org/opensearch/sql/plugin => datasources/src/main/java/org/opensearch/sql/datasources}/utils/XContentParserUtils.java (94%) rename {plugin => datasources}/src/main/resources/datasources-index-mapping.yml (100%) rename {plugin => datasources}/src/main/resources/datasources-index-settings.yml (100%) rename {core/src/test/java/org/opensearch/sql/datasource/model => datasources/src/test/java/org/opensearch/sql/datasources}/auth/AuthenticationTypeTest.java (93%) rename {plugin/src/test/java/org/opensearch/sql/plugin/datasource => datasources/src/test/java/org/opensearch/sql/datasources/auth}/DataSourceUserAuthorizationHelperImplTest.java (61%) create mode 100644 datasources/src/test/java/org/opensearch/sql/datasources/encryptor/EncryptorImplTest.java create mode 100644 
datasources/src/test/java/org/opensearch/sql/datasources/service/DataSourceLoaderCacheImplTest.java rename {core/src/test/java/org/opensearch/sql/datasource => datasources/src/test/java/org/opensearch/sql/datasources/service}/DataSourceServiceImplTest.java (62%) create mode 100644 datasources/src/test/java/org/opensearch/sql/datasources/storage/OpenSearchDataSourceMetadataStorageTest.java create mode 100644 datasources/src/test/java/org/opensearch/sql/datasources/transport/TransportCreateDataSourceActionTest.java create mode 100644 datasources/src/test/java/org/opensearch/sql/datasources/transport/TransportDeleteDataSourceActionTest.java create mode 100644 datasources/src/test/java/org/opensearch/sql/datasources/transport/TransportGetDataSourceActionTest.java create mode 100644 datasources/src/test/java/org/opensearch/sql/datasources/transport/TransportUpdateDataSourceActionTest.java rename {plugin/src/test/java/org/opensearch/sql/plugin => datasources/src/test/java/org/opensearch/sql/datasources}/utils/SchedulerTest.java (56%) create mode 100644 datasources/src/test/java/org/opensearch/sql/datasources/utils/XContentParserUtilsTest.java create mode 100644 doctest/test_data/nested_objects.json create mode 100644 doctest/test_mapping/nested_objects.json create mode 100644 integ-test/src/test/java/org/opensearch/sql/sql/NestedIT.java create mode 100644 integ-test/src/test/java/org/opensearch/sql/sql/ScoreQueryIT.java create mode 100644 integ-test/src/test/resources/indexDefinitions/multi_nested.json create mode 100644 integ-test/src/test/resources/multi_nested_objects.json create mode 100644 integ-test/src/test/resources/nested_objects_without_arrays.json create mode 100644 integ-test/src/test/resources/nested_with_nulls.json delete mode 100644 plugin/src/main/java/org/opensearch/sql/plugin/rest/RestDataSourceQueryAction.java delete mode 100644 plugin/src/test/java/org/opensearch/sql/plugin/datasource/OpenSearchDataSourceMetadataStorageTest.java diff --git 
a/common/build.gradle b/common/build.gradle index da6b5919615..369a649cde0 100644 --- a/common/build.gradle +++ b/common/build.gradle @@ -36,7 +36,6 @@ dependencies { api group: 'com.google.guava', name: 'guava', version: '31.0.1-jre' api group: 'org.apache.logging.log4j', name: 'log4j-core', version:'2.17.1' api group: 'org.apache.commons', name: 'commons-lang3', version: '3.12.0' - api 'com.amazonaws:aws-encryption-sdk-java:2.4.0' testImplementation group: 'junit', name: 'junit', version: '4.13.2' testImplementation group: 'org.assertj', name: 'assertj-core', version: '3.9.1' diff --git a/config/checkstyle/google_checks.xml b/config/checkstyle/google_checks.xml index a0c7d90fd9b..12c90f8495c 100644 --- a/config/checkstyle/google_checks.xml +++ b/config/checkstyle/google_checks.xml @@ -39,8 +39,9 @@ - + + diff --git a/core/src/main/java/org/opensearch/sql/analysis/Analyzer.java b/core/src/main/java/org/opensearch/sql/analysis/Analyzer.java index 0c1be4319bd..5383d82418e 100644 --- a/core/src/main/java/org/opensearch/sql/analysis/Analyzer.java +++ b/core/src/main/java/org/opensearch/sql/analysis/Analyzer.java @@ -63,6 +63,7 @@ import org.opensearch.sql.ast.tree.Values; import org.opensearch.sql.data.model.ExprMissingValue; import org.opensearch.sql.data.type.ExprCoreType; +import org.opensearch.sql.data.type.ExprType; import org.opensearch.sql.datasource.DataSourceService; import org.opensearch.sql.exception.SemanticCheckException; import org.opensearch.sql.expression.DSL; @@ -152,6 +153,9 @@ public LogicalPlan visitRelation(Relation node, AnalysisContext context) { dataSourceSchemaIdentifierNameResolver.getIdentifierName()); } table.getFieldTypes().forEach((k, v) -> curEnv.define(new Symbol(Namespace.FIELD_NAME, k), v)); + table.getReservedFieldTypes().forEach( + (k, v) -> curEnv.addReservedWord(new Symbol(Namespace.FIELD_NAME, k), v) + ); // Put index name or its alias in index namespace on type environment so qualifier // can be removed when analyzing qualified 
name. The value (expr type) here doesn't matter. @@ -195,6 +199,9 @@ public LogicalPlan visitTableFunction(TableFunction node, AnalysisContext contex TypeEnvironment curEnv = context.peek(); Table table = tableFunctionImplementation.applyArguments(); table.getFieldTypes().forEach((k, v) -> curEnv.define(new Symbol(Namespace.FIELD_NAME, k), v)); + table.getReservedFieldTypes().forEach( + (k, v) -> curEnv.addReservedWord(new Symbol(Namespace.FIELD_NAME, k), v) + ); curEnv.define(new Symbol(Namespace.INDEX_NAME, dataSourceSchemaIdentifierNameResolver.getIdentifierName()), STRUCT); return new LogicalRelation(dataSourceSchemaIdentifierNameResolver.getIdentifierName(), @@ -361,6 +368,14 @@ public LogicalPlan visitProject(Project node, AnalysisContext context) { List namedExpressions = selectExpressionAnalyzer.analyze(node.getProjectList(), context, new ExpressionReferenceOptimizer(expressionAnalyzer.getRepository(), child)); + + for (UnresolvedExpression expr : node.getProjectList()) { + NestedAnalyzer nestedAnalyzer = new NestedAnalyzer( + namedExpressions, expressionAnalyzer, child + ); + child = nestedAnalyzer.analyze(expr, context); + } + // new context context.push(); TypeEnvironment newEnv = context.peek(); diff --git a/core/src/main/java/org/opensearch/sql/analysis/ExpressionAnalyzer.java b/core/src/main/java/org/opensearch/sql/analysis/ExpressionAnalyzer.java index ff3c01d5b8d..43155a868a8 100644 --- a/core/src/main/java/org/opensearch/sql/analysis/ExpressionAnalyzer.java +++ b/core/src/main/java/org/opensearch/sql/analysis/ExpressionAnalyzer.java @@ -8,8 +8,6 @@ import static org.opensearch.sql.ast.dsl.AstDSL.and; import static org.opensearch.sql.ast.dsl.AstDSL.compare; -import static org.opensearch.sql.expression.function.BuiltinFunctionName.GTE; -import static org.opensearch.sql.expression.function.BuiltinFunctionName.LTE; import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; @@ -31,6 +29,7 @@ import 
org.opensearch.sql.ast.expression.Case; import org.opensearch.sql.ast.expression.Cast; import org.opensearch.sql.ast.expression.Compare; +import org.opensearch.sql.ast.expression.DataType; import org.opensearch.sql.ast.expression.EqualTo; import org.opensearch.sql.ast.expression.Field; import org.opensearch.sql.ast.expression.Function; @@ -42,6 +41,7 @@ import org.opensearch.sql.ast.expression.Or; import org.opensearch.sql.ast.expression.QualifiedName; import org.opensearch.sql.ast.expression.RelevanceFieldList; +import org.opensearch.sql.ast.expression.ScoreFunction; import org.opensearch.sql.ast.expression.Span; import org.opensearch.sql.ast.expression.UnresolvedArgument; import org.opensearch.sql.ast.expression.UnresolvedAttribute; @@ -51,6 +51,7 @@ import org.opensearch.sql.ast.expression.Xor; import org.opensearch.sql.common.antlr.SyntaxCheckException; import org.opensearch.sql.data.model.ExprValueUtils; +import org.opensearch.sql.data.type.ExprCoreType; import org.opensearch.sql.data.type.ExprType; import org.opensearch.sql.exception.SemanticCheckException; import org.opensearch.sql.expression.DSL; @@ -67,6 +68,7 @@ import org.opensearch.sql.expression.function.BuiltinFunctionName; import org.opensearch.sql.expression.function.BuiltinFunctionRepository; import org.opensearch.sql.expression.function.FunctionName; +import org.opensearch.sql.expression.function.OpenSearchFunctions; import org.opensearch.sql.expression.parse.ParseExpression; import org.opensearch.sql.expression.span.SpanExpression; import org.opensearch.sql.expression.window.aggregation.AggregateWindowFunction; @@ -207,6 +209,65 @@ public Expression visitHighlightFunction(HighlightFunction node, AnalysisContext return new HighlightExpression(expr); } + /** + * visitScoreFunction removes the score function from the AST and replaces it with the child + * relevance function node. If the optional boost weight is provided, it is multiplied into + * the relevance function's boost argument (an explicit boost argument is added if absent). 
+ * + * @param node score function node + * @param context analysis context for the query + * @return resolved relevance function + */ + public Expression visitScoreFunction(ScoreFunction node, AnalysisContext context) { + Literal boostArg = node.getRelevanceFieldWeight(); + if (!boostArg.getType().equals(DataType.DOUBLE)) { + throw new SemanticCheckException(String.format("Expected boost type '%s' but got '%s'", + DataType.DOUBLE.name(), boostArg.getType().name())); + } + Double thisBoostValue = ((Double) boostArg.getValue()); + + // update the existing unresolved expression to add a boost argument if it doesn't exist + // OR multiply the existing boost argument + Function relevanceQueryUnresolvedExpr = (Function) node.getRelevanceQuery(); + List relevanceFuncArgs = relevanceQueryUnresolvedExpr.getFuncArgs(); + + boolean doesFunctionContainBoostArgument = false; + List updatedFuncArgs = new ArrayList<>(); + for (UnresolvedExpression expr : relevanceFuncArgs) { + String argumentName = ((UnresolvedArgument) expr).getArgName(); + if (argumentName.equalsIgnoreCase("boost")) { + doesFunctionContainBoostArgument = true; + Literal boostArgLiteral = (Literal) ((UnresolvedArgument) expr).getValue(); + Double boostValue = + Double.parseDouble((String) boostArgLiteral.getValue()) * thisBoostValue; + UnresolvedArgument newBoostArg = new UnresolvedArgument( + argumentName, + new Literal(boostValue.toString(), DataType.STRING) + ); + updatedFuncArgs.add(newBoostArg); + } else { + updatedFuncArgs.add(expr); + } + } + + // since nothing was found, add an argument + if (!doesFunctionContainBoostArgument) { + UnresolvedArgument newBoostArg = new UnresolvedArgument( + "boost", new Literal(Double.toString(thisBoostValue), DataType.STRING)); + updatedFuncArgs.add(newBoostArg); + } + + // create a new function expression with boost argument and resolve it + Function updatedRelevanceQueryUnresolvedExpr = new Function( + relevanceQueryUnresolvedExpr.getFuncName(), + updatedFuncArgs); + 
OpenSearchFunctions.OpenSearchFunction relevanceQueryExpr = + (OpenSearchFunctions.OpenSearchFunction) updatedRelevanceQueryUnresolvedExpr + .accept(this, context); + relevanceQueryExpr.setScoreTracked(true); + return relevanceQueryExpr; + } + @Override public Expression visitIn(In node, AnalysisContext context) { return visitIn(node.getField(), node.getValueList(), context); @@ -297,6 +358,23 @@ public Expression visitAllFields(AllFields node, AnalysisContext context) { @Override public Expression visitQualifiedName(QualifiedName node, AnalysisContext context) { QualifierAnalyzer qualifierAnalyzer = new QualifierAnalyzer(context); + + // check for reserved words in the identifier + for (String part : node.getParts()) { + for (TypeEnvironment typeEnv = context.peek(); + typeEnv != null; + typeEnv = typeEnv.getParent()) { + Optional exprType = typeEnv.getReservedSymbolTable().lookup( + new Symbol(Namespace.FIELD_NAME, part)); + if (exprType.isPresent()) { + return visitMetadata( + qualifierAnalyzer.unqualified(node), + (ExprCoreType) exprType.get(), + context + ); + } + } + } return visitIdentifier(qualifierAnalyzer.unqualified(node), context); } @@ -313,6 +391,19 @@ public Expression visitUnresolvedArgument(UnresolvedArgument node, AnalysisConte return new NamedArgumentExpression(node.getArgName(), node.getValue().accept(this, context)); } + /** + * If QualifiedName is actually a reserved metadata field, return the expr type associated + * with the metadata field. 
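The boost-merging rule that `visitScoreFunction` applies above — multiply an existing string-typed `boost` argument by the `score(...)` weight, or append a new `boost` argument when the relevance function had none — can be sketched in isolation. `BoostCombiner` is a hypothetical standalone helper, not part of the patch, working over a plain argument map instead of `UnresolvedArgument` nodes:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of the boost-merging rule in visitScoreFunction:
// boost arguments are carried as string literals, so an existing value is
// parsed, multiplied by the score weight, and re-serialized; a missing
// boost argument is appended carrying the weight itself.
public class BoostCombiner {
    public static Map<String, String> combine(Map<String, String> args, double weight) {
        Map<String, String> updated = new LinkedHashMap<>(args);
        String existing = updated.get("boost");
        if (existing != null) {
            // Existing boost found: multiply and keep the string representation.
            updated.put("boost", Double.toString(Double.parseDouble(existing) * weight));
        } else {
            // Since nothing was found, add an argument (mirrors the patch comment).
            updated.put("boost", Double.toString(weight));
        }
        return updated;
    }
}
```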
+ * @param ident metadata field name + * @param context analysis context + * @return DSL reference + */ + private Expression visitMetadata(String ident, + ExprCoreType exprCoreType, + AnalysisContext context) { + return DSL.ref(ident, exprCoreType); + } + private Expression visitIdentifier(String ident, AnalysisContext context) { // ParseExpression will always override ReferenceExpression when ident conflicts for (NamedExpression expr : context.getNamedParseExpressions()) { @@ -325,17 +416,6 @@ private Expression visitIdentifier(String ident, AnalysisContext context) { ReferenceExpression ref = DSL.ref(ident, typeEnv.resolve(new Symbol(Namespace.FIELD_NAME, ident))); - // Fall back to old engine too if type is not supported semantically - if (isTypeNotSupported(ref.type())) { - throw new SyntaxCheckException(String.format( - "Identifier [%s] of type [%s] is not supported yet", ident, ref.type())); - } return ref; } - - // Array type is not supporte yet. - private boolean isTypeNotSupported(ExprType type) { - return "array".equalsIgnoreCase(type.typeName()); - } - } diff --git a/core/src/main/java/org/opensearch/sql/analysis/ExpressionReferenceOptimizer.java b/core/src/main/java/org/opensearch/sql/analysis/ExpressionReferenceOptimizer.java index f75bcd5a1d8..eaf5c4abca0 100644 --- a/core/src/main/java/org/opensearch/sql/analysis/ExpressionReferenceOptimizer.java +++ b/core/src/main/java/org/opensearch/sql/analysis/ExpressionReferenceOptimizer.java @@ -19,6 +19,7 @@ import org.opensearch.sql.expression.conditional.cases.CaseClause; import org.opensearch.sql.expression.conditional.cases.WhenClause; import org.opensearch.sql.expression.function.BuiltinFunctionRepository; +import org.opensearch.sql.expression.function.OpenSearchFunctions; import org.opensearch.sql.planner.logical.LogicalAggregation; import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.logical.LogicalPlanNodeVisitor; @@ -70,8 +71,17 @@ public Expression 
visitFunction(FunctionExpression node, AnalysisContext context final List args = node.getArguments().stream().map(expr -> expr.accept(this, context)) .collect(Collectors.toList()); - return (Expression) repository.compile(context.getFunctionProperties(), - node.getFunctionName(), args); + Expression optimizedFunctionExpression = (Expression) repository.compile( + context.getFunctionProperties(), + node.getFunctionName(), + args + ); + // Propagate scoreTracked for OpenSearch functions + if (optimizedFunctionExpression instanceof OpenSearchFunctions.OpenSearchFunction) { + ((OpenSearchFunctions.OpenSearchFunction) optimizedFunctionExpression).setScoreTracked( + ((OpenSearchFunctions.OpenSearchFunction)node).isScoreTracked()); + } + return optimizedFunctionExpression; } } diff --git a/core/src/main/java/org/opensearch/sql/analysis/NestedAnalyzer.java b/core/src/main/java/org/opensearch/sql/analysis/NestedAnalyzer.java new file mode 100644 index 00000000000..756c1f20b34 --- /dev/null +++ b/core/src/main/java/org/opensearch/sql/analysis/NestedAnalyzer.java @@ -0,0 +1,111 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.analysis; + +import static org.opensearch.sql.data.type.ExprCoreType.STRING; + +import java.util.ArrayList; +import java.util.Arrays; +import java.util.List; +import java.util.Map; +import lombok.RequiredArgsConstructor; +import org.opensearch.sql.ast.AbstractNodeVisitor; +import org.opensearch.sql.ast.expression.Alias; +import org.opensearch.sql.ast.expression.Function; +import org.opensearch.sql.ast.expression.QualifiedName; +import org.opensearch.sql.ast.expression.UnresolvedExpression; +import org.opensearch.sql.expression.NamedExpression; +import org.opensearch.sql.expression.ReferenceExpression; +import org.opensearch.sql.expression.function.BuiltinFunctionName; +import org.opensearch.sql.planner.logical.LogicalNested; +import org.opensearch.sql.planner.logical.LogicalPlan; + 
+/** + * Analyze the Nested Function in the {@link AnalysisContext} to construct the {@link + * LogicalPlan}. + */ +@RequiredArgsConstructor +public class NestedAnalyzer extends AbstractNodeVisitor { + private final List namedExpressions; + private final ExpressionAnalyzer expressionAnalyzer; + private final LogicalPlan child; + + public LogicalPlan analyze(UnresolvedExpression projectItem, AnalysisContext context) { + LogicalPlan nested = projectItem.accept(this, context); + return (nested == null) ? child : nested; + } + + @Override + public LogicalPlan visitAlias(Alias node, AnalysisContext context) { + return node.getDelegated().accept(this, context); + } + + @Override + public LogicalPlan visitFunction(Function node, AnalysisContext context) { + if (node.getFuncName().equalsIgnoreCase(BuiltinFunctionName.NESTED.name())) { + + List expressions = node.getFuncArgs(); + validateArgs(expressions); + ReferenceExpression nestedField = + (ReferenceExpression)expressionAnalyzer.analyze(expressions.get(0), context); + Map args; + if (expressions.size() == 2) { + args = Map.of( + "field", nestedField, + "path", (ReferenceExpression)expressionAnalyzer.analyze(expressions.get(1), context) + ); + } else { + args = Map.of( + "field", (ReferenceExpression)expressionAnalyzer.analyze(expressions.get(0), context), + "path", generatePath(nestedField.toString()) + ); + } + if (child instanceof LogicalNested) { + ((LogicalNested)child).addFields(args); + return child; + } else { + return new LogicalNested(child, new ArrayList<>(Arrays.asList(args)), namedExpressions); + } + } + return null; + } + + /** + * Validate each parameter used in nested function in SELECT clause. Any supplied parameter + * for a nested function in a SELECT statement must be a valid qualified name, and the field + * parameter must be nested at least one level. + * @param args : Arguments in nested function. 
+ */ + private void validateArgs(List args) { + if (args.size() < 1 || args.size() > 2) { + throw new IllegalArgumentException( + "on nested object only allowed 2 parameters (field,path) or 1 parameter (field)" + ); + } + + for (int i = 0; i < args.size(); i++) { + if (!(args.get(i) instanceof QualifiedName)) { + throw new IllegalArgumentException( + String.format("Illegal nested field name: %s", args.get(i).toString()) + ); + } + if (i == 0 && ((QualifiedName)args.get(i)).getParts().size() < 2) { + throw new IllegalArgumentException( + String.format("Illegal nested field name: %s", args.get(i).toString()) + ); + } + } + } + + /** + * Generate nested path dynamically. Assumes at least one level of nesting in supplied string. + * @param field : Nested field to generate path of. + * @return : Path of field derived from last level of nesting. + */ + private ReferenceExpression generatePath(String field) { + return new ReferenceExpression(field.substring(0, field.lastIndexOf(".")), STRING); + } +} diff --git a/core/src/main/java/org/opensearch/sql/analysis/TypeEnvironment.java b/core/src/main/java/org/opensearch/sql/analysis/TypeEnvironment.java index c86d8109ad0..c9fd8030e05 100644 --- a/core/src/main/java/org/opensearch/sql/analysis/TypeEnvironment.java +++ b/core/src/main/java/org/opensearch/sql/analysis/TypeEnvironment.java @@ -29,14 +29,30 @@ public class TypeEnvironment implements Environment { private final TypeEnvironment parent; private final SymbolTable symbolTable; + @Getter + private final SymbolTable reservedSymbolTable; + + /** + * Constructor with empty symbol tables. + * + * @param parent parent environment + */ public TypeEnvironment(TypeEnvironment parent) { this.parent = parent; this.symbolTable = new SymbolTable(); + this.reservedSymbolTable = new SymbolTable(); } + /** + * Constructor with empty reserved symbol table. 
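The dynamic path derivation in `generatePath` above — keep everything before the last dot of the field name — together with the "nested at least one level" constraint from `validateArgs`, can be sketched as a standalone helper (`NestedPath` is an illustrative name, not part of the patch):

```java
// Standalone sketch of NestedAnalyzer.generatePath: the nested path is the
// dotted field name with its last segment removed, so the field must have at
// least two parts (one level of nesting) for the path to be well-defined.
public class NestedPath {
    public static String generatePath(String field) {
        int lastDot = field.lastIndexOf('.');
        if (lastDot < 0) {
            // Mirrors the validateArgs check on single-part field names.
            throw new IllegalArgumentException("Illegal nested field name: " + field);
        }
        return field.substring(0, lastDot);
    }
}
```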
+ * + * @param parent parent environment + * @param symbolTable type table + */ public TypeEnvironment(TypeEnvironment parent, SymbolTable symbolTable) { this.parent = parent; this.symbolTable = symbolTable; + this.reservedSymbolTable = new SymbolTable(); } /** @@ -59,6 +75,7 @@ public ExprType resolve(Symbol symbol) { /** * Resolve all fields in the current environment. + * * @param namespace a namespace * @return all symbols in the namespace */ @@ -102,7 +119,11 @@ public void remove(ReferenceExpression ref) { * Clear all fields in the current environment. */ public void clearAllFields() { - lookupAllFields(FIELD_NAME).keySet().stream() - .forEach(v -> remove(new Symbol(Namespace.FIELD_NAME, v))); + lookupAllFields(FIELD_NAME).keySet().forEach( + v -> remove(new Symbol(Namespace.FIELD_NAME, v))); + } + + public void addReservedWord(Symbol symbol, ExprType type) { + reservedSymbolTable.store(symbol, type); } } diff --git a/core/src/main/java/org/opensearch/sql/ast/AbstractNodeVisitor.java b/core/src/main/java/org/opensearch/sql/ast/AbstractNodeVisitor.java index adcde61d426..9c283d95f6d 100644 --- a/core/src/main/java/org/opensearch/sql/ast/AbstractNodeVisitor.java +++ b/core/src/main/java/org/opensearch/sql/ast/AbstractNodeVisitor.java @@ -29,6 +29,7 @@ import org.opensearch.sql.ast.expression.Or; import org.opensearch.sql.ast.expression.QualifiedName; import org.opensearch.sql.ast.expression.RelevanceFieldList; +import org.opensearch.sql.ast.expression.ScoreFunction; import org.opensearch.sql.ast.expression.Span; import org.opensearch.sql.ast.expression.UnresolvedArgument; import org.opensearch.sql.ast.expression.UnresolvedAttribute; @@ -279,6 +280,10 @@ public T visitHighlightFunction(HighlightFunction node, C context) { return visitChildren(node, context); } + public T visitScoreFunction(ScoreFunction node, C context) { + return visitChildren(node, context); + } + public T visitStatement(Statement node, C context) { return visit(node, context); } diff --git 
a/core/src/main/java/org/opensearch/sql/ast/dsl/AstDSL.java b/core/src/main/java/org/opensearch/sql/ast/dsl/AstDSL.java index 039b6380f7e..de2ab5404a8 100644 --- a/core/src/main/java/org/opensearch/sql/ast/dsl/AstDSL.java +++ b/core/src/main/java/org/opensearch/sql/ast/dsl/AstDSL.java @@ -34,6 +34,7 @@ import org.opensearch.sql.ast.expression.Or; import org.opensearch.sql.ast.expression.ParseMethod; import org.opensearch.sql.ast.expression.QualifiedName; +import org.opensearch.sql.ast.expression.ScoreFunction; import org.opensearch.sql.ast.expression.Span; import org.opensearch.sql.ast.expression.SpanUnit; import org.opensearch.sql.ast.expression.UnresolvedArgument; @@ -60,7 +61,6 @@ import org.opensearch.sql.ast.tree.TableFunction; import org.opensearch.sql.ast.tree.UnresolvedPlan; import org.opensearch.sql.ast.tree.Values; -import org.opensearch.sql.expression.function.BuiltinFunctionName; /** * Class of static methods to create specific node instances. @@ -285,6 +285,11 @@ public UnresolvedExpression highlight(UnresolvedExpression fieldName, return new HighlightFunction(fieldName, arguments); } + public UnresolvedExpression score(UnresolvedExpression relevanceQuery, + Literal relevanceFieldWeight) { + return new ScoreFunction(relevanceQuery, relevanceFieldWeight); + } + public UnresolvedExpression window(UnresolvedExpression function, List partitionByList, List> sortList) { diff --git a/core/src/main/java/org/opensearch/sql/ast/expression/ScoreFunction.java b/core/src/main/java/org/opensearch/sql/ast/expression/ScoreFunction.java new file mode 100644 index 00000000000..1b73f9bd951 --- /dev/null +++ b/core/src/main/java/org/opensearch/sql/ast/expression/ScoreFunction.java @@ -0,0 +1,36 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.ast.expression; + +import java.util.List; +import lombok.AllArgsConstructor; +import lombok.EqualsAndHashCode; +import lombok.Getter; +import lombok.ToString; 
+import org.opensearch.sql.ast.AbstractNodeVisitor; + +/** + * Expression node of Score function. + * Score takes a relevance-search expression as an argument and returns it. + */ +@AllArgsConstructor +@EqualsAndHashCode(callSuper = false) +@Getter +@ToString +public class ScoreFunction extends UnresolvedExpression { + private final UnresolvedExpression relevanceQuery; + private final Literal relevanceFieldWeight; + + @Override + public T accept(AbstractNodeVisitor nodeVisitor, C context) { + return nodeVisitor.visitScoreFunction(this, context); + } + + @Override + public List getChild() { + return List.of(relevanceQuery); + } +} diff --git a/core/src/main/java/org/opensearch/sql/datasource/DataSourceService.java b/core/src/main/java/org/opensearch/sql/datasource/DataSourceService.java index f621ce5c55a..9167737a70b 100644 --- a/core/src/main/java/org/opensearch/sql/datasource/DataSourceService.java +++ b/core/src/main/java/org/opensearch/sql/datasource/DataSourceService.java @@ -27,9 +27,21 @@ public interface DataSourceService { * Returns all dataSource Metadata objects. The returned objects won't contain * any of the credential info. * + * @param isDefaultDataSourceRequired is used to specify + * if default opensearch connector is required in the output list. * @return set of {@link DataSourceMetadata}. */ - Set getDataSourceMetadataSet(); + Set getDataSourceMetadata(boolean isDefaultDataSourceRequired); + + + /** + * Returns the dataSourceMetadata object with the given name. + * The returned object won't contain any credential info. + * + * @param name name of the {@link DataSource}. + * @return {@link DataSourceMetadata} object. + */ + DataSourceMetadata getDataSourceMetadata(String name); /** * Register {@link DataSource} defined by {@link DataSourceMetadata}. 
diff --git a/core/src/main/java/org/opensearch/sql/executor/Explain.java b/core/src/main/java/org/opensearch/sql/executor/Explain.java index db2f4bdb119..7c16e0b7200 100644 --- a/core/src/main/java/org/opensearch/sql/executor/Explain.java +++ b/core/src/main/java/org/opensearch/sql/executor/Explain.java @@ -23,6 +23,7 @@ import org.opensearch.sql.planner.physical.EvalOperator; import org.opensearch.sql.planner.physical.FilterOperator; import org.opensearch.sql.planner.physical.LimitOperator; +import org.opensearch.sql.planner.physical.NestedOperator; import org.opensearch.sql.planner.physical.PhysicalPlan; import org.opensearch.sql.planner.physical.PhysicalPlanNodeVisitor; import org.opensearch.sql.planner.physical.ProjectOperator; @@ -142,6 +143,12 @@ public ExplainResponseNode visitLimit(LimitOperator node, Object context) { "limit", node.getLimit(), "offset", node.getOffset()))); } + @Override + public ExplainResponseNode visitNested(NestedOperator node, Object context) { + return explain(node, context, explanNode -> explanNode.setDescription(ImmutableMap.of( + "nested", node.getFields()))); + } + protected ExplainResponseNode explain(PhysicalPlan node, Object context, Consumer doExplain) { ExplainResponseNode explainNode = new ExplainResponseNode(getOperatorName(node)); diff --git a/core/src/main/java/org/opensearch/sql/expression/DSL.java b/core/src/main/java/org/opensearch/sql/expression/DSL.java index b866dfb7946..f9ef20e3057 100644 --- a/core/src/main/java/org/opensearch/sql/expression/DSL.java +++ b/core/src/main/java/org/opensearch/sql/expression/DSL.java @@ -624,6 +624,10 @@ public static FunctionExpression xor(Expression... expressions) { return compile(FunctionProperties.None, BuiltinFunctionName.XOR, expressions); } + public static FunctionExpression nested(Expression... expressions) { + return compile(FunctionProperties.None, BuiltinFunctionName.NESTED, expressions); + } + public static FunctionExpression not(Expression... 
expressions) { return compile(FunctionProperties.None, BuiltinFunctionName.NOT, expressions); } @@ -866,7 +870,19 @@ public static FunctionExpression match_bool_prefix(Expression... args) { } public static FunctionExpression wildcard_query(Expression... args) { - return compile(FunctionProperties.None,BuiltinFunctionName.WILDCARD_QUERY, args); + return compile(FunctionProperties.None, BuiltinFunctionName.WILDCARD_QUERY, args); + } + + public static FunctionExpression score(Expression... args) { + return compile(FunctionProperties.None, BuiltinFunctionName.SCORE, args); + } + + public static FunctionExpression scorequery(Expression... args) { + return compile(FunctionProperties.None, BuiltinFunctionName.SCOREQUERY, args); + } + + public static FunctionExpression score_query(Expression... args) { + return compile(FunctionProperties.None, BuiltinFunctionName.SCORE_QUERY, args); } public static FunctionExpression now(FunctionProperties functionProperties, diff --git a/core/src/main/java/org/opensearch/sql/expression/ReferenceExpression.java b/core/src/main/java/org/opensearch/sql/expression/ReferenceExpression.java index 94bb4e067d2..3c5b2af23cb 100644 --- a/core/src/main/java/org/opensearch/sql/expression/ReferenceExpression.java +++ b/core/src/main/java/org/opensearch/sql/expression/ReferenceExpression.java @@ -15,6 +15,7 @@ import lombok.RequiredArgsConstructor; import org.opensearch.sql.data.model.ExprTupleValue; import org.opensearch.sql.data.model.ExprValue; +import org.opensearch.sql.data.type.ExprCoreType; import org.opensearch.sql.data.type.ExprType; import org.opensearch.sql.expression.env.Environment; @@ -100,7 +101,12 @@ public ExprValue resolve(ExprTupleValue value) { } private ExprValue resolve(ExprValue value, List paths) { - final ExprValue wholePathValue = value.keyValue(String.join(PATH_SEP, paths)); + ExprValue wholePathValue = value.keyValue(String.join(PATH_SEP, paths)); + // For array types only first index currently supported. 
+ if (value.type().equals(ExprCoreType.ARRAY)) { + wholePathValue = value.collectionValue().get(0).keyValue(paths.get(0)); + } + if (!wholePathValue.isMissing() || paths.size() == 1) { return wholePathValue; } else { diff --git a/core/src/main/java/org/opensearch/sql/expression/function/BuiltinFunctionName.java b/core/src/main/java/org/opensearch/sql/expression/function/BuiltinFunctionName.java index 994ddbd559c..728712f5372 100644 --- a/core/src/main/java/org/opensearch/sql/expression/function/BuiltinFunctionName.java +++ b/core/src/main/java/org/opensearch/sql/expression/function/BuiltinFunctionName.java @@ -123,6 +123,7 @@ public enum BuiltinFunctionName { WEEK_OF_YEAR(FunctionName.of("week_of_year")), YEAR(FunctionName.of("year")), YEARWEEK(FunctionName.of("yearweek")), + // `now`-like functions NOW(FunctionName.of("now")), CURDATE(FunctionName.of("curdate")), @@ -133,6 +134,7 @@ public enum BuiltinFunctionName { CURRENT_TIMESTAMP(FunctionName.of("current_timestamp")), LOCALTIMESTAMP(FunctionName.of("localtimestamp")), SYSDATE(FunctionName.of("sysdate")), + /** * Text Functions. */ @@ -187,6 +189,8 @@ public enum BuiltinFunctionName { STDDEV_POP(FunctionName.of("stddev_pop")), // take top documents from aggregation bucket. TAKE(FunctionName.of("take")), + // Not always an aggregation query + NESTED(FunctionName.of("nested")), /** * Text Functions. @@ -256,6 +260,10 @@ public enum BuiltinFunctionName { MATCH_BOOL_PREFIX(FunctionName.of("match_bool_prefix")), HIGHLIGHT(FunctionName.of("highlight")), MATCH_PHRASE_PREFIX(FunctionName.of("match_phrase_prefix")), + SCORE(FunctionName.of("score")), + SCOREQUERY(FunctionName.of("scorequery")), + SCORE_QUERY(FunctionName.of("score_query")), + /** * Legacy Relevance Function. 
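The array handling added to `ReferenceExpression.resolve` above descends into element 0 of an array value and ignores the rest. A sketch against plain collections, with hypothetical `List`/`Map` values standing in for `ExprValue` tuples and collections:

```java
import java.util.List;
import java.util.Map;

// Illustrative sketch of the "first index only" rule for arrays: when the
// value at a path is a list of objects, resolution continues into the first
// element, matching the patch comment that only index 0 is supported so far.
public class FirstIndexResolver {
    @SuppressWarnings("unchecked")
    public static Object resolve(Object value, String key) {
        if (value instanceof List) {
            // Only the first element of an array is currently resolved.
            value = ((List<Object>) value).get(0);
        }
        return ((Map<String, Object>) value).get(key);
    }
}
```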
*/ diff --git a/core/src/main/java/org/opensearch/sql/expression/function/OpenSearchFunctions.java b/core/src/main/java/org/opensearch/sql/expression/function/OpenSearchFunctions.java index 842cf25cd63..c5fcb010f5f 100644 --- a/core/src/main/java/org/opensearch/sql/expression/function/OpenSearchFunctions.java +++ b/core/src/main/java/org/opensearch/sql/expression/function/OpenSearchFunctions.java @@ -5,11 +5,15 @@ package org.opensearch.sql.expression.function; +import static org.opensearch.sql.data.type.ExprCoreType.BOOLEAN; + import java.util.List; import java.util.stream.Collectors; +import lombok.Getter; +import lombok.Setter; import lombok.experimental.UtilityClass; +import org.apache.commons.lang3.tuple.Pair; import org.opensearch.sql.data.model.ExprValue; -import org.opensearch.sql.data.type.ExprCoreType; import org.opensearch.sql.data.type.ExprType; import org.opensearch.sql.expression.Expression; import org.opensearch.sql.expression.FunctionExpression; @@ -32,6 +36,7 @@ public void register(BuiltinFunctionRepository repository) { repository.register(simple_query_string()); repository.register(query()); repository.register(query_string()); + // Register MATCHPHRASE as MATCH_PHRASE as well for backwards // compatibility. 
repository.register(match_phrase(BuiltinFunctionName.MATCH_PHRASE)); @@ -40,6 +45,11 @@ public void register(BuiltinFunctionRepository repository) { repository.register(match_phrase_prefix()); repository.register(wildcard_query(BuiltinFunctionName.WILDCARD_QUERY)); repository.register(wildcard_query(BuiltinFunctionName.WILDCARDQUERY)); + repository.register(score(BuiltinFunctionName.SCORE)); + repository.register(score(BuiltinFunctionName.SCOREQUERY)); + repository.register(score(BuiltinFunctionName.SCORE_QUERY)); + // Functions supported in SELECT clause + repository.register(nested()); } private static FunctionResolver match_bool_prefix() { @@ -86,10 +96,49 @@ private static FunctionResolver wildcard_query(BuiltinFunctionName wildcardQuery return new RelevanceFunctionResolver(funcName); } + private static FunctionResolver nested() { + return new FunctionResolver() { + @Override + public Pair resolve( + FunctionSignature unresolvedSignature) { + return Pair.of(unresolvedSignature, + (functionProperties, arguments) -> + new FunctionExpression(BuiltinFunctionName.NESTED.getName(), arguments) { + @Override + public ExprValue valueOf(Environment valueEnv) { + return valueEnv.resolve(getArguments().get(0)); + } + + @Override + public ExprType type() { + return getArguments().get(0).type(); + } + }); + } + + @Override + public FunctionName getFunctionName() { + return BuiltinFunctionName.NESTED.getName(); + } + }; + } + + + + + private static FunctionResolver score(BuiltinFunctionName score) { + FunctionName funcName = score.getName(); + return new RelevanceFunctionResolver(funcName); + } + public static class OpenSearchFunction extends FunctionExpression { private final FunctionName functionName; private final List arguments; + @Getter + @Setter + private boolean isScoreTracked; + /** * Required argument constructor. 
* @param functionName name of the function @@ -99,6 +148,7 @@ public OpenSearchFunction(FunctionName functionName, List arguments) super(functionName, arguments); this.functionName = functionName; this.arguments = arguments; + this.isScoreTracked = false; } @Override @@ -110,7 +160,7 @@ public ExprValue valueOf(Environment valueEnv) { @Override public ExprType type() { - return ExprCoreType.BOOLEAN; + return BOOLEAN; } @Override diff --git a/core/src/main/java/org/opensearch/sql/planner/DefaultImplementor.java b/core/src/main/java/org/opensearch/sql/planner/DefaultImplementor.java index 43422d87336..607a5af983b 100644 --- a/core/src/main/java/org/opensearch/sql/planner/DefaultImplementor.java +++ b/core/src/main/java/org/opensearch/sql/planner/DefaultImplementor.java @@ -11,6 +11,7 @@ import org.opensearch.sql.planner.logical.LogicalEval; import org.opensearch.sql.planner.logical.LogicalFilter; import org.opensearch.sql.planner.logical.LogicalLimit; +import org.opensearch.sql.planner.logical.LogicalNested; import org.opensearch.sql.planner.logical.LogicalPaginate; import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.logical.LogicalPlanNodeVisitor; @@ -27,6 +28,7 @@ import org.opensearch.sql.planner.physical.EvalOperator; import org.opensearch.sql.planner.physical.FilterOperator; import org.opensearch.sql.planner.physical.LimitOperator; +import org.opensearch.sql.planner.physical.NestedOperator; import org.opensearch.sql.planner.physical.PaginateOperator; import org.opensearch.sql.planner.physical.PhysicalPlan; import org.opensearch.sql.planner.physical.ProjectOperator; @@ -96,6 +98,11 @@ public PhysicalPlan visitEval(LogicalEval node, C context) { return new EvalOperator(visitChild(node, context), node.getExpressions()); } + @Override + public PhysicalPlan visitNested(LogicalNested node, C context) { + return new NestedOperator(visitChild(node, context), node.getFields()); + } + @Override public PhysicalPlan visitSort(LogicalSort 
node, C context) { return new SortOperator(visitChild(node, context), node.getSortList()); diff --git a/core/src/main/java/org/opensearch/sql/planner/logical/LogicalNested.java b/core/src/main/java/org/opensearch/sql/planner/logical/LogicalNested.java new file mode 100644 index 00000000000..3e0e167cf31 --- /dev/null +++ b/core/src/main/java/org/opensearch/sql/planner/logical/LogicalNested.java @@ -0,0 +1,49 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.planner.logical; + +import java.util.Collections; +import java.util.List; +import java.util.Map; +import lombok.EqualsAndHashCode; +import lombok.Getter; +import lombok.ToString; +import org.opensearch.sql.expression.NamedExpression; +import org.opensearch.sql.expression.ReferenceExpression; + +/** + * Logical Nested plan. + */ +@EqualsAndHashCode(callSuper = true) +@Getter +@ToString +public class LogicalNested extends LogicalPlan { + private List> fields; + private final List projectList; + + /** + * Constructor of LogicalNested. 
+ * + */ + public LogicalNested( + LogicalPlan childPlan, + List> fields, + List projectList + ) { + super(Collections.singletonList(childPlan)); + this.fields = fields; + this.projectList = projectList; + } + + public void addFields(Map fields) { + this.fields.add(fields); + } + + @Override + public R accept(LogicalPlanNodeVisitor visitor, C context) { + return visitor.visitNested(this, context); + } +} diff --git a/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPlanDSL.java b/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPlanDSL.java index c7e1ced92fc..e95e47a013c 100644 --- a/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPlanDSL.java +++ b/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPlanDSL.java @@ -78,6 +78,14 @@ public LogicalPlan highlight(LogicalPlan input, Expression field, return new LogicalHighlight(input, field, arguments); } + + public static LogicalPlan nested( + LogicalPlan input, + List> nestedArgs, + List projectList) { + return new LogicalNested(input, nestedArgs, projectList); + } + public static LogicalPlan remove(LogicalPlan input, ReferenceExpression... 
fields) { return new LogicalRemove(input, ImmutableSet.copyOf(fields)); } diff --git a/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPlanNodeVisitor.java b/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPlanNodeVisitor.java index 28cf6bcd792..b3d63e843f7 100644 --- a/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPlanNodeVisitor.java +++ b/core/src/main/java/org/opensearch/sql/planner/logical/LogicalPlanNodeVisitor.java @@ -73,6 +73,10 @@ public R visitEval(LogicalEval plan, C context) { return visitNode(plan, context); } + public R visitNested(LogicalNested plan, C context) { + return visitNode(plan, context); + } + public R visitSort(LogicalSort plan, C context) { return visitNode(plan, context); } diff --git a/core/src/main/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizer.java b/core/src/main/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizer.java index f2cd4faf17b..afe86d0cb1c 100644 --- a/core/src/main/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizer.java +++ b/core/src/main/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizer.java @@ -58,6 +58,7 @@ public static LogicalPlanOptimizer create() { TableScanPushDown.PUSH_DOWN_SORT, TableScanPushDown.PUSH_DOWN_LIMIT, TableScanPushDown.PUSH_DOWN_HIGHLIGHT, + TableScanPushDown.PUSH_DOWN_NESTED, TableScanPushDown.PUSH_DOWN_PROJECT, new CreateTableWriteBuilder())); } diff --git a/core/src/main/java/org/opensearch/sql/planner/optimizer/pattern/Patterns.java b/core/src/main/java/org/opensearch/sql/planner/optimizer/pattern/Patterns.java index 6e548975063..0cb540743eb 100644 --- a/core/src/main/java/org/opensearch/sql/planner/optimizer/pattern/Patterns.java +++ b/core/src/main/java/org/opensearch/sql/planner/optimizer/pattern/Patterns.java @@ -16,6 +16,7 @@ import org.opensearch.sql.planner.logical.LogicalFilter; import org.opensearch.sql.planner.logical.LogicalHighlight; import org.opensearch.sql.planner.logical.LogicalLimit; 
+import org.opensearch.sql.planner.logical.LogicalNested; import org.opensearch.sql.planner.logical.LogicalPaginate; import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.logical.LogicalProject; @@ -66,6 +67,13 @@ public static Pattern highlight(Patter return Pattern.typeOf(LogicalHighlight.class).with(source(pattern)); } + /** + * Logical nested operator with a given pattern on inner field. + */ + public static Pattern nested(Pattern pattern) { + return Pattern.typeOf(LogicalNested.class).with(source(pattern)); + } + /** * Logical project operator with a given pattern on inner field. */ diff --git a/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/read/TableScanPushDown.java b/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/read/TableScanPushDown.java index 556a12bb344..de2b47d4037 100644 --- a/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/read/TableScanPushDown.java +++ b/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/read/TableScanPushDown.java @@ -9,6 +9,7 @@ import static org.opensearch.sql.planner.optimizer.pattern.Patterns.filter; import static org.opensearch.sql.planner.optimizer.pattern.Patterns.highlight; import static org.opensearch.sql.planner.optimizer.pattern.Patterns.limit; +import static org.opensearch.sql.planner.optimizer.pattern.Patterns.nested; import static org.opensearch.sql.planner.optimizer.pattern.Patterns.project; import static org.opensearch.sql.planner.optimizer.pattern.Patterns.scanBuilder; import static org.opensearch.sql.planner.optimizer.pattern.Patterns.sort; @@ -74,6 +75,11 @@ public class TableScanPushDown implements Rule { scanBuilder())) .apply((highlight, scanBuilder) -> scanBuilder.pushDownHighlight(highlight)); + public static final Rule PUSH_DOWN_NESTED = + match( + nested( + scanBuilder())) + .apply((nested, scanBuilder) -> scanBuilder.pushDownNested(nested)); /** Pattern that matches a plan node. 
*/ private final WithPattern pattern; diff --git a/core/src/main/java/org/opensearch/sql/planner/physical/NestedOperator.java b/core/src/main/java/org/opensearch/sql/planner/physical/NestedOperator.java new file mode 100644 index 00000000000..cea8ae6c141 --- /dev/null +++ b/core/src/main/java/org/opensearch/sql/planner/physical/NestedOperator.java @@ -0,0 +1,291 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.planner.physical; + +import static java.util.stream.Collectors.mapping; +import static java.util.stream.Collectors.toList; + +import java.util.ArrayList; +import java.util.Collections; +import java.util.LinkedHashMap; +import java.util.List; +import java.util.ListIterator; +import java.util.Map; +import java.util.Set; +import java.util.stream.Collectors; +import java.util.stream.Stream; +import lombok.EqualsAndHashCode; +import lombok.Getter; +import org.apache.commons.lang3.StringUtils; +import org.opensearch.sql.data.model.ExprCollectionValue; +import org.opensearch.sql.data.model.ExprNullValue; +import org.opensearch.sql.data.model.ExprTupleValue; +import org.opensearch.sql.data.model.ExprValue; +import org.opensearch.sql.expression.ReferenceExpression; + +/** + * The NestedOperator evaluates the {@link NestedOperator#fields} and + * generates {@link NestedOperator#nonNestedFields} to form the + * {@link NestedOperator#result} output. Resolving two nested fields + * with differing paths results in a cartesian product (inner join)
+ */ +@EqualsAndHashCode(callSuper = false) +public class NestedOperator extends PhysicalPlan { + @Getter + private final PhysicalPlan input; + @Getter + private final Set<String> fields; // Needs to be a Set to match legacy implementation + @Getter + private final Map<String, List<String>> groupedPathsAndFields; + @EqualsAndHashCode.Exclude + private List<Map<String, ExprValue>> result = new ArrayList<>(); + @EqualsAndHashCode.Exclude + private List<String> nonNestedFields = new ArrayList<>(); + @EqualsAndHashCode.Exclude + private ListIterator<Map<String, ExprValue>> flattenedResult = result.listIterator(); + + private long totalHits = 0; + + /** + * Constructor for NestedOperator with a list of maps as arg. + * @param input : PhysicalPlan input. + * @param fields : List of all fields and paths for nested fields. + */ + public NestedOperator(PhysicalPlan input, List<Map<String, ReferenceExpression>> fields) { + this.input = input; + this.fields = fields.stream() + .map(m -> m.get("field").toString()) + .collect(Collectors.toSet()); + this.groupedPathsAndFields = fields.stream().collect( + Collectors.groupingBy( + m -> m.get("path").toString(), + mapping( + m -> m.get("field").toString(), + toList() + ) + ) + ); + } + + /** + * Constructor for NestedOperator with a Set of fields. + * @param input : PhysicalPlan input. + * @param fields : Set of all fields for nested fields. + * @param groupedPathsAndFields : Map of fields grouped by their path.
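The list-of-maps constructor above derives `groupedPathsAndFields` with `Collectors.groupingBy`. A standalone sketch of that grouping, with plain `String`s standing in for `ReferenceExpression` (an assumption made so the example runs on its own; the field names are illustrative):

```java
import static java.util.stream.Collectors.groupingBy;
import static java.util.stream.Collectors.mapping;
import static java.util.stream.Collectors.toList;

import java.util.List;
import java.util.Map;

public class GroupingSketch {
  public static void main(String[] args) {
    // Each entry mirrors one {field, path} pair passed to the constructor.
    List<Map<String, String>> fields = List.of(
        Map.of("field", "message.info", "path", "message"),
        Map.of("field", "message.author", "path", "message"),
        Map.of("field", "comment.data", "path", "comment"));

    // Group field names by their shared nested path, as the constructor does.
    Map<String, List<String>> groupedPathsAndFields = fields.stream().collect(
        groupingBy(m -> m.get("path"), mapping(m -> m.get("field"), toList())));

    // Key order is unspecified (HashMap); each value list keeps encounter order.
    System.out.println(groupedPathsAndFields.get("message")); // [message.info, message.author]
  }
}
```

Fields sharing a path end up in one group, which is what later lets `containSamePath` merge same-path fields instead of producing a cross join.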
+ */ + public NestedOperator( + PhysicalPlan input, + Set fields, + Map> groupedPathsAndFields + ) { + this.input = input; + this.fields = fields; + this.groupedPathsAndFields = groupedPathsAndFields; + } + + @Override + public R accept(PhysicalPlanNodeVisitor visitor, C context) { + return visitor.visitNested(this, context); + } + + @Override + public List getChild() { + return Collections.singletonList(input); + } + + @Override + public boolean hasNext() { + return input.hasNext() || flattenedResult.hasNext(); + } + + @Override + public ExprValue next() { + if (!flattenedResult.hasNext()) { + result.clear(); + nonNestedFields.clear(); + + ExprValue inputValue = input.next(); + generateNonNestedFieldsMap(inputValue); + // Add all nested fields to result map + for (String field : fields) { + result = flatten(field, inputValue, result); + } + + // Add all non-nested fields to result map + for (String nonNestedField : nonNestedFields) { + result = flatten(nonNestedField, inputValue, result); + } + + if (result.isEmpty()) { + flattenedResult = result.listIterator(); + totalHits++; + return new ExprTupleValue(new LinkedHashMap<>()); + } + + flattenedResult = result.listIterator(); + } + totalHits++; + return new ExprTupleValue(new LinkedHashMap<>(flattenedResult.next())); + } + + /** + * Generate list of non-nested fields that are in inputMap, but not in the member variable + * fields list. + * @param inputMap : Row to parse non-nested fields. + */ + public void generateNonNestedFieldsMap(ExprValue inputMap) { + for (Map.Entry inputField : inputMap.tupleValue().entrySet()) { + boolean foundNestedField = + this.fields.stream().anyMatch( + field -> field.split("\\.")[0].equalsIgnoreCase(inputField.getKey()) + ); + + if (!foundNestedField) { + this.nonNestedFields.add(inputField.getKey()); + } + } + } + + + /** + * Simplifies the structure of row's source Map by flattening it, + * making the full path of an object the key + * and the Object it refers to the value. + * + *
Sample input: + * keys = ['comments.likes'] + * row = comments: { + * likes: 2 + * } + * + *
Return: + * flattenedRow = {comments.likes: 2} + * + * @param nestedField : Field to query in row. + * @param row : Row returned from OS. + * @param prevList : List of previous nested calls. + * @return : List of nested select items or cartesian product of nested calls. + */ + private List<Map<String, ExprValue>> flatten( + String nestedField, + ExprValue row, + List<Map<String, ExprValue>> prevList + ) { + List<Map<String, ExprValue>> copy = new ArrayList<>(); + List<Map<String, ExprValue>> newList = new ArrayList<>(); + + ExprValue nestedObj = null; + getNested(nestedField, nestedField, row, copy, nestedObj); + + // Only one field in select statement + if (prevList.size() == 0) { + return copy; + } + + if (containSamePath(copy.get(0))) { + var resultIt = this.result.iterator(); + Map<String, ExprValue> resultVal = resultIt.next(); + var copyIt = copy.iterator(); + Map<String, ExprValue> copyVal = copyIt.next(); + for (int i = 0; i < this.result.size(); i++) { + resultVal.putAll(copyVal); + if (copyIt.hasNext()) { + copyVal = copyIt.next(); + } + if (resultIt.hasNext()) { + resultVal = resultIt.next(); + } + } + return this.result; + } else { + // Generate cartesian product + for (Map<String, ExprValue> prevMap : prevList) { + for (Map<String, ExprValue> newMap : copy) { + newList.add(Stream.of(newMap, prevMap) + .flatMap(map -> map.entrySet().stream()) + .collect(Collectors.toMap( + Map.Entry::getKey, + Map.Entry::getValue))); + } + } + return newList; + } + } + + /** + * Check if the field in newMap shares a path with any field already in the result set. + * @param newMap : New map to add to result set. + * @return : true if there is already a field added to the result set with the same path. + */ + boolean containSamePath(Map<String, ExprValue> newMap) { + String newKey = newMap.keySet().iterator().next(); + Map<String, ExprValue> resultMap = this.result.iterator().next(); + for (var entry : this.groupedPathsAndFields.entrySet()) { + if (entry.getValue().contains(newKey)) { + for (var map : resultMap.entrySet()) { + if (entry.getValue().contains(map.getKey())) { + return true; + } + } + } + } + return false; + } + + /** + * Retrieve nested field(s) in row.
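The flatten/cartesian behavior documented above can be illustrated standalone. This sketch uses plain `Map`s in place of `ExprValue` rows and hypothetical `flatten`/`product` helpers; it is not the operator's actual code path:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class FlattenSketch {
  // Resolve a dotted path inside a nested row and emit one {full.path -> value} map.
  static Map<String, Object> flatten(String field, Map<String, Object> row) {
    Object current = row;
    for (String key : field.split("\\.")) {
      current = ((Map<?, ?>) current).get(key);
    }
    Map<String, Object> out = new LinkedHashMap<>();
    out.put(field, current);
    return out;
  }

  // Fields on differing paths combine as a cartesian product (inner join).
  static List<Map<String, Object>> product(List<Map<String, Object>> left,
                                           List<Map<String, Object>> right) {
    List<Map<String, Object>> out = new ArrayList<>();
    for (Map<String, Object> l : left) {
      for (Map<String, Object> r : right) {
        Map<String, Object> merged = new LinkedHashMap<>(l);
        merged.putAll(r);
        out.add(merged);
      }
    }
    return out;
  }

  public static void main(String[] args) {
    // Mirrors the Javadoc sample: comments: { likes: 2 } -> {comments.likes=2}
    Map<String, Object> row = Map.of("comments", Map.of("likes", 2));
    System.out.println(flatten("comments.likes", row)); // {comments.likes=2}
  }
}
```

Two flattened fields of sizes m and n on differing paths therefore yield m×n output rows, matching the "cartesian product (inner join)" note in the class Javadoc.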
+ * + * @param field : Path for nested field. + * @param nestedField : Current level to nested field path. + * @param row : Row to resolve nested field. + * @param ret : List to add nested field to. + * @param nestedObj : Object at current nested level. + * @return : Object at current nested level. + */ + private void getNested( + String field, String nestedField, ExprValue row, + List> ret, ExprValue nestedObj + ) { + ExprValue currentObj = (nestedObj == null) ? row : nestedObj; + String[] splitKeys = nestedField.split("\\."); + + if (currentObj instanceof ExprTupleValue) { + ExprTupleValue currentMap = (ExprTupleValue) currentObj; + if (currentMap.tupleValue().containsKey(splitKeys[0])) { + currentObj = currentMap.tupleValue().get(splitKeys[0]); + } else { + currentObj = null; + ret.add(new LinkedHashMap<>(Map.of(field, ExprNullValue.of()))); + } + } else if (currentObj instanceof ExprCollectionValue) { + ExprValue arrayObj = currentObj; + for (int x = 0; x < arrayObj.collectionValue().size(); x++) { + currentObj = arrayObj.collectionValue().get(x); + getNested(field, nestedField, row, ret, currentObj); + currentObj = null; + } + } else { + currentObj = null; + } + + // Return final nested result + if (currentObj != null + && (StringUtils.substringAfterLast(field, ".").equals(nestedField) + || !field.contains(".")) + ) { + ret.add(new LinkedHashMap<>(Map.of(field, currentObj))); + } else if (currentObj != null) { + getNested(field, nestedField.substring(nestedField.indexOf(".") + 1), + row, ret, currentObj); + } + } + + @Override + public long getTotalHits() { + return totalHits; + } +} diff --git a/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlanDSL.java b/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlanDSL.java index e6e59990c82..8c10c91fb69 100644 --- a/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlanDSL.java +++ b/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlanDSL.java @@ -11,6 +11,7 
@@ import java.util.Arrays; import java.util.List; import java.util.Map; +import java.util.Set; import lombok.experimental.UtilityClass; import org.apache.commons.lang3.tuple.Pair; import org.opensearch.sql.ast.tree.RareTopN.CommandType; @@ -105,4 +106,11 @@ public ValuesOperator values(List... values) { public static LimitOperator limit(PhysicalPlan input, Integer limit, Integer offset) { return new LimitOperator(input, limit, offset); } + + public static NestedOperator nested( + PhysicalPlan input, + Set args, + Map> groupedFieldsByPath) { + return new NestedOperator(input, args, groupedFieldsByPath); + } } diff --git a/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitor.java b/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitor.java index f8b6f2243e0..bc4c0404c46 100644 --- a/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitor.java +++ b/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitor.java @@ -57,6 +57,10 @@ public R visitEval(EvalOperator node, C context) { return visitNode(node, context); } + public R visitNested(NestedOperator node, C context) { + return visitNode(node, context); + } + public R visitDedupe(DedupeOperator node, C context) { return visitNode(node, context); } diff --git a/core/src/main/java/org/opensearch/sql/planner/physical/datasource/DataSourceTableScan.java b/core/src/main/java/org/opensearch/sql/planner/physical/datasource/DataSourceTableScan.java index 60969f4d547..93e65054b5a 100644 --- a/core/src/main/java/org/opensearch/sql/planner/physical/datasource/DataSourceTableScan.java +++ b/core/src/main/java/org/opensearch/sql/planner/physical/datasource/DataSourceTableScan.java @@ -48,7 +48,7 @@ public String explain() { public void open() { List exprValues = new ArrayList<>(); Set dataSourceMetadataSet - = dataSourceService.getDataSourceMetadataSet(); + = dataSourceService.getDataSourceMetadata(true); for (DataSourceMetadata 
dataSourceMetadata : dataSourceMetadataSet) { exprValues.add( new ExprTupleValue(new LinkedHashMap<>(ImmutableMap.of( diff --git a/core/src/main/java/org/opensearch/sql/storage/DataSourceFactory.java b/core/src/main/java/org/opensearch/sql/storage/DataSourceFactory.java index 20d263e601f..8512eddbe33 100644 --- a/core/src/main/java/org/opensearch/sql/storage/DataSourceFactory.java +++ b/core/src/main/java/org/opensearch/sql/storage/DataSourceFactory.java @@ -28,4 +28,5 @@ public interface DataSourceFactory { * Create {@link DataSource}. */ DataSource createDataSource(DataSourceMetadata metadata); + } diff --git a/core/src/main/java/org/opensearch/sql/storage/Table.java b/core/src/main/java/org/opensearch/sql/storage/Table.java index a7f2b606ca9..0194f1d03e7 100644 --- a/core/src/main/java/org/opensearch/sql/storage/Table.java +++ b/core/src/main/java/org/opensearch/sql/storage/Table.java @@ -43,6 +43,13 @@ default void create(Map schema) { */ Map getFieldTypes(); + /** + * Get the {@link ExprType} for each meta-field (reserved fields) in the table. + */ + default Map getReservedFieldTypes() { + return Map.of(); + } + /** * Implement a {@link LogicalPlan} by {@link PhysicalPlan} in storage engine. 
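The `getReservedFieldTypes` default added to `Table` above lets a table expose reserved meta-fields without forcing every implementation to change. A sketch of the opt-in pattern with hypothetical simplified types (the real method returns `Map<String, ExprType>`, and the `"_test"` field mirrors the override in `AnalyzerTestBase`):

```java
import java.util.Map;

// Hypothetical simplified Table: engines override the default to expose meta-fields.
interface TableSketch {
  default Map<String, String> getReservedFieldTypes() {
    return Map.of(); // no reserved meta-fields unless an implementation opts in
  }
}

public class ReservedFieldsDemo implements TableSketch {
  @Override
  public Map<String, String> getReservedFieldTypes() {
    // Mirrors the test table in AnalyzerTestBase, which reserves "_test" as STRING.
    return Map.of("_test", "STRING");
  }

  public static void main(String[] args) {
    System.out.println(new ReservedFieldsDemo().getReservedFieldTypes()); // {_test=STRING}
  }
}
```

The analyzer can then resolve qualified names like `_test` through this map ahead of the ordinary symbol table, as exercised by the `qualified_name_with_reserved_symbol` test later in this patch.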
* diff --git a/core/src/main/java/org/opensearch/sql/storage/read/TableScanBuilder.java b/core/src/main/java/org/opensearch/sql/storage/read/TableScanBuilder.java index c0fdf36e709..9af66e219fa 100644 --- a/core/src/main/java/org/opensearch/sql/storage/read/TableScanBuilder.java +++ b/core/src/main/java/org/opensearch/sql/storage/read/TableScanBuilder.java @@ -10,6 +10,7 @@ import org.opensearch.sql.planner.logical.LogicalFilter; import org.opensearch.sql.planner.logical.LogicalHighlight; import org.opensearch.sql.planner.logical.LogicalLimit; +import org.opensearch.sql.planner.logical.LogicalNested; import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.logical.LogicalPlanNodeVisitor; import org.opensearch.sql.planner.logical.LogicalProject; @@ -104,6 +105,17 @@ public boolean pushDownHighlight(LogicalHighlight highlight) { return false; } + /** + * Can a given nested operator be pushed down to the table scan builder. Assume no such support + * by default unless a subclass overrides this.
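The `PUSH_DOWN_NESTED` rule earlier in this patch fires when a nested operator sits directly on a scan builder; the builder either absorbs the operator (returns true) or leaves it in the plan. A standalone sketch of that contract with hypothetical simplified types, not the real `TableScanBuilder`:

```java
// Hypothetical simplified types showing the default-deny push-down contract.
class NestedOp {
}

class ScanBuilderSketch {
  // Default mirrors TableScanBuilder: no support unless a subclass overrides.
  public boolean pushDownNested(NestedOp nested) {
    return false;
  }
}

class OpenSearchishBuilder extends ScanBuilderSketch {
  @Override
  public boolean pushDownNested(NestedOp nested) {
    // A real engine would fold the nested fields into its scan request here.
    return true;
  }
}

public class PushDownDemo {
  public static void main(String[] args) {
    NestedOp nested = new NestedOp();
    ScanBuilderSketch builder = new OpenSearchishBuilder();
    // Optimizer-rule shape: on success the nested node is dropped and the
    // builder alone remains as the child plan; on failure the pair is kept.
    Object plan = builder.pushDownNested(nested) ? builder : nested;
    System.out.println(plan == builder); // prints "true"
  }
}
```

Returning false keeps the `NestedOperator` in the physical plan, so correctness never depends on a storage engine supporting the push-down.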
+ * + * @param nested logical nested operator + * @return true if pushed down, otherwise false + */ + public boolean pushDownNested(LogicalNested nested) { + return false; + } + @Override public R accept(LogicalPlanNodeVisitor visitor, C context) { return visitor.visitTableScanBuilder(this, context); diff --git a/core/src/test/java/org/opensearch/sql/analysis/AnalyzerTest.java b/core/src/test/java/org/opensearch/sql/analysis/AnalyzerTest.java index 01e2091da93..b597bbcd829 100644 --- a/core/src/test/java/org/opensearch/sql/analysis/AnalyzerTest.java +++ b/core/src/test/java/org/opensearch/sql/analysis/AnalyzerTest.java @@ -31,6 +31,7 @@ import static org.opensearch.sql.ast.tree.Sort.SortOption.DEFAULT_ASC; import static org.opensearch.sql.ast.tree.Sort.SortOrder; import static org.opensearch.sql.data.model.ExprValueUtils.integerValue; +import static org.opensearch.sql.data.model.ExprValueUtils.stringValue; import static org.opensearch.sql.data.type.ExprCoreType.BOOLEAN; import static org.opensearch.sql.data.type.ExprCoreType.DOUBLE; import static org.opensearch.sql.data.type.ExprCoreType.INTEGER; @@ -71,18 +72,24 @@ import org.opensearch.sql.ast.expression.HighlightFunction; import org.opensearch.sql.ast.expression.Literal; import org.opensearch.sql.ast.expression.ParseMethod; +import org.opensearch.sql.ast.expression.ScoreFunction; import org.opensearch.sql.ast.expression.SpanUnit; import org.opensearch.sql.ast.tree.AD; import org.opensearch.sql.ast.tree.Kmeans; import org.opensearch.sql.ast.tree.ML; import org.opensearch.sql.ast.tree.Paginate; import org.opensearch.sql.ast.tree.RareTopN.CommandType; +import org.opensearch.sql.ast.tree.UnresolvedPlan; import org.opensearch.sql.exception.ExpressionEvaluationException; import org.opensearch.sql.exception.SemanticCheckException; import org.opensearch.sql.expression.DSL; import org.opensearch.sql.expression.HighlightExpression; +import org.opensearch.sql.expression.NamedExpression; +import 
org.opensearch.sql.expression.ReferenceExpression; +import org.opensearch.sql.expression.function.OpenSearchFunctions; import org.opensearch.sql.expression.window.WindowDefinition; import org.opensearch.sql.planner.logical.LogicalAD; +import org.opensearch.sql.planner.logical.LogicalFilter; import org.opensearch.sql.planner.logical.LogicalMLCommons; import org.opensearch.sql.planner.logical.LogicalPaginate; import org.opensearch.sql.planner.logical.LogicalPlan; @@ -104,6 +111,54 @@ public void filter_relation() { AstDSL.equalTo(AstDSL.field("integer_value"), AstDSL.intLiteral(1)))); } + @Test + public void filter_relation_with_reserved_qualifiedName() { + assertAnalyzeEqual( + LogicalPlanDSL.filter( + LogicalPlanDSL.relation("schema", table), + DSL.equal(DSL.ref("_test", STRING), DSL.literal(stringValue("value")))), + AstDSL.filter( + AstDSL.relation("schema"), + AstDSL.equalTo(AstDSL.qualifiedName("_test"), AstDSL.stringLiteral("value")))); + } + + @Test + public void filter_relation_with_invalid_qualifiedName_SemanticCheckException() { + UnresolvedPlan invalidFieldPlan = AstDSL.filter( + AstDSL.relation("schema"), + AstDSL.equalTo( + AstDSL.qualifiedName("_invalid"), + AstDSL.stringLiteral("value")) + ); + + SemanticCheckException exception = + assertThrows( + SemanticCheckException.class, + () -> analyze(invalidFieldPlan)); + assertEquals( + "can't resolve Symbol(namespace=FIELD_NAME, name=_invalid) in type env", + exception.getMessage()); + } + + @Test + public void filter_relation_with_invalid_qualifiedName_ExpressionEvaluationException() { + UnresolvedPlan typeMismatchPlan = AstDSL.filter( + AstDSL.relation("schema"), + AstDSL.equalTo(AstDSL.qualifiedName("_test"), AstDSL.intLiteral(1)) + ); + + ExpressionEvaluationException exception = + assertThrows( + ExpressionEvaluationException.class, + () -> analyze(typeMismatchPlan)); + assertEquals( + "= function expected {[BYTE,BYTE],[SHORT,SHORT],[INTEGER,INTEGER],[LONG,LONG]," + + 
"[FLOAT,FLOAT],[DOUBLE,DOUBLE],[STRING,STRING],[BOOLEAN,BOOLEAN],[DATE,DATE]," + + "[TIME,TIME],[DATETIME,DATETIME],[TIMESTAMP,TIMESTAMP],[INTERVAL,INTERVAL]," + + "[STRUCT,STRUCT],[ARRAY,ARRAY]}, but get [STRING,INTEGER]", + exception.getMessage()); + } + @Test public void filter_relation_with_alias() { assertAnalyzeEqual( @@ -216,6 +271,116 @@ public void filter_relation_with_multiple_tables() { AstDSL.equalTo(AstDSL.field("integer_value"), AstDSL.intLiteral(1)))); } + @Test + public void analyze_filter_visit_score_function() { + UnresolvedPlan unresolvedPlan = AstDSL.filter( + AstDSL.relation("schema"), + new ScoreFunction( + AstDSL.function("match_phrase_prefix", + AstDSL.unresolvedArg("field", stringLiteral("field_value1")), + AstDSL.unresolvedArg("query", stringLiteral("search query")), + AstDSL.unresolvedArg("boost", stringLiteral("3")) + ), AstDSL.doubleLiteral(1.0)) + ); + assertAnalyzeEqual( + LogicalPlanDSL.filter( + LogicalPlanDSL.relation("schema", table), + DSL.match_phrase_prefix( + DSL.namedArgument("field", "field_value1"), + DSL.namedArgument("query", "search query"), + DSL.namedArgument("boost", "3.0") + ) + ), + unresolvedPlan + ); + + LogicalPlan logicalPlan = analyze(unresolvedPlan); + OpenSearchFunctions.OpenSearchFunction relevanceQuery = + (OpenSearchFunctions.OpenSearchFunction)((LogicalFilter) logicalPlan).getCondition(); + assertEquals(true, relevanceQuery.isScoreTracked()); + } + + @Test + public void analyze_filter_visit_without_score_function() { + UnresolvedPlan unresolvedPlan = AstDSL.filter( + AstDSL.relation("schema"), + AstDSL.function("match_phrase_prefix", + AstDSL.unresolvedArg("field", stringLiteral("field_value1")), + AstDSL.unresolvedArg("query", stringLiteral("search query")), + AstDSL.unresolvedArg("boost", stringLiteral("3")) + ) + ); + assertAnalyzeEqual( + LogicalPlanDSL.filter( + LogicalPlanDSL.relation("schema", table), + DSL.match_phrase_prefix( + DSL.namedArgument("field", "field_value1"), + 
DSL.namedArgument("query", "search query"), + DSL.namedArgument("boost", "3") + ) + ), + unresolvedPlan + ); + + LogicalPlan logicalPlan = analyze(unresolvedPlan); + OpenSearchFunctions.OpenSearchFunction relevanceQuery = + (OpenSearchFunctions.OpenSearchFunction)((LogicalFilter) logicalPlan).getCondition(); + assertEquals(false, relevanceQuery.isScoreTracked()); + } + + @Test + public void analyze_filter_visit_score_function_with_double_boost() { + UnresolvedPlan unresolvedPlan = AstDSL.filter( + AstDSL.relation("schema"), + new ScoreFunction( + AstDSL.function("match_phrase_prefix", + AstDSL.unresolvedArg("field", stringLiteral("field_value1")), + AstDSL.unresolvedArg("query", stringLiteral("search query")), + AstDSL.unresolvedArg("slop", stringLiteral("3")) + ), new Literal(3.0, DataType.DOUBLE) + ) + ); + + assertAnalyzeEqual( + LogicalPlanDSL.filter( + LogicalPlanDSL.relation("schema", table), + DSL.match_phrase_prefix( + DSL.namedArgument("field", "field_value1"), + DSL.namedArgument("query", "search query"), + DSL.namedArgument("slop", "3"), + DSL.namedArgument("boost", "3.0") + ) + ), + unresolvedPlan + ); + + LogicalPlan logicalPlan = analyze(unresolvedPlan); + OpenSearchFunctions.OpenSearchFunction relevanceQuery = + (OpenSearchFunctions.OpenSearchFunction)((LogicalFilter) logicalPlan).getCondition(); + assertEquals(true, relevanceQuery.isScoreTracked()); + } + + @Test + public void analyze_filter_visit_score_function_with_unsupported_boost_SemanticCheckException() { + UnresolvedPlan unresolvedPlan = AstDSL.filter( + AstDSL.relation("schema"), + new ScoreFunction( + AstDSL.function("match_phrase_prefix", + AstDSL.unresolvedArg("field", stringLiteral("field_value1")), + AstDSL.unresolvedArg("query", stringLiteral("search query")), + AstDSL.unresolvedArg("boost", stringLiteral("3")) + ), AstDSL.stringLiteral("3.0") + ) + ); + SemanticCheckException exception = + assertThrows( + SemanticCheckException.class, + () -> analyze(unresolvedPlan)); + assertEquals( 
+ "Expected boost type 'DOUBLE' but got 'STRING'", + exception.getMessage()); + } + @Test public void head_relation() { assertAnalyzeEqual( @@ -369,6 +534,236 @@ public void project_source() { AstDSL.alias("double_value", AstDSL.field("double_value")))); } + @Test + public void project_nested_field_arg() { + List> nestedArgs = + List.of( + Map.of( + "field", new ReferenceExpression("message.info", STRING), + "path", new ReferenceExpression("message", STRING) + ) + ); + + List projectList = + List.of( + new NamedExpression( + "message.info", + DSL.nested(DSL.ref("message.info", STRING)), + null) + ); + + assertAnalyzeEqual( + LogicalPlanDSL.project( + LogicalPlanDSL.nested( + LogicalPlanDSL.relation("schema", table), + nestedArgs, + projectList), + DSL.named("message.info", + DSL.nested(DSL.ref("message.info", STRING))) + ), + AstDSL.projectWithArg( + AstDSL.relation("schema"), + AstDSL.defaultFieldsArgs(), + AstDSL.alias("message.info", + function("nested", qualifiedName("message", "info")), null) + ) + ); + } + + @Test + public void project_nested_field_and_path_args() { + List> nestedArgs = + List.of( + Map.of( + "field", new ReferenceExpression("message.info", STRING), + "path", new ReferenceExpression("message", STRING) + ) + ); + + List projectList = + List.of( + new NamedExpression( + "message.info", + DSL.nested(DSL.ref("message.info", STRING), DSL.ref("message", STRING)), + null) + ); + + assertAnalyzeEqual( + LogicalPlanDSL.project( + LogicalPlanDSL.nested( + LogicalPlanDSL.relation("schema", table), + nestedArgs, + projectList), + DSL.named("message.info", + DSL.nested(DSL.ref("message.info", STRING), DSL.ref("message", STRING))) + ), + AstDSL.projectWithArg( + AstDSL.relation("schema"), + AstDSL.defaultFieldsArgs(), + AstDSL.alias("message.info", + function( + "nested", + qualifiedName("message", "info"), + qualifiedName("message") + ), + null + ) + ) + ); + } + + @Test + public void project_nested_deep_field_arg() { + List> nestedArgs = + List.of( + 
Map.of( + "field", new ReferenceExpression("message.info.id", STRING), + "path", new ReferenceExpression("message.info", STRING) + ) + ); + + List projectList = + List.of( + new NamedExpression( + "message.info.id", + DSL.nested(DSL.ref("message.info.id", STRING)), + null) + ); + + assertAnalyzeEqual( + LogicalPlanDSL.project( + LogicalPlanDSL.nested( + LogicalPlanDSL.relation("schema", table), + nestedArgs, + projectList), + DSL.named("message.info.id", + DSL.nested(DSL.ref("message.info.id", STRING))) + ), + AstDSL.projectWithArg( + AstDSL.relation("schema"), + AstDSL.defaultFieldsArgs(), + AstDSL.alias("message.info.id", + function("nested", qualifiedName("message", "info", "id")), null) + ) + ); + } + + @Test + public void project_multiple_nested() { + List> nestedArgs = + List.of( + Map.of( + "field", new ReferenceExpression("message.info", STRING), + "path", new ReferenceExpression("message", STRING) + ), + Map.of( + "field", new ReferenceExpression("comment.data", STRING), + "path", new ReferenceExpression("comment", STRING) + ) + ); + + List projectList = + List.of( + new NamedExpression( + "message.info", + DSL.nested(DSL.ref("message.info", STRING)), + null), + new NamedExpression( + "comment.data", + DSL.nested(DSL.ref("comment.data", STRING)), + null) + ); + + assertAnalyzeEqual( + LogicalPlanDSL.project( + LogicalPlanDSL.nested( + LogicalPlanDSL.relation("schema", table), + nestedArgs, + projectList), + DSL.named("message.info", + DSL.nested(DSL.ref("message.info", STRING))), + DSL.named("comment.data", + DSL.nested(DSL.ref("comment.data", STRING))) + ), + AstDSL.projectWithArg( + AstDSL.relation("schema"), + AstDSL.defaultFieldsArgs(), + AstDSL.alias("message.info", + function("nested", qualifiedName("message", "info")), null), + AstDSL.alias("comment.data", + function("nested", qualifiedName("comment", "data")), null) + ) + ); + } + + @Test + public void project_nested_invalid_field_throws_exception() { + var exception = assertThrows( + 
IllegalArgumentException.class, + () -> analyze(AstDSL.projectWithArg( + AstDSL.relation("schema"), + AstDSL.defaultFieldsArgs(), + AstDSL.alias("message", + function("nested", qualifiedName("message")), null) + ) + ) + ); + assertEquals(exception.getMessage(), "Illegal nested field name: message"); + } + + @Test + public void project_nested_invalid_arg_type_throws_exception() { + var exception = assertThrows( + IllegalArgumentException.class, + () -> analyze(AstDSL.projectWithArg( + AstDSL.relation("schema"), + AstDSL.defaultFieldsArgs(), + AstDSL.alias("message", + function("nested", stringLiteral("message")), null) + ) + ) + ); + assertEquals(exception.getMessage(), "Illegal nested field name: message"); + } + + @Test + public void project_nested_no_args_throws_exception() { + var exception = assertThrows( + IllegalArgumentException.class, + () -> analyze(AstDSL.projectWithArg( + AstDSL.relation("schema"), + AstDSL.defaultFieldsArgs(), + AstDSL.alias("message", + function("nested"), null) + ) + ) + ); + assertEquals(exception.getMessage(), + "on nested object only allowed 2 parameters (field,path) or 1 parameter (field)" + ); + } + + @Test + public void project_nested_too_many_args_throws_exception() { + var exception = assertThrows( + IllegalArgumentException.class, + () -> analyze(AstDSL.projectWithArg( + AstDSL.relation("schema"), + AstDSL.defaultFieldsArgs(), + AstDSL.alias("message", + function("nested", + stringLiteral("message.info"), + stringLiteral("message"), + stringLiteral("message")), + null) + ) + ) + ); + assertEquals(exception.getMessage(), + "on nested object only allowed 2 parameters (field,path) or 1 parameter (field)" + ); + } + @Test public void project_highlight() { Map args = new HashMap<>(); diff --git a/core/src/test/java/org/opensearch/sql/analysis/AnalyzerTestBase.java b/core/src/test/java/org/opensearch/sql/analysis/AnalyzerTestBase.java index a040e2a53fe..d7222d466f3 100644 --- 
--- a/core/src/test/java/org/opensearch/sql/analysis/AnalyzerTestBase.java
+++ b/core/src/test/java/org/opensearch/sql/analysis/AnalyzerTestBase.java
@@ -112,6 +112,10 @@ public Map getFieldTypes() {
      public PhysicalPlan implement(LogicalPlan plan) {
        throw new UnsupportedOperationException();
      }
+
+      public Map getReservedFieldTypes() {
+        return ImmutableMap.of("_test", STRING);
+      }
    });
  }
@@ -185,13 +189,18 @@ private class DefaultDataSourceService implements DataSourceService {
    @Override
-    public Set getDataSourceMetadataSet() {
+    public Set getDataSourceMetadata(boolean isDefaultDataSourceRequired) {
      return Stream.of(opensearchDataSource, prometheusDataSource)
          .map(ds -> new DataSourceMetadata(ds.getName(), ds.getConnectorType(),
              Collections.emptyList(), ImmutableMap.of())).collect(Collectors.toSet());
    }

+    @Override
+    public DataSourceMetadata getDataSourceMetadata(String name) {
+      return null;
+    }
+
    @Override
    public void createDataSource(DataSourceMetadata metadata) {
      throw new UnsupportedOperationException("unsupported operation");
diff --git a/core/src/test/java/org/opensearch/sql/analysis/ExpressionAnalyzerTest.java b/core/src/test/java/org/opensearch/sql/analysis/ExpressionAnalyzerTest.java
index c7a11658e38..5a05c79132e 100644
--- a/core/src/test/java/org/opensearch/sql/analysis/ExpressionAnalyzerTest.java
+++ b/core/src/test/java/org/opensearch/sql/analysis/ExpressionAnalyzerTest.java
@@ -20,7 +20,9 @@
 import static org.opensearch.sql.data.model.ExprValueUtils.LITERAL_TRUE;
 import static org.opensearch.sql.data.model.ExprValueUtils.integerValue;
 import static org.opensearch.sql.data.type.ExprCoreType.BOOLEAN;
+import static org.opensearch.sql.data.type.ExprCoreType.FLOAT;
 import static org.opensearch.sql.data.type.ExprCoreType.INTEGER;
+import static org.opensearch.sql.data.type.ExprCoreType.LONG;
 import static org.opensearch.sql.data.type.ExprCoreType.STRING;
 import static org.opensearch.sql.data.type.ExprCoreType.STRUCT;
 import static org.opensearch.sql.expression.DSL.ref;
@@ -228,6 +230,32 @@ public void qualified_name_with_qualifier() {
    analysisContext.pop();
  }

+  @Test
+  public void qualified_name_with_reserved_symbol() {
+    analysisContext.push();
+
+    analysisContext.peek().addReservedWord(new Symbol(Namespace.FIELD_NAME, "_reserved"), STRING);
+    analysisContext.peek().addReservedWord(new Symbol(Namespace.FIELD_NAME, "_priority"), FLOAT);
+    analysisContext.peek().define(new Symbol(Namespace.INDEX_NAME, "index_alias"), STRUCT);
+    assertAnalyzeEqual(
+        DSL.ref("_priority", FLOAT),
+        qualifiedName("_priority")
+    );
+    assertAnalyzeEqual(
+        DSL.ref("_reserved", STRING),
+        qualifiedName("index_alias", "_reserved")
+    );
+
+    // reserved fields take priority over symbol table
+    analysisContext.peek().define(new Symbol(Namespace.FIELD_NAME, "_reserved"), LONG);
+    assertAnalyzeEqual(
+        DSL.ref("_reserved", STRING),
+        qualifiedName("index_alias", "_reserved")
+    );
+
+    analysisContext.pop();
+  }
+
  @Test
  public void interval() {
    assertAnalyzeEqual(
@@ -259,17 +287,6 @@ public void case_clause() {
        AstDSL.stringLiteral("test"))));
  }

-  @Test
-  public void skip_array_data_type() {
-    SyntaxCheckException exception =
-        assertThrows(SyntaxCheckException.class,
-            () -> analyze(qualifiedName("array_value")));
-    assertEquals(
-        "Identifier [array_value] of type [ARRAY] is not supported yet",
-        exception.getMessage()
-    );
-  }
-
  @Test
  public void undefined_var_semantic_check_failed() {
    SemanticCheckException exception = assertThrows(SemanticCheckException.class,
@@ -600,6 +617,142 @@ public void match_phrase_prefix_all_params() {
    );
  }

+  @Test
+  void score_function_expression() {
+    assertAnalyzeEqual(
+        DSL.score(
+            DSL.namedArgument("RelevanceQuery",
+                DSL.match_phrase_prefix(
+                    DSL.namedArgument("field", "field_value1"),
+                    DSL.namedArgument("query", "search query"),
+                    DSL.namedArgument("slop", "3")
+                )
+            )),
+        AstDSL.function("score",
+            unresolvedArg("RelevanceQuery",
+                AstDSL.function("match_phrase_prefix",
+                    unresolvedArg("field", stringLiteral("field_value1")),
+                    unresolvedArg("query", stringLiteral("search query")),
+                    unresolvedArg("slop", stringLiteral("3"))
+                )
+            )
+        )
+    );
+  }
+
+  @Test
+  void score_function_with_boost() {
+    assertAnalyzeEqual(
+        DSL.score(
+            DSL.namedArgument("RelevanceQuery",
+                DSL.match_phrase_prefix(
+                    DSL.namedArgument("field", "field_value1"),
+                    DSL.namedArgument("query", "search query"),
+                    DSL.namedArgument("boost", "3.0")
+                )),
+            DSL.namedArgument("boost", "2")
+        ),
+        AstDSL.function("score",
+            unresolvedArg("RelevanceQuery",
+                AstDSL.function("match_phrase_prefix",
+                    unresolvedArg("field", stringLiteral("field_value1")),
+                    unresolvedArg("query", stringLiteral("search query")),
+                    unresolvedArg("boost", stringLiteral("3.0"))
+                )
+            ),
+            unresolvedArg("boost", stringLiteral("2"))
+        )
+    );
+  }
+
+  @Test
+  void score_query_function_expression() {
+    assertAnalyzeEqual(
+        DSL.score_query(
+            DSL.namedArgument("RelevanceQuery",
+                DSL.wildcard_query(
+                    DSL.namedArgument("field", "field_value1"),
+                    DSL.namedArgument("query", "search query")
+                )
+            )),
+        AstDSL.function("score_query",
+            unresolvedArg("RelevanceQuery",
+                AstDSL.function("wildcard_query",
+                    unresolvedArg("field", stringLiteral("field_value1")),
+                    unresolvedArg("query", stringLiteral("search query"))
+                )
+            )
+        )
+    );
+  }
+
+  @Test
+  void score_query_function_with_boost() {
+    assertAnalyzeEqual(
+        DSL.score_query(
+            DSL.namedArgument("RelevanceQuery",
+                DSL.wildcard_query(
+                    DSL.namedArgument("field", "field_value1"),
+                    DSL.namedArgument("query", "search query")
+                )
+            ),
+            DSL.namedArgument("boost", "2.0")
+        ),
+        AstDSL.function("score_query",
+            unresolvedArg("RelevanceQuery",
+                AstDSL.function("wildcard_query",
+                    unresolvedArg("field", stringLiteral("field_value1")),
+                    unresolvedArg("query", stringLiteral("search query"))
+                )
+            ),
+            unresolvedArg("boost", stringLiteral("2.0"))
+        )
+    );
+  }
+
+  @Test
+  void scorequery_function_expression() {
+    assertAnalyzeEqual(
+        DSL.scorequery(
+            DSL.namedArgument("RelevanceQuery",
+                DSL.simple_query_string(
+                    DSL.namedArgument("field", "field_value1"),
+                    DSL.namedArgument("query", "search query"),
+                    DSL.namedArgument("slop", "3")
+                )
+            )),
+        AstDSL.function("scorequery",
+            unresolvedArg("RelevanceQuery",
+                AstDSL.function("simple_query_string",
+                    unresolvedArg("field", stringLiteral("field_value1")),
+                    unresolvedArg("query", stringLiteral("search query")),
+                    unresolvedArg("slop", stringLiteral("3"))
+                )
+            )
+        )
+    );
+  }
+
+  @Test
+  void scorequery_function_with_boost() {
+    assertAnalyzeEqual(
+        DSL.scorequery(
+            DSL.namedArgument("RelevanceQuery",
+                DSL.simple_query_string(
+                    DSL.namedArgument("field", "field_value1"),
+                    DSL.namedArgument("query", "search query"),
+                    DSL.namedArgument("slop", "3")
+                )),
+            DSL.namedArgument("boost", "2.0")
+        ),
+        AstDSL.function("scorequery",
+            unresolvedArg("RelevanceQuery",
+                AstDSL.function("simple_query_string",
+                    unresolvedArg("field", stringLiteral("field_value1")),
+                    unresolvedArg("query", stringLiteral("search query")),
+                    unresolvedArg("slop", stringLiteral("3"))
+                )
+            ),
+            unresolvedArg("boost", stringLiteral("2.0"))
+        )
+    );
+  }
+
  @Test
  public void function_isnt_calculated_on_analyze() {
    assertTrue(analyze(function("now")) instanceof FunctionExpression);
diff --git a/core/src/test/java/org/opensearch/sql/config/TestConfig.java b/core/src/test/java/org/opensearch/sql/config/TestConfig.java
index 74dde6c2e91..6179f020c29 100644
--- a/core/src/test/java/org/opensearch/sql/config/TestConfig.java
+++ b/core/src/test/java/org/opensearch/sql/config/TestConfig.java
@@ -56,6 +56,11 @@ public class TestConfig {
      .put("timestamp_value", ExprCoreType.TIMESTAMP)
      .put("field_value1", ExprCoreType.STRING)
      .put("field_value2", ExprCoreType.STRING)
+      .put("message", ExprCoreType.STRING)
+      .put("message.info", ExprCoreType.STRING)
+      .put("message.info.id", ExprCoreType.STRING)
+      .put("comment", ExprCoreType.STRING)
+      .put("comment.data", ExprCoreType.STRING)
      .build();

  protected StorageEngine storageEngine() {
diff --git a/core/src/test/java/org/opensearch/sql/executor/ExplainTest.java b/core/src/test/java/org/opensearch/sql/executor/ExplainTest.java
index c2763e71201..7d438c870d6 100644
--- a/core/src/test/java/org/opensearch/sql/executor/ExplainTest.java
+++ b/core/src/test/java/org/opensearch/sql/executor/ExplainTest.java
@@ -22,6 +22,7 @@
 import static org.opensearch.sql.planner.physical.PhysicalPlanDSL.eval;
 import static org.opensearch.sql.planner.physical.PhysicalPlanDSL.filter;
 import static org.opensearch.sql.planner.physical.PhysicalPlanDSL.limit;
+import static org.opensearch.sql.planner.physical.PhysicalPlanDSL.nested;
 import static org.opensearch.sql.planner.physical.PhysicalPlanDSL.project;
 import static org.opensearch.sql.planner.physical.PhysicalPlanDSL.rareTopN;
 import static org.opensearch.sql.planner.physical.PhysicalPlanDSL.remove;
@@ -30,10 +31,9 @@
 import static org.opensearch.sql.planner.physical.PhysicalPlanDSL.values;
 import static org.opensearch.sql.planner.physical.PhysicalPlanDSL.window;

-import com.google.common.collect.ImmutableList;
-import com.google.common.collect.ImmutableMap;
 import java.util.List;
 import java.util.Map;
+import java.util.Set;
 import org.apache.commons.lang3.tuple.ImmutablePair;
 import org.apache.commons.lang3.tuple.Pair;
 import org.junit.jupiter.api.DisplayNameGeneration;
@@ -83,20 +83,20 @@ void can_explain_project_filter_table_scan() {
        new ExplainResponse(
            new ExplainResponseNode(
                "ProjectOperator",
-                ImmutableMap.of("fields", "[name, age]"),
+                Map.of("fields", "[name, age]"),
                singletonList(new ExplainResponseNode(
                    "FilterOperator",
-                    ImmutableMap.of("conditions", "and(=(balance, 10000), >(age, 30))"),
+                    Map.of("conditions", "and(=(balance, 10000), >(age, 30))"),
                    singletonList(tableScan.explainNode()))))),
        explain.apply(plan));
  }

  @Test
  void can_explain_aggregations() {
-    List aggExprs = ImmutableList.of(ref("balance", DOUBLE));
-    List aggList = ImmutableList.of(
+    List aggExprs = List.of(ref("balance", DOUBLE));
+    List aggList = List.of(
        named("avg(balance)", DSL.avg(aggExprs.toArray(new Expression[0]))));
-    List groupByList = ImmutableList.of(
+    List groupByList = List.of(
        named("state", ref("state", STRING)));

    PhysicalPlan plan = agg(new FakeTableScan(), aggList, groupByList);
@@ -104,7 +104,7 @@ void can_explain_aggregations() {
        new ExplainResponse(
            new ExplainResponseNode(
                "AggregationOperator",
-                ImmutableMap.of(
+                Map.of(
                    "aggregators", "[avg(balance)]",
                    "groupBy", "[state]"),
                singletonList(tableScan.explainNode()))),
@@ -120,7 +120,7 @@ void can_explain_rare_top_n() {
        new ExplainResponse(
            new ExplainResponseNode(
                "RareTopNOperator",
-                ImmutableMap.of(
+                Map.of(
                    "commandType", TOP,
                    "noOfResults", 10,
                    "fields", "[state]",
@@ -131,8 +131,8 @@
  @Test
  void can_explain_window() {
-    List partitionByList = ImmutableList.of(DSL.ref("state", STRING));
-    List> sortList = ImmutableList.of(
+    List partitionByList = List.of(DSL.ref("state", STRING));
+    List> sortList = List.of(
        ImmutablePair.of(DEFAULT_ASC, ref("age", INTEGER)));
    PhysicalPlan plan = window(tableScan, named(DSL.rank()),
@@ -142,12 +142,12 @@ void can_explain_window() {
        new ExplainResponse(
            new ExplainResponseNode(
                "WindowOperator",
-                ImmutableMap.of(
+                Map.of(
                    "function", "rank()",
-                    "definition", ImmutableMap.of(
+                    "definition", Map.of(
                        "partitionBy", "[state]",
-                        "sortList", ImmutableMap.of(
-                            "age", ImmutableMap.of(
+                        "sortList", Map.of(
+                            "age", Map.of(
                                "sortOrder", "ASC",
                                "nullOrder", "NULL_FIRST")))),
                singletonList(tableScan.explainNode()))),
@@ -157,14 +157,14 @@
  @Test
  void can_explain_other_operators() {
    ReferenceExpression[] removeList = {ref("state", STRING)};
-    Map renameMapping = ImmutableMap.of(
+    Map renameMapping = Map.of(
        ref("state", STRING), ref("s", STRING));
    Pair evalExprs = ImmutablePair.of(
        ref("age", INTEGER), DSL.add(ref("age", INTEGER), literal(2)));
    Expression[] dedupeList = {ref("age", INTEGER)};
    Pair sortList = ImmutablePair.of(
        DEFAULT_ASC, ref("age", INTEGER));
-    List values = ImmutableList.of(literal("WA"), literal(30));
+    List values = List.of(literal("WA"), literal(30));

    PhysicalPlan plan =
        remove(
@@ -183,30 +183,30 @@ void can_explain_other_operators() {
        new ExplainResponse(
            new ExplainResponseNode(
                "RemoveOperator",
-                ImmutableMap.of("removeList", "[state]"),
+                Map.of("removeList", "[state]"),
                singletonList(new ExplainResponseNode(
                    "RenameOperator",
-                    ImmutableMap.of("mapping", ImmutableMap.of("state", "s")),
+                    Map.of("mapping", Map.of("state", "s")),
                    singletonList(new ExplainResponseNode(
                        "EvalOperator",
-                        ImmutableMap.of("expressions", ImmutableMap.of("age", "+(age, 2)")),
+                        Map.of("expressions", Map.of("age", "+(age, 2)")),
                        singletonList(new ExplainResponseNode(
                            "DedupeOperator",
-                            ImmutableMap.of(
+                            Map.of(
                                "dedupeList", "[age]",
                                "allowedDuplication", 1,
                                "keepEmpty", false,
                                "consecutive", false),
                            singletonList(new ExplainResponseNode(
                                "SortOperator",
-                                ImmutableMap.of(
-                                    "sortList", ImmutableMap.of(
-                                        "age", ImmutableMap.of(
+                                Map.of(
+                                    "sortList", Map.of(
+                                        "age", Map.of(
                                            "sortOrder", "ASC",
                                            "nullOrder", "NULL_FIRST"))),
                                singletonList(new ExplainResponseNode(
                                    "ValuesOperator",
-                                    ImmutableMap.of("values", ImmutableList.of(values)),
+                                    Map.of("values", List.of(values)),
                                    emptyList())))))))))))
        ),
        explain.apply(plan)
@@ -220,7 +220,24 @@ void can_explain_limit() {
        new ExplainResponse(
            new ExplainResponseNode(
                "LimitOperator",
-                ImmutableMap.of("limit", 10, "offset", 5),
+                Map.of("limit", 10, "offset", 5),
+                singletonList(tableScan.explainNode()))),
+        explain.apply(plan)
+    );
+  }
+
+  @Test
+  void can_explain_nested() {
+    Set nestedOperatorArgs = Set.of("message.info", "message");
+    Map> groupedFieldsByPath =
+        Map.of("message", List.of("message.info"));
+    PhysicalPlan plan = nested(tableScan, nestedOperatorArgs, groupedFieldsByPath);
+
+    assertEquals(
+        new ExplainResponse(
+            new ExplainResponseNode(
+                "NestedOperator",
+                Map.of("nested", Set.of("message.info", "message")),
                singletonList(tableScan.explainNode()))),
        explain.apply(plan)
    );
@@ -246,7 +263,7 @@ public String toString() {
    public ExplainResponseNode explainNode() {
      return new ExplainResponseNode(
          "FakeTableScan",
-          ImmutableMap.of("request", "Fake DSL request"),
+          Map.of("request", "Fake DSL request"),
          emptyList());
    }
diff --git a/core/src/test/java/org/opensearch/sql/expression/ReferenceExpressionTest.java b/core/src/test/java/org/opensearch/sql/expression/ReferenceExpressionTest.java
index d3b44fe6a14..46aae069bbd 100644
--- a/core/src/test/java/org/opensearch/sql/expression/ReferenceExpressionTest.java
+++ b/core/src/test/java/org/opensearch/sql/expression/ReferenceExpressionTest.java
@@ -35,6 +35,7 @@
 import org.junit.jupiter.api.DisplayNameGeneration;
 import org.junit.jupiter.api.DisplayNameGenerator;
 import org.junit.jupiter.api.Test;
+import org.opensearch.sql.data.model.ExprCollectionValue;
 import org.opensearch.sql.data.model.ExprIntegerValue;
 import org.opensearch.sql.data.model.ExprStringValue;
 import org.opensearch.sql.data.model.ExprTupleValue;
@@ -126,6 +127,16 @@ public void innner_none_object_field_contain_dot() {
    assertEquals(1990, actualValue.integerValue());
  }

+  @Test
+  public void array_with_multiple_path_value() {
+    ReferenceExpression expr = new ReferenceExpression("message.info", STRING);
+    ExprValue actualValue = expr.resolve(tuple());
+
+    assertEquals(STRING, actualValue.type());
+    // Array of object, only first index is used
+    assertEquals("First message in array", actualValue.stringValue());
+  }
+
  /**
   * {
   *   "name": "bob smith"
@@ -140,7 +151,11 @@
   *   },
   *   "address.local": {
   *     "state": "WA",
-   *   }
+   *   },
+   *   "message": [
+   *     { "info": "message in array" },
+   *     { "info": "Only first index of array used" }
+   *   ]
   * }
   */
  private ExprTupleValue tuple() {
@@ -151,12 +166,29 @@ private ExprTupleValue tuple() {
        ExprValueUtils.tupleValue(ImmutableMap.of("year", 2020));
    ExprValue addressLocal = ExprValueUtils.tupleValue(ImmutableMap.of("state", "WA"));
+    ExprValue messageCollectionValue =
+        new ExprCollectionValue(
+            ImmutableList.of(
+                ExprValueUtils.tupleValue(
+                    ImmutableMap.of(
+                        "info", stringValue("First message in array")
+                    )
+                ),
+                ExprValueUtils.tupleValue(
+                    ImmutableMap.of(
+                        "info", stringValue("Only first index of array used")
+                    )
+                )
+            )
+        );
+
    ExprTupleValue tuple = ExprTupleValue.fromExprValueMap(ImmutableMap.of(
        "name", new ExprStringValue("bob smith"),
        "project.year", new ExprIntegerValue(1990),
        "project", project,
        "address", address,
-        "address.local", addressLocal
+        "address.local", addressLocal,
+        "message", messageCollectionValue
    ));
    return tuple;
  }
diff --git a/core/src/test/java/org/opensearch/sql/expression/function/OpenSearchFunctionsTest.java b/core/src/test/java/org/opensearch/sql/expression/function/OpenSearchFunctionsTest.java
index 6e4fff2fb0b..d90d8295c43 100644
--- a/core/src/test/java/org/opensearch/sql/expression/function/OpenSearchFunctionsTest.java
+++ b/core/src/test/java/org/opensearch/sql/expression/function/OpenSearchFunctionsTest.java
@@ -8,24 +8,28 @@
 import static org.junit.jupiter.api.Assertions.assertEquals;
 import static org.junit.jupiter.api.Assertions.assertThrows;
 import static org.opensearch.sql.data.type.ExprCoreType.BOOLEAN;
+import static org.opensearch.sql.data.type.ExprCoreType.STRING;

-import com.google.common.collect.ImmutableMap;
 import java.util.LinkedHashMap;
 import java.util.List;
+import java.util.Map;
 import org.junit.jupiter.api.Test;
 import org.opensearch.sql.data.model.ExprTupleValue;
+import org.opensearch.sql.data.model.ExprValue;
 import org.opensearch.sql.data.model.ExprValueUtils;
 import org.opensearch.sql.expression.DSL;
+import org.opensearch.sql.expression.Expression;
 import org.opensearch.sql.expression.ExpressionTestBase;
 import org.opensearch.sql.expression.FunctionExpression;
 import org.opensearch.sql.expression.NamedArgumentExpression;
+import org.opensearch.sql.expression.env.Environment;

 public class OpenSearchFunctionsTest extends ExpressionTestBase {
   private final NamedArgumentExpression field = new NamedArgumentExpression(
       "field", DSL.literal("message"));
   private final NamedArgumentExpression fields = new NamedArgumentExpression(
-      "fields", DSL.literal(new ExprTupleValue(new LinkedHashMap<>(ImmutableMap.of(
+      "fields", DSL.literal(new ExprTupleValue(new LinkedHashMap<>(Map.of(
          "title", ExprValueUtils.floatValue(1.F),
          "body", ExprValueUtils.floatValue(.3F))))));
   private final NamedArgumentExpression query = new NamedArgumentExpression(
@@ -205,4 +209,16 @@ void wildcard_query() {
        field.getValue(), query.getValue()),
        expr.toString());
  }
+
+  @Test
+  void nested_query() {
+    FunctionExpression expr = DSL.nested(DSL.ref("message.info", STRING));
+    assertEquals(String.format("FunctionExpression(functionName=%s, arguments=[message.info])",
+        BuiltinFunctionName.NESTED.getName()),
+        expr.toString());
+    Environment nestedTuple = ExprValueUtils.tupleValue(
+        Map.of("message", Map.of("info", "result"))).bindingTuples();
+    assertEquals(expr.valueOf(nestedTuple), ExprValueUtils.stringValue("result"));
+    assertEquals(expr.type(), STRING);
+  }
 }
diff --git a/core/src/test/java/org/opensearch/sql/planner/DefaultImplementorTest.java b/core/src/test/java/org/opensearch/sql/planner/DefaultImplementorTest.java
index da3f5315e46..768ab279311 100644
--- a/core/src/test/java/org/opensearch/sql/planner/DefaultImplementorTest.java
+++ b/core/src/test/java/org/opensearch/sql/planner/DefaultImplementorTest.java
@@ -18,6 +18,7 @@
 import static org.opensearch.sql.planner.logical.LogicalPlanDSL.eval;
 import static org.opensearch.sql.planner.logical.LogicalPlanDSL.filter;
 import static org.opensearch.sql.planner.logical.LogicalPlanDSL.limit;
+import static org.opensearch.sql.planner.logical.LogicalPlanDSL.nested;
 import static org.opensearch.sql.planner.logical.LogicalPlanDSL.project;
 import static org.opensearch.sql.planner.logical.LogicalPlanDSL.rareTopN;
 import static org.opensearch.sql.planner.logical.LogicalPlanDSL.remove;
@@ -31,6 +32,7 @@
 import java.util.Collections;
 import java.util.List;
 import java.util.Map;
+import java.util.Set;
 import org.apache.commons.lang3.tuple.ImmutablePair;
 import org.apache.commons.lang3.tuple.Pair;
 import org.junit.jupiter.api.DisplayNameGeneration;
@@ -95,59 +97,81 @@ public void visit_should_return_default_physical_operator() {
        ImmutablePair.of(Sort.SortOption.DEFAULT_ASC, ref("name1", STRING));
    Integer limit = 1;
    Integer offset = 1;
+    List> nestedArgs = List.of(
+        Map.of(
+            "field", new ReferenceExpression("message.info", STRING),
+            "path", new ReferenceExpression("message", STRING)
+        )
+    );
+    List nestedProjectList =
+        List.of(
+            new NamedExpression(
+                "message.info",
+                DSL.nested(DSL.ref("message.info", STRING)),
+                null
+            )
+        );
+    Set nestedOperatorArgs = Set.of("message.info");
+    Map> groupedFieldsByPath =
+        Map.of("message", List.of("message.info"));
+
    LogicalPlan plan =
        project(
-            limit(
-                LogicalPlanDSL.dedupe(
-                    rareTopN(
-                        sort(
-                            eval(
-                                remove(
-                                    rename(
-                                        aggregation(
-                                            filter(values(emptyList()), filterExpr),
-                                            aggregators,
-                                            groupByExprs),
-                                        mappings),
-                                    exclude),
-                                newEvalField),
-                            sortField),
-                        CommandType.TOP,
-                        topByExprs,
-                        rareTopNField),
-                    dedupeField),
-                limit,
-                offset),
+            nested(
+                limit(
+                    LogicalPlanDSL.dedupe(
+                        rareTopN(
+                            sort(
+                                eval(
+                                    remove(
+                                        rename(
+                                            aggregation(
+                                                filter(values(emptyList()), filterExpr),
+                                                aggregators,
+                                                groupByExprs),
+                                            mappings),
+                                        exclude),
+                                    newEvalField),
+                                sortField),
+                            CommandType.TOP,
+                            topByExprs,
+                            rareTopNField),
+                        dedupeField),
+                    limit,
+                    offset),
+                nestedArgs, nestedProjectList),
            include);

    PhysicalPlan actual = plan.accept(implementor, null);

    assertEquals(
        PhysicalPlanDSL.project(
-            PhysicalPlanDSL.limit(
-                PhysicalPlanDSL.dedupe(
-                    PhysicalPlanDSL.rareTopN(
-                        PhysicalPlanDSL.sort(
-                            PhysicalPlanDSL.eval(
-                                PhysicalPlanDSL.remove(
-                                    PhysicalPlanDSL.rename(
-                                        PhysicalPlanDSL.agg(
-                                            PhysicalPlanDSL.filter(
-                                                PhysicalPlanDSL.values(emptyList()),
-                                                filterExpr),
-                                            aggregators,
-                                            groupByExprs),
-                                        mappings),
-                                    exclude),
-                                newEvalField),
-                            sortField),
-                        CommandType.TOP,
-                        topByExprs,
-                        rareTopNField),
-                    dedupeField),
-                limit,
-                offset),
+            PhysicalPlanDSL.nested(
+                PhysicalPlanDSL.limit(
+                    PhysicalPlanDSL.dedupe(
+                        PhysicalPlanDSL.rareTopN(
+                            PhysicalPlanDSL.sort(
+                                PhysicalPlanDSL.eval(
+                                    PhysicalPlanDSL.remove(
+                                        PhysicalPlanDSL.rename(
+                                            PhysicalPlanDSL.agg(
+                                                PhysicalPlanDSL.filter(
+                                                    PhysicalPlanDSL.values(emptyList()),
+                                                    filterExpr),
+                                                aggregators,
+                                                groupByExprs),
+                                            mappings),
+                                        exclude),
+                                    newEvalField),
+                                sortField),
+                            CommandType.TOP,
+                            topByExprs,
+                            rareTopNField),
+                        dedupeField),
+                    limit,
+                    offset),
+                nestedOperatorArgs, groupedFieldsByPath),
            include),
        actual);
  }
diff --git a/core/src/test/java/org/opensearch/sql/planner/logical/LogicalPlanNodeVisitorTest.java b/core/src/test/java/org/opensearch/sql/planner/logical/LogicalPlanNodeVisitorTest.java
index c9d74fa8712..d31aeff04cc 100644
--- a/core/src/test/java/org/opensearch/sql/planner/logical/LogicalPlanNodeVisitorTest.java
+++ b/core/src/test/java/org/opensearch/sql/planner/logical/LogicalPlanNodeVisitorTest.java
@@ -9,6 +9,7 @@
 import static org.junit.jupiter.api.Assertions.assertEquals;
 import static org.junit.jupiter.api.Assertions.assertNull;
 import static org.mockito.Mockito.mock;
+import static org.opensearch.sql.data.type.ExprCoreType.STRING;
 import static org.opensearch.sql.expression.DSL.named;

 import com.google.common.collect.ImmutableList;
@@ -31,6 +32,7 @@
 import org.opensearch.sql.expression.DSL;
 import org.opensearch.sql.expression.Expression;
 import org.opensearch.sql.expression.LiteralExpression;
+import org.opensearch.sql.expression.NamedExpression;
 import org.opensearch.sql.expression.ReferenceExpression;
 import org.opensearch.sql.expression.aggregation.Aggregator;
 import org.opensearch.sql.expression.window.WindowDefinition;
@@ -114,10 +116,22 @@ public TableWriteOperator build(PhysicalPlan child) {
    LogicalPlan ad = new LogicalAD(relation, Map.of());
    LogicalPlan ml = new LogicalML(relation, Map.of());
    LogicalPlan paginate = new LogicalPaginate(42, List.of(relation));
+    List> nestedArgs = List.of(
+        Map.of(
+            "field", new ReferenceExpression("message.info", STRING),
+            "path", new ReferenceExpression("message", STRING)
+        )
+    );
+    List projectList =
+        List.of(
+            new NamedExpression("message.info", DSL.nested(DSL.ref("message.info", STRING)), null)
+        );
+
+    LogicalNested nested = new LogicalNested(null, nestedArgs, projectList);

    return Stream.of(
        relation, tableScanBuilder, write, tableWriteBuilder, filter, aggregation, rename, project,
-        remove, eval, sort, dedup, window, rareTopN, highlight, mlCommons, ad, ml, paginate
+        remove, eval, sort, dedup, window, rareTopN, highlight, mlCommons, ad, ml, paginate, nested
    ).map(Arguments::of);
  }
diff --git a/core/src/test/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizerTest.java b/core/src/test/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizerTest.java
index aae05f9da41..2083fdef9cb 100644
--- a/core/src/test/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizerTest.java
+++ b/core/src/test/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizerTest.java
@@ -18,10 +18,12 @@
 import static org.opensearch.sql.data.model.ExprValueUtils.longValue;
 import static org.opensearch.sql.data.type.ExprCoreType.INTEGER;
 import static org.opensearch.sql.data.type.ExprCoreType.LONG;
+import static org.opensearch.sql.data.type.ExprCoreType.STRING;
 import static org.opensearch.sql.planner.logical.LogicalPlanDSL.aggregation;
 import static org.opensearch.sql.planner.logical.LogicalPlanDSL.filter;
 import static org.opensearch.sql.planner.logical.LogicalPlanDSL.highlight;
 import static org.opensearch.sql.planner.logical.LogicalPlanDSL.limit;
+import static org.opensearch.sql.planner.logical.LogicalPlanDSL.nested;
 import static org.opensearch.sql.planner.logical.LogicalPlanDSL.paginate;
 import static org.opensearch.sql.planner.logical.LogicalPlanDSL.project;
 import static org.opensearch.sql.planner.logical.LogicalPlanDSL.relation;
@@ -46,6 +48,8 @@
 import org.opensearch.sql.ast.tree.Sort;
 import org.opensearch.sql.data.type.ExprType;
 import org.opensearch.sql.expression.DSL;
+import org.opensearch.sql.expression.NamedExpression;
+import org.opensearch.sql.expression.ReferenceExpression;
 import org.opensearch.sql.planner.logical.LogicalPaginate;
 import org.opensearch.sql.planner.logical.LogicalPlan;
 import org.opensearch.sql.planner.logical.LogicalRelation;
@@ -270,6 +274,25 @@ void table_scan_builder_support_highlight_push_down_can_apply_its_rule() {
    );
  }

+  @Test
+  void table_scan_builder_support_nested_push_down_can_apply_its_rule() {
+    when(tableScanBuilder.pushDownNested(any())).thenReturn(true);
+
+    assertEquals(
+        tableScanBuilder,
+        optimize(
+            nested(
+                relation("schema", table),
+                List.of(Map.of("field", new ReferenceExpression("message.info", STRING))),
+                List.of(new NamedExpression(
+                    "message.info",
+                    DSL.nested(DSL.ref("message.info", STRING)),
+                    null))
+            )
+        )
+    );
+  }
+
  @Test
  void table_not_support_scan_builder_should_not_be_impact() {
    Table table = new Table() {
diff --git a/core/src/test/java/org/opensearch/sql/planner/physical/NestedOperatorTest.java b/core/src/test/java/org/opensearch/sql/planner/physical/NestedOperatorTest.java
new file mode 100644
index 00000000000..58fc0d4566f
--- /dev/null
+++ b/core/src/test/java/org/opensearch/sql/planner/physical/NestedOperatorTest.java
@@ -0,0 +1,347 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.sql.planner.physical;
+
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.contains;
+import static org.junit.jupiter.api.Assertions.assertEquals;
+import static org.junit.jupiter.api.Assertions.assertTrue;
+import static org.mockito.Mockito.when;
+import static org.opensearch.sql.data.model.ExprValueUtils.collectionValue;
+import static org.opensearch.sql.data.model.ExprValueUtils.tupleValue;
+import static org.opensearch.sql.data.type.ExprCoreType.STRING;
+
+import java.util.ArrayList;
+import java.util.LinkedHashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.mockito.Mock;
+import org.mockito.junit.jupiter.MockitoExtension;
+import org.opensearch.sql.data.model.ExprNullValue;
+import org.opensearch.sql.data.model.ExprValue;
+import org.opensearch.sql.expression.ReferenceExpression;
+
+@ExtendWith(MockitoExtension.class)
+class NestedOperatorTest extends PhysicalPlanTestBase {
+  @Mock
+  private PhysicalPlan inputPlan;
+
+  private final ExprValue testData = tupleValue(
+      Map.of(
+          "message",
+          collectionValue(
+              List.of(
+                  Map.of("info", "a"),
+                  Map.of("info", "b"),
+                  Map.of("info", "c")
+              )
+          ),
+          "comment",
+          collectionValue(
+              List.of(
+                  Map.of("data", "1"),
+                  Map.of("data", "2"),
+                  Map.of("data", "3")
+              )
+          )
+      )
+  );
+
+
+  private final ExprValue testDataWithSamePath = tupleValue(
+      Map.of(
+          "message",
+          collectionValue(
+              List.of(
+                  Map.of("info", "a", "id", "1"),
+                  Map.of("info", "b", "id", "2"),
+                  Map.of("info", "c", "id", "3")
+              )
+          )
+      )
+  );
+
+  private final ExprValue nonNestedTestData = tupleValue(
+      Map.of(
+          "message", "val"
+      )
+  );
+
+  private final ExprValue missingArrayData = tupleValue(
+      Map.of(
+          "missing",
+          collectionValue(
+              List.of("value")
+          )
+      )
+  );
+
+  @Test
+  public void nested_one_nested_field() {
+    when(inputPlan.hasNext()).thenReturn(true, false);
+    when(inputPlan.next())
+        .thenReturn(testData);
+
+    Set fields = Set.of("message.info");
+    Map> groupedFieldsByPath =
+        Map.of("message", List.of("message.info"));
+
+    var nested = new NestedOperator(inputPlan, fields, groupedFieldsByPath);
+
+    assertThat(
+        execute(nested),
+        contains(
+            tupleValue(
+                new LinkedHashMap<>() {{
+                    put("message.info", "a");
+                    put("comment", collectionValue(
+                        new ArrayList<>() {{
+                            add(new LinkedHashMap<>() {{
+                                put("data", "1");
+                              }}
+                            );
+                            add(new LinkedHashMap<>() {{
+                                put("data", "2");
+                              }}
+                            );
+                            add(new LinkedHashMap<>() {{
+                                put("data", "3");
+                              }}
+                            );
+                          }}
+                    ));
+                  }}
+            ),
+            tupleValue(
+                new LinkedHashMap<>() {{
+                    put("message.info", "b");
+                    put("comment", collectionValue(
+                        new ArrayList<>() {{
+                            add(new LinkedHashMap<>() {{
+                                put("data", "1");
+                              }}
+                            );
+                            add(new LinkedHashMap<>() {{
+                                put("data", "2");
+                              }}
+                            );
+                            add(new LinkedHashMap<>() {{
+                                put("data", "3");
+                              }}
+                            );
+                          }}
+                    ));
+                  }}
+            ),
+            tupleValue(
+                new LinkedHashMap<>() {{
+                    put("message.info", "c");
+                    put("comment", collectionValue(
+                        new ArrayList<>() {{
+                            add(new LinkedHashMap<>() {{
+                                put("data", "1");
+                              }}
+                            );
+                            add(new LinkedHashMap<>() {{
+                                put("data", "2");
+                              }}
+                            );
+                            add(new LinkedHashMap<>() {{
+                                put("data", "3");
+                              }}
+                            );
+                          }}
+                    ));
+                  }}
+            )
+        )
+    );
+    assertEquals(3, nested.getTotalHits());
+  }
+
+  @Test
+  public void nested_two_nested_field() {
+    when(inputPlan.hasNext()).thenReturn(true, false);
+    when(inputPlan.next())
+        .thenReturn(testData);
+
+    List> fields =
+        List.of(
+            Map.of(
+                "field", new ReferenceExpression("message.info", STRING),
+                "path", new ReferenceExpression("message", STRING)),
+            Map.of(
+                "field", new ReferenceExpression("comment.data", STRING),
+                "path", new ReferenceExpression("comment", STRING))
+        );
+    var nested = new NestedOperator(inputPlan, fields);
+
+    assertThat(
+        execute(nested),
+        contains(
+            tupleValue(
+                new LinkedHashMap<>() {{
+                    put("message.info", "a");
+                    put("comment.data", "1");
+                  }}
+            ),
+            tupleValue(
+                new LinkedHashMap<>() {{
+                    put("message.info", "a");
+                    put("comment.data", "2");
+                  }}
+            ),
+            tupleValue(
+                new LinkedHashMap<>() {{
+                    put("message.info", "a");
+                    put("comment.data", "3");
+                  }}
+            ),
+            tupleValue(
+                new LinkedHashMap<>() {{
+                    put("message.info", "b");
+                    put("comment.data", "1");
+                  }}
+            ),
+            tupleValue(
+                new LinkedHashMap<>() {{
+                    put("message.info", "b");
+                    put("comment.data", "2");
+                  }}
+            ),
+            tupleValue(
+                new LinkedHashMap<>() {{
+                    put("message.info", "b");
+                    put("comment.data", "3");
+                  }}
+            ),
+            tupleValue(
+                new LinkedHashMap<>() {{
+                    put("message.info", "c");
+                    put("comment.data", "1");
+                  }}
+            ),
+            tupleValue(
+                new LinkedHashMap<>() {{
+                    put("message.info", "c");
+                    put("comment.data", "2");
+                  }}
+            ),
+            tupleValue(
+                new LinkedHashMap<>() {{
+                    put("message.info", "c");
+                    put("comment.data", "3");
+                  }}
+            )
+        )
+    );
+    assertEquals(9, nested.getTotalHits());
+  }
+
+  @Test
+  public void nested_two_nested_fields_with_same_path() {
+    when(inputPlan.hasNext()).thenReturn(true, false);
+    when(inputPlan.next())
+        .thenReturn(testDataWithSamePath);
+
+    List> fields =
+        List.of(
+            Map.of(
+                "field", new ReferenceExpression("message.info", STRING),
+                "path", new ReferenceExpression("message", STRING)),
+            Map.of(
+                "field", new ReferenceExpression("message.id", STRING),
+                "path", new ReferenceExpression("message", STRING))
+        );
+    var nested = new NestedOperator(inputPlan, fields);
+
+    assertThat(
+        execute(nested),
+        contains(
+            tupleValue(
+                new LinkedHashMap<>() {{
+                    put("message.info", "a");
+                    put("message.id", "1");
+                  }}
+            ),
+            tupleValue(
+                new LinkedHashMap<>() {{
+                    put("message.info", "b");
+                    put("message.id", "2");
+                  }}
+            ),
+            tupleValue(
+                new LinkedHashMap<>() {{
+                    put("message.info", "c");
+                    put("message.id", "3");
+                  }}
+            )
+        )
+    );
+    assertEquals(3, nested.getTotalHits());
+  }
+
+  @Test
+  public void non_nested_field_tests() {
+    when(inputPlan.hasNext()).thenReturn(true, false);
+    when(inputPlan.next())
+        .thenReturn(nonNestedTestData);
+
+    Set fields = Set.of("message");
+    Map> groupedFieldsByPath =
+        Map.of("message", List.of("message.info"));
+
+    var nested = new NestedOperator(inputPlan, fields, groupedFieldsByPath);
+    assertThat(
+        execute(nested),
+        contains(
+            tupleValue(new LinkedHashMap<>(Map.of("message", "val")))
+        )
+    );
+    assertEquals(1, nested.getTotalHits());
+  }
+
+  @Test
+  public void nested_missing_tuple_field() {
+    when(inputPlan.hasNext()).thenReturn(true, false);
+    when(inputPlan.next())
+        .thenReturn(tupleValue(Map.of()));
+    Set fields = Set.of("message.val");
+    Map> groupedFieldsByPath =
+        Map.of("message", List.of("message.val"));
+
+    var nested = new NestedOperator(inputPlan, fields, groupedFieldsByPath);
+    assertThat(
+        execute(nested),
+        contains(
+            tupleValue(new LinkedHashMap<>(Map.of("message.val", ExprNullValue.of())))
+        )
+    );
+    assertEquals(1, nested.getTotalHits());
+  }
+
+  @Test
+  public void nested_missing_array_field() {
+    when(inputPlan.hasNext()).thenReturn(true, false);
+    when(inputPlan.next())
+        .thenReturn(missingArrayData);
+    Set fields = Set.of("missing.data");
+    Map> groupedFieldsByPath =
+        Map.of("message", List.of("message.data"));
+
+    var nested = new NestedOperator(inputPlan, fields, groupedFieldsByPath);
+    assertTrue(
+        execute(nested)
+            .get(0)
+            .tupleValue()
+            .size() == 0
+    );
+    assertEquals(1, nested.getTotalHits());
+  }
+}
diff --git a/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitorTest.java b/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitorTest.java
index 3dfe0b5c0fd..2e6ce64ac62 100644
--- a/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitorTest.java
+++ b/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitorTest.java
@@ -16,6 +16,9 @@
 import com.google.common.base.Strings;
 import com.google.common.collect.ImmutableList;
 import com.google.common.collect.ImmutableMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
 import org.apache.commons.lang3.tuple.Pair;
 import org.junit.jupiter.api.Test;
 import org.junit.jupiter.api.extension.ExtendWith;
@@ -132,6 +135,13 @@ public void test_PhysicalPlanVisitor_should_return_null() {
    PhysicalPlan limit = PhysicalPlanDSL.limit(plan, 1, 1);
    assertNull(limit.accept(new PhysicalPlanNodeVisitor() {
    }, null));
+
+    Set nestedArgs = Set.of("nested.test");
+    Map> groupedFieldsByPath =
+        Map.of("nested", List.of("nested.test"));
+    PhysicalPlan nested = new NestedOperator(plan, nestedArgs, groupedFieldsByPath);
+    assertNull(nested.accept(new PhysicalPlanNodeVisitor() {
+    }, null));
  }

  @Test
diff --git a/core/src/test/java/org/opensearch/sql/planner/physical/ProjectOperatorTest.java b/core/src/test/java/org/opensearch/sql/planner/physical/ProjectOperatorTest.java
index 989cdf7471e..77fcb7a5054 100644
--- a/core/src/test/java/org/opensearch/sql/planner/physical/ProjectOperatorTest.java
+++ b/core/src/test/java/org/opensearch/sql/planner/physical/ProjectOperatorTest.java
@@ -235,7 +235,7 @@ public void serializable() {
    assertEquals(project, roundTripPlan);
  }

-  @EqualsAndHashCode
+  @EqualsAndHashCode(callSuper = false)
  public static class TestOperator extends PhysicalPlan implements SerializablePlan {
    @Override
diff --git a/core/src/test/java/org/opensearch/sql/planner/physical/datasource/DataSourceTableScanTest.java b/core/src/test/java/org/opensearch/sql/planner/physical/datasource/DataSourceTableScanTest.java
index 0f95f05944a..93c02def86c 100644
--- a/core/src/test/java/org/opensearch/sql/planner/physical/datasource/DataSourceTableScanTest.java
+++ b/core/src/test/java/org/opensearch/sql/planner/physical/datasource/DataSourceTableScanTest.java
@@ -62,7 +62,7 @@ void testIterator() {
        .map(dataSource -> new DataSourceMetadata(dataSource.getName(),
            dataSource.getConnectorType(), Collections.emptyList(), ImmutableMap.of()))
        .collect(Collectors.toSet());
-    when(dataSourceService.getDataSourceMetadataSet()).thenReturn(dataSourceMetadata);
+    when(dataSourceService.getDataSourceMetadata(true)).thenReturn(dataSourceMetadata);

    assertFalse(dataSourceTableScan.hasNext());
    dataSourceTableScan.open();
diff --git a/datasources/build.gradle b/datasources/build.gradle
new file mode 100644
index 00000000000..ef52db23055
--- /dev/null
+++ b/datasources/build.gradle
@@ -0,0 +1,82 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+plugins {
+    id 'java-library'
+    id "io.freefair.lombok"
+    id 'jacoco'
+}
+
+repositories {
+    mavenCentral()
+}
+
+dependencies {
+    implementation project(':core')
+    implementation project(':protocol')
+    implementation group: 'org.opensearch', name: 'opensearch', version: "${opensearch_version}"
+    implementation group: 'org.opensearch', name: 'opensearch-x-content', version: "${opensearch_version}"
+    implementation group: 'org.opensearch', name: 'common-utils', version: "${opensearch_build}"
+    implementation group: 'commons-io', name: 'commons-io', version: '2.8.0'
+    implementation 'com.amazonaws:aws-encryption-sdk-java:2.4.0'
+
+    testImplementation group: 'junit', name: 'junit', version: '4.13.2'
+    testImplementation('org.junit.jupiter:junit-jupiter:5.6.2')
+    testImplementation group: 'net.bytebuddy', name: 'byte-buddy-agent', version: '1.12.13'
+    testImplementation group: 'org.hamcrest', name: 'hamcrest-library', version: '2.1'
+    testImplementation group: 'org.mockito', name: 'mockito-core', version: '5.2.0'
+    testImplementation group: 'org.mockito', name: 'mockito-junit-jupiter', version: '5.2.0'
+    testImplementation 'org.junit.jupiter:junit-jupiter:5.6.2'
+}
+
+test {
+    useJUnitPlatform()
+    testLogging {
+        events "passed", "skipped", "failed"
+        exceptionFormat "full"
+    }
+}
+
+jacocoTestReport {
+    reports {
+        html.enabled true
+        xml.enabled true
+    }
+    afterEvaluate {
+        classDirectories.setFrom(files(classDirectories.files.collect {
+            fileTree(dir: it)
+        }))
+    }
+}
+test.finalizedBy(project.tasks.jacocoTestReport)
+
+jacocoTestCoverageVerification {
+    violationRules {
+        rule {
+            element = 'CLASS'
+            excludes = [
+                    'org.opensearch.sql.datasources.settings.DataSourceSettings',
+                    'org.opensearch.sql.datasources.exceptions.*',
+                    'org.opensearch.sql.datasources.model.*',
'org.opensearch.sql.datasources.rest.*' + ] + limit { + counter = 'LINE' + minimum = 1.0 + } + limit { + counter = 'BRANCH' + minimum = 0.9 + } + } + } + afterEvaluate { + classDirectories.setFrom(files(classDirectories.files.collect { + fileTree(dir: it) + })) + } +} +check.dependsOn jacocoTestCoverageVerification +jacocoTestCoverageVerification.dependsOn jacocoTestReport diff --git a/datasources/lombok.config b/datasources/lombok.config new file mode 100644 index 00000000000..aac13295bd7 --- /dev/null +++ b/datasources/lombok.config @@ -0,0 +1,3 @@ +# This file is generated by the 'io.freefair.lombok' Gradle plugin +config.stopBubbling = true +lombok.addLombokGeneratedAnnotation = true \ No newline at end of file diff --git a/core/src/main/java/org/opensearch/sql/datasource/model/auth/AuthenticationType.java b/datasources/src/main/java/org/opensearch/sql/datasources/auth/AuthenticationType.java similarity index 94% rename from core/src/main/java/org/opensearch/sql/datasource/model/auth/AuthenticationType.java rename to datasources/src/main/java/org/opensearch/sql/datasources/auth/AuthenticationType.java index 9cf3e015095..715e72c0c3b 100644 --- a/core/src/main/java/org/opensearch/sql/datasource/model/auth/AuthenticationType.java +++ b/datasources/src/main/java/org/opensearch/sql/datasources/auth/AuthenticationType.java @@ -5,7 +5,7 @@ * */ -package org.opensearch.sql.datasource.model.auth; +package org.opensearch.sql.datasources.auth; import java.util.Collections; import java.util.HashMap; diff --git a/core/src/main/java/org/opensearch/sql/datasource/DataSourceUserAuthorizationHelper.java b/datasources/src/main/java/org/opensearch/sql/datasources/auth/DataSourceUserAuthorizationHelper.java similarity index 79% rename from core/src/main/java/org/opensearch/sql/datasource/DataSourceUserAuthorizationHelper.java rename to datasources/src/main/java/org/opensearch/sql/datasources/auth/DataSourceUserAuthorizationHelper.java index dbbe82a527f..adcfb0bdfde 100644 --- 
a/core/src/main/java/org/opensearch/sql/datasource/DataSourceUserAuthorizationHelper.java +++ b/datasources/src/main/java/org/opensearch/sql/datasources/auth/DataSourceUserAuthorizationHelper.java @@ -1,6 +1,10 @@ -package org.opensearch.sql.datasource; +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.datasources.auth; -import java.util.List; import org.opensearch.sql.datasource.model.DataSourceMetadata; /** diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/datasource/DataSourceUserAuthorizationHelperImpl.java b/datasources/src/main/java/org/opensearch/sql/datasources/auth/DataSourceUserAuthorizationHelperImpl.java similarity index 80% rename from plugin/src/main/java/org/opensearch/sql/plugin/datasource/DataSourceUserAuthorizationHelperImpl.java rename to datasources/src/main/java/org/opensearch/sql/datasources/auth/DataSourceUserAuthorizationHelperImpl.java index 41ad450f68d..cd55991d006 100644 --- a/plugin/src/main/java/org/opensearch/sql/plugin/datasource/DataSourceUserAuthorizationHelperImpl.java +++ b/datasources/src/main/java/org/opensearch/sql/datasources/auth/DataSourceUserAuthorizationHelperImpl.java @@ -3,16 +3,15 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.sql.plugin.datasource; +package org.opensearch.sql.datasources.auth; -import static org.opensearch.commons.ConfigConstants.OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT; import static org.opensearch.sql.analysis.DataSourceSchemaIdentifierNameResolver.DEFAULT_DATASOURCE_NAME; import java.util.List; import lombok.AllArgsConstructor; import org.opensearch.client.Client; +import org.opensearch.commons.ConfigConstants; import org.opensearch.commons.authuser.User; -import org.opensearch.sql.datasource.DataSourceUserAuthorizationHelper; import org.opensearch.sql.datasource.model.DataSourceMetadata; @AllArgsConstructor @@ -21,13 +20,15 @@ public class DataSourceUserAuthorizationHelperImpl implements 
DataSourceUserAuth private Boolean isAuthorizationRequired() { String userString = client.threadPool() - .getThreadContext().getTransient(OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT); + .getThreadContext().getTransient( + ConfigConstants.OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT); return userString != null; } private List getUserRoles() { String userString = client.threadPool() - .getThreadContext().getTransient(OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT); + .getThreadContext().getTransient( + ConfigConstants.OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT); User user = User.parse(userString); return user.getRoles(); } diff --git a/common/src/main/java/org/opensearch/sql/common/encryptor/Encryptor.java b/datasources/src/main/java/org/opensearch/sql/datasources/encryptor/Encryptor.java similarity index 90% rename from common/src/main/java/org/opensearch/sql/common/encryptor/Encryptor.java rename to datasources/src/main/java/org/opensearch/sql/datasources/encryptor/Encryptor.java index a886b723281..55dc1ef18fc 100644 --- a/common/src/main/java/org/opensearch/sql/common/encryptor/Encryptor.java +++ b/datasources/src/main/java/org/opensearch/sql/datasources/encryptor/Encryptor.java @@ -5,7 +5,7 @@ * */ -package org.opensearch.sql.common.encryptor; +package org.opensearch.sql.datasources.encryptor; public interface Encryptor { diff --git a/common/src/main/java/org/opensearch/sql/common/encryptor/EncryptorImpl.java b/datasources/src/main/java/org/opensearch/sql/datasources/encryptor/EncryptorImpl.java similarity index 87% rename from common/src/main/java/org/opensearch/sql/common/encryptor/EncryptorImpl.java rename to datasources/src/main/java/org/opensearch/sql/datasources/encryptor/EncryptorImpl.java index 05a0d358fd3..4838cd41a5f 100644 --- a/common/src/main/java/org/opensearch/sql/common/encryptor/EncryptorImpl.java +++ b/datasources/src/main/java/org/opensearch/sql/datasources/encryptor/EncryptorImpl.java @@ -5,7 +5,7 @@ * */ -package 
org.opensearch.sql.common.encryptor; +package org.opensearch.sql.datasources.encryptor; import com.amazonaws.encryptionsdk.AwsCrypto; import com.amazonaws.encryptionsdk.CommitmentPolicy; @@ -29,8 +29,8 @@ public String encrypt(String plainText) { .build(); JceMasterKey jceMasterKey - = JceMasterKey.getInstance(new SecretKeySpec(masterKey.getBytes(), "AES"), "Custom", "", - "AES/GCM/NoPadding"); + = JceMasterKey.getInstance(new SecretKeySpec(masterKey.getBytes(), "AES"), "Custom", + "opensearch.config.master.key", "AES/GCM/NoPadding"); final CryptoResult encryptResult = crypto.encryptData(jceMasterKey, plainText.getBytes(StandardCharsets.UTF_8)); @@ -44,8 +44,8 @@ public String decrypt(String encryptedText) { .build(); JceMasterKey jceMasterKey - = JceMasterKey.getInstance(new SecretKeySpec(masterKey.getBytes(), "AES"), "Custom", "", - "AES/GCM/NoPadding"); + = JceMasterKey.getInstance(new SecretKeySpec(masterKey.getBytes(), "AES"), "Custom", + "opensearch.config.master.key", "AES/GCM/NoPadding"); final CryptoResult decryptedResult = crypto.decryptData(jceMasterKey, Base64.getDecoder().decode(encryptedText)); diff --git a/datasources/src/main/java/org/opensearch/sql/datasources/exceptions/DataSourceNotFoundException.java b/datasources/src/main/java/org/opensearch/sql/datasources/exceptions/DataSourceNotFoundException.java new file mode 100644 index 00000000000..484b0b92b29 --- /dev/null +++ b/datasources/src/main/java/org/opensearch/sql/datasources/exceptions/DataSourceNotFoundException.java @@ -0,0 +1,18 @@ +/* + * + * * Copyright OpenSearch Contributors + * * SPDX-License-Identifier: Apache-2.0 + * + */ + +package org.opensearch.sql.datasources.exceptions; + +/** + * DataSourceNotFoundException. 
+ */ +public class DataSourceNotFoundException extends RuntimeException { + public DataSourceNotFoundException(String message) { + super(message); + } + +} diff --git a/datasources/src/main/java/org/opensearch/sql/datasources/exceptions/ErrorMessage.java b/datasources/src/main/java/org/opensearch/sql/datasources/exceptions/ErrorMessage.java new file mode 100644 index 00000000000..6dbd9bcfb53 --- /dev/null +++ b/datasources/src/main/java/org/opensearch/sql/datasources/exceptions/ErrorMessage.java @@ -0,0 +1,78 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + + +package org.opensearch.sql.datasources.exceptions; + +import com.google.gson.Gson; +import com.google.gson.JsonObject; +import lombok.Getter; +import org.opensearch.rest.RestStatus; + +/** + * Error Message. + */ +public class ErrorMessage { + + protected Throwable exception; + + private final int status; + + @Getter + private final String type; + + @Getter + private final String reason; + + @Getter + private final String details; + + /** + * Error Message Constructor. + */ + public ErrorMessage(Throwable exception, int status) { + this.exception = exception; + this.status = status; + + this.type = fetchType(); + this.reason = fetchReason(); + this.details = fetchDetails(); + } + + private String fetchType() { + return exception.getClass().getSimpleName(); + } + + protected String fetchReason() { + return status == RestStatus.BAD_REQUEST.getStatus() + ? "Invalid Request" + : "There was internal problem at backend"; + } + + protected String fetchDetails() { + // Some exceptions print internal information (e.g. the full class name), which is a security concern. + return emptyStringIfNull(exception.getLocalizedMessage()); + } + + private String emptyStringIfNull(String str) { + return str != null ? 
str : ""; + } + + @Override + public String toString() { + JsonObject jsonObject = new JsonObject(); + jsonObject.addProperty("status", status); + jsonObject.add("error", getErrorAsJson()); + return new Gson().toJson(jsonObject); + } + + private JsonObject getErrorAsJson() { + JsonObject errorJson = new JsonObject(); + errorJson.addProperty("type", type); + errorJson.addProperty("reason", reason); + errorJson.addProperty("details", details); + return errorJson; + } +} diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/model/CreateDataSourceActionRequest.java b/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/CreateDataSourceActionRequest.java similarity index 96% rename from plugin/src/main/java/org/opensearch/sql/plugin/model/CreateDataSourceActionRequest.java rename to datasources/src/main/java/org/opensearch/sql/datasources/model/transport/CreateDataSourceActionRequest.java index d6a15e3a0cd..333564c10a3 100644 --- a/plugin/src/main/java/org/opensearch/sql/plugin/model/CreateDataSourceActionRequest.java +++ b/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/CreateDataSourceActionRequest.java @@ -5,7 +5,7 @@ * */ -package org.opensearch.sql.plugin.model; +package org.opensearch.sql.datasources.model.transport; import static org.opensearch.sql.analysis.DataSourceSchemaIdentifierNameResolver.DEFAULT_DATASOURCE_NAME; diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/model/CreateDataSourceActionResponse.java b/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/CreateDataSourceActionResponse.java similarity index 92% rename from plugin/src/main/java/org/opensearch/sql/plugin/model/CreateDataSourceActionResponse.java rename to datasources/src/main/java/org/opensearch/sql/datasources/model/transport/CreateDataSourceActionResponse.java index 1d8d9aa9b78..4531c3d9fe5 100644 --- a/plugin/src/main/java/org/opensearch/sql/plugin/model/CreateDataSourceActionResponse.java +++ 
b/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/CreateDataSourceActionResponse.java @@ -5,7 +5,7 @@ * */ -package org.opensearch.sql.plugin.model; +package org.opensearch.sql.datasources.model.transport; import java.io.IOException; import lombok.Getter; diff --git a/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/DeleteDataSourceActionRequest.java b/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/DeleteDataSourceActionRequest.java new file mode 100644 index 00000000000..6bcbd7a561c --- /dev/null +++ b/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/DeleteDataSourceActionRequest.java @@ -0,0 +1,51 @@ +/* + * + * * Copyright OpenSearch Contributors + * * SPDX-License-Identifier: Apache-2.0 + * + */ + +package org.opensearch.sql.datasources.model.transport; + +import static org.opensearch.sql.analysis.DataSourceSchemaIdentifierNameResolver.DEFAULT_DATASOURCE_NAME; + +import java.io.IOException; +import lombok.Getter; +import org.apache.commons.lang3.StringUtils; +import org.opensearch.action.ActionRequest; +import org.opensearch.action.ActionRequestValidationException; +import org.opensearch.common.io.stream.StreamInput; + +public class DeleteDataSourceActionRequest extends ActionRequest { + + @Getter + private String dataSourceName; + + /** Constructor of DeleteDataSourceActionRequest from StreamInput. 
*/ + public DeleteDataSourceActionRequest(StreamInput in) throws IOException { + super(in); + } + + public DeleteDataSourceActionRequest(String dataSourceName) { + this.dataSourceName = dataSourceName; + } + + @Override + public ActionRequestValidationException validate() { + if (StringUtils.isEmpty(this.dataSourceName)) { + ActionRequestValidationException exception = new ActionRequestValidationException(); + exception + .addValidationError("Datasource Name cannot be empty or null"); + return exception; + } else if (this.dataSourceName.equals(DEFAULT_DATASOURCE_NAME)) { + ActionRequestValidationException exception = new ActionRequestValidationException(); + exception + .addValidationError( + "Not allowed to delete datasource with name : " + DEFAULT_DATASOURCE_NAME); + return exception; + } else { + return null; + } + } + +} \ No newline at end of file diff --git a/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/DeleteDataSourceActionResponse.java b/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/DeleteDataSourceActionResponse.java new file mode 100644 index 00000000000..c6847ed9ed6 --- /dev/null +++ b/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/DeleteDataSourceActionResponse.java @@ -0,0 +1,33 @@ +/* + * + * * Copyright OpenSearch Contributors + * * SPDX-License-Identifier: Apache-2.0 + * + */ + +package org.opensearch.sql.datasources.model.transport; + +import java.io.IOException; +import lombok.Getter; +import lombok.RequiredArgsConstructor; +import org.opensearch.action.ActionResponse; +import org.opensearch.common.io.stream.StreamInput; +import org.opensearch.common.io.stream.StreamOutput; + +@RequiredArgsConstructor +public class DeleteDataSourceActionResponse extends ActionResponse { + + @Getter + private final String result; + + public DeleteDataSourceActionResponse(StreamInput in) throws IOException { + super(in); + result = in.readString(); + } + + @Override + public void 
writeTo(StreamOutput streamOutput) throws IOException { + streamOutput.writeString(result); + } + +} \ No newline at end of file diff --git a/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/GetDataSourceActionRequest.java b/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/GetDataSourceActionRequest.java new file mode 100644 index 00000000000..6cafe1972ab --- /dev/null +++ b/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/GetDataSourceActionRequest.java @@ -0,0 +1,49 @@ +/* + * + * * Copyright OpenSearch Contributors + * * SPDX-License-Identifier: Apache-2.0 + * + */ + +package org.opensearch.sql.datasources.model.transport; + +import static org.opensearch.sql.analysis.DataSourceSchemaIdentifierNameResolver.DEFAULT_DATASOURCE_NAME; + +import java.io.IOException; +import lombok.Getter; +import lombok.NoArgsConstructor; +import org.opensearch.action.ActionRequest; +import org.opensearch.action.ActionRequestValidationException; +import org.opensearch.common.io.stream.StreamInput; + +@NoArgsConstructor +public class GetDataSourceActionRequest extends ActionRequest { + + @Getter + private String dataSourceName; + + /** + * Constructor of GetDataSourceActionRequest from StreamInput. 
+ */ + public GetDataSourceActionRequest(StreamInput in) throws IOException { + super(in); + } + + public GetDataSourceActionRequest(String dataSourceName) { + this.dataSourceName = dataSourceName; + } + + @Override + public ActionRequestValidationException validate() { + if (this.dataSourceName != null && this.dataSourceName.equals(DEFAULT_DATASOURCE_NAME)) { + ActionRequestValidationException exception = new ActionRequestValidationException(); + exception + .addValidationError( + "Not allowed to fetch datasource with name : " + DEFAULT_DATASOURCE_NAME); + return exception; + } else { + return null; + } + } + +} \ No newline at end of file diff --git a/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/GetDataSourceActionResponse.java b/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/GetDataSourceActionResponse.java new file mode 100644 index 00000000000..030493cb517 --- /dev/null +++ b/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/GetDataSourceActionResponse.java @@ -0,0 +1,33 @@ +/* + * + * * Copyright OpenSearch Contributors + * * SPDX-License-Identifier: Apache-2.0 + * + */ + +package org.opensearch.sql.datasources.model.transport; + +import java.io.IOException; +import lombok.Getter; +import lombok.RequiredArgsConstructor; +import org.opensearch.action.ActionResponse; +import org.opensearch.common.io.stream.StreamInput; +import org.opensearch.common.io.stream.StreamOutput; + +@RequiredArgsConstructor +public class GetDataSourceActionResponse extends ActionResponse { + + @Getter + private final String result; + + public GetDataSourceActionResponse(StreamInput in) throws IOException { + super(in); + result = in.readString(); + } + + @Override + public void writeTo(StreamOutput streamOutput) throws IOException { + streamOutput.writeString(result); + } + +} \ No newline at end of file diff --git 
a/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/UpdateDataSourceActionRequest.java b/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/UpdateDataSourceActionRequest.java new file mode 100644 index 00000000000..fe66483eddc --- /dev/null +++ b/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/UpdateDataSourceActionRequest.java @@ -0,0 +1,47 @@ +/* + * + * * Copyright OpenSearch Contributors + * * SPDX-License-Identifier: Apache-2.0 + * + */ + +package org.opensearch.sql.datasources.model.transport; + + +import static org.opensearch.sql.analysis.DataSourceSchemaIdentifierNameResolver.DEFAULT_DATASOURCE_NAME; + +import java.io.IOException; +import lombok.Getter; +import org.opensearch.action.ActionRequest; +import org.opensearch.action.ActionRequestValidationException; +import org.opensearch.common.io.stream.StreamInput; +import org.opensearch.sql.datasource.model.DataSourceMetadata; + +public class UpdateDataSourceActionRequest + extends ActionRequest { + + @Getter + private DataSourceMetadata dataSourceMetadata; + + /** Constructor of UpdateDataSourceActionRequest from StreamInput. 
*/ + public UpdateDataSourceActionRequest(StreamInput in) throws IOException { + super(in); + } + + public UpdateDataSourceActionRequest(DataSourceMetadata dataSourceMetadata) { + this.dataSourceMetadata = dataSourceMetadata; + } + + @Override + public ActionRequestValidationException validate() { + if (this.dataSourceMetadata.getName().equals(DEFAULT_DATASOURCE_NAME)) { + ActionRequestValidationException exception = new ActionRequestValidationException(); + exception + .addValidationError( + "Not allowed to update datasource with name : " + DEFAULT_DATASOURCE_NAME); + return exception; + } else { + return null; + } + } +} \ No newline at end of file diff --git a/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/UpdateDataSourceActionResponse.java b/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/UpdateDataSourceActionResponse.java new file mode 100644 index 00000000000..faa3b1139b2 --- /dev/null +++ b/datasources/src/main/java/org/opensearch/sql/datasources/model/transport/UpdateDataSourceActionResponse.java @@ -0,0 +1,33 @@ +/* + * + * * Copyright OpenSearch Contributors + * * SPDX-License-Identifier: Apache-2.0 + * + */ + +package org.opensearch.sql.datasources.model.transport; + +import java.io.IOException; +import lombok.Getter; +import lombok.RequiredArgsConstructor; +import org.opensearch.action.ActionResponse; +import org.opensearch.common.io.stream.StreamInput; +import org.opensearch.common.io.stream.StreamOutput; + +@RequiredArgsConstructor +public class UpdateDataSourceActionResponse + extends ActionResponse { + + @Getter + private final String result; + + public UpdateDataSourceActionResponse(StreamInput in) throws IOException { + super(in); + result = in.readString(); + } + + @Override + public void writeTo(StreamOutput streamOutput) throws IOException { + streamOutput.writeString(result); + } +} \ No newline at end of file diff --git 
a/datasources/src/main/java/org/opensearch/sql/datasources/rest/RestDataSourceQueryAction.java b/datasources/src/main/java/org/opensearch/sql/datasources/rest/RestDataSourceQueryAction.java new file mode 100644 index 00000000000..c75170c3559 --- /dev/null +++ b/datasources/src/main/java/org/opensearch/sql/datasources/rest/RestDataSourceQueryAction.java @@ -0,0 +1,249 @@ +/* + * + * * Copyright OpenSearch Contributors + * * SPDX-License-Identifier: Apache-2.0 + * + */ + +package org.opensearch.sql.datasources.rest; + +import static org.opensearch.rest.RestRequest.Method.DELETE; +import static org.opensearch.rest.RestRequest.Method.GET; +import static org.opensearch.rest.RestRequest.Method.POST; +import static org.opensearch.rest.RestRequest.Method.PUT; +import static org.opensearch.rest.RestStatus.BAD_REQUEST; +import static org.opensearch.rest.RestStatus.NOT_FOUND; +import static org.opensearch.rest.RestStatus.SERVICE_UNAVAILABLE; + +import com.google.common.collect.ImmutableList; +import java.io.IOException; +import java.util.List; +import java.util.Locale; +import org.apache.logging.log4j.LogManager; +import org.apache.logging.log4j.Logger; +import org.opensearch.action.ActionListener; +import org.opensearch.client.node.NodeClient; +import org.opensearch.rest.BaseRestHandler; +import org.opensearch.rest.BytesRestResponse; +import org.opensearch.rest.RestChannel; +import org.opensearch.rest.RestRequest; +import org.opensearch.rest.RestStatus; +import org.opensearch.sql.datasource.model.DataSourceMetadata; +import org.opensearch.sql.datasources.exceptions.DataSourceNotFoundException; +import org.opensearch.sql.datasources.exceptions.ErrorMessage; +import org.opensearch.sql.datasources.model.transport.CreateDataSourceActionRequest; +import org.opensearch.sql.datasources.model.transport.CreateDataSourceActionResponse; +import org.opensearch.sql.datasources.model.transport.DeleteDataSourceActionRequest; +import 
org.opensearch.sql.datasources.model.transport.DeleteDataSourceActionResponse; +import org.opensearch.sql.datasources.model.transport.GetDataSourceActionRequest; +import org.opensearch.sql.datasources.model.transport.GetDataSourceActionResponse; +import org.opensearch.sql.datasources.model.transport.UpdateDataSourceActionRequest; +import org.opensearch.sql.datasources.model.transport.UpdateDataSourceActionResponse; +import org.opensearch.sql.datasources.transport.TransportCreateDataSourceAction; +import org.opensearch.sql.datasources.transport.TransportDeleteDataSourceAction; +import org.opensearch.sql.datasources.transport.TransportGetDataSourceAction; +import org.opensearch.sql.datasources.transport.TransportUpdateDataSourceAction; +import org.opensearch.sql.datasources.utils.Scheduler; +import org.opensearch.sql.datasources.utils.XContentParserUtils; + + +public class RestDataSourceQueryAction extends BaseRestHandler { + + public static final String DATASOURCE_ACTIONS = "datasource_actions"; + public static final String BASE_DATASOURCE_ACTION_URL = "/_plugins/_query/_datasources"; + + private static final Logger LOG = LogManager.getLogger(RestDataSourceQueryAction.class); + + @Override + public String getName() { + return DATASOURCE_ACTIONS; + } + + @Override + public List routes() { + return ImmutableList.of( + + /* + * + * Create a new datasource. 
+ * Request URL: POST + * Request body: + * Ref [org.opensearch.sql.datasources.model.transport.CreateDataSourceActionRequest] + * Response body: + * Ref [org.opensearch.sql.datasources.model.transport.CreateDataSourceActionResponse] + */ + new Route(POST, BASE_DATASOURCE_ACTION_URL), + + /* + * GET datasources + * Request URL: GET + * Request body: + * Ref [org.opensearch.sql.datasources.model.transport.GetDataSourceActionRequest] + * Response body: + * Ref [org.opensearch.sql.datasources.model.transport.GetDataSourceActionResponse] + */ + new Route(GET, String.format(Locale.ROOT, "%s/{%s}", + BASE_DATASOURCE_ACTION_URL, "dataSourceName")), + new Route(GET, BASE_DATASOURCE_ACTION_URL), + + /* + * Update an existing datasource. + * Request URL: PUT + * Request body: + * Ref + * [org.opensearch.sql.datasources.model.transport.UpdateDataSourceActionRequest] + * Response body: + * Ref + * [org.opensearch.sql.datasources.model.transport.UpdateDataSourceActionResponse] + */ + new Route(PUT, BASE_DATASOURCE_ACTION_URL), + + /* + * Delete a datasource. + * Request URL: DELETE + * Request body: Ref + * [org.opensearch.sql.datasources.model.transport.DeleteDataSourceActionRequest] + * Response body: Ref + * [org.opensearch.sql.datasources.model.transport.DeleteDataSourceActionResponse] + */ + new Route(DELETE, String.format(Locale.ROOT, "%s/{%s}", + BASE_DATASOURCE_ACTION_URL, "dataSourceName")) + ); + } + + @Override + protected RestChannelConsumer prepareRequest(RestRequest restRequest, NodeClient nodeClient) + throws IOException { + switch (restRequest.method()) { + case POST: + return executePostRequest(restRequest, nodeClient); + case GET: + return executeGetRequest(restRequest, nodeClient); + case PUT: + return executeUpdateRequest(restRequest, nodeClient); + case DELETE: + return executeDeleteRequest(restRequest, nodeClient); + default: + return restChannel + -> restChannel.sendResponse(new BytesRestResponse(RestStatus.METHOD_NOT_ALLOWED, 
String.valueOf(restRequest.method()))); + } + } + + private RestChannelConsumer executePostRequest(RestRequest restRequest, + NodeClient nodeClient) throws IOException { + + DataSourceMetadata dataSourceMetadata + = XContentParserUtils.toDataSourceMetadata(restRequest.contentParser()); + return restChannel -> Scheduler.schedule(nodeClient, + () -> nodeClient.execute(TransportCreateDataSourceAction.ACTION_TYPE, + new CreateDataSourceActionRequest(dataSourceMetadata), + new ActionListener<>() { + @Override + public void onResponse( + CreateDataSourceActionResponse createDataSourceActionResponse) { + restChannel.sendResponse( + new BytesRestResponse(RestStatus.CREATED, "application/json; charset=UTF-8", + createDataSourceActionResponse.getResult())); + } + + @Override + public void onFailure(Exception e) { + handleException(e, restChannel); + } + })); + } + + private RestChannelConsumer executeGetRequest(RestRequest restRequest, + NodeClient nodeClient) { + String dataSourceName = restRequest.param("dataSourceName"); + return restChannel -> Scheduler.schedule(nodeClient, + () -> nodeClient.execute(TransportGetDataSourceAction.ACTION_TYPE, + new GetDataSourceActionRequest(dataSourceName), + new ActionListener<>() { + @Override + public void onResponse(GetDataSourceActionResponse getDataSourceActionResponse) { + restChannel.sendResponse( + new BytesRestResponse(RestStatus.OK, "application/json; charset=UTF-8", + getDataSourceActionResponse.getResult())); + } + + @Override + public void onFailure(Exception e) { + handleException(e, restChannel); + } + })); + } + + private RestChannelConsumer executeUpdateRequest(RestRequest restRequest, + NodeClient nodeClient) throws IOException { + DataSourceMetadata dataSourceMetadata + = XContentParserUtils.toDataSourceMetadata(restRequest.contentParser()); + return restChannel -> Scheduler.schedule(nodeClient, + () -> nodeClient.execute(TransportUpdateDataSourceAction.ACTION_TYPE, + new 
UpdateDataSourceActionRequest(dataSourceMetadata), + new ActionListener<>() { + @Override + public void onResponse( + UpdateDataSourceActionResponse updateDataSourceActionResponse) { + restChannel.sendResponse( + new BytesRestResponse(RestStatus.OK, "application/json; charset=UTF-8", + updateDataSourceActionResponse.getResult())); + } + + @Override + public void onFailure(Exception e) { + handleException(e, restChannel); + } + })); + } + + private RestChannelConsumer executeDeleteRequest(RestRequest restRequest, + NodeClient nodeClient) { + + String dataSourceName = restRequest.param("dataSourceName"); + return restChannel -> Scheduler.schedule(nodeClient, + () -> nodeClient.execute(TransportDeleteDataSourceAction.ACTION_TYPE, + new DeleteDataSourceActionRequest(dataSourceName), + new ActionListener<>() { + @Override + public void onResponse( + DeleteDataSourceActionResponse deleteDataSourceActionResponse) { + restChannel.sendResponse( + new BytesRestResponse(RestStatus.NO_CONTENT, "application/json; charset=UTF-8", + deleteDataSourceActionResponse.getResult())); + } + + @Override + public void onFailure(Exception e) { + handleException(e, restChannel); + } + })); + } + + private void handleException(Exception e, RestChannel restChannel) { + if (e instanceof DataSourceNotFoundException) { + reportError(restChannel, e, NOT_FOUND); + } else { + LOG.error("Error happened during request handling", e); + if (isClientError(e)) { + reportError(restChannel, e, BAD_REQUEST); + } else { + reportError(restChannel, e, SERVICE_UNAVAILABLE); + } + } + } + + private void reportError(final RestChannel channel, final Exception e, final RestStatus status) { + channel.sendResponse( + new BytesRestResponse( + status, new ErrorMessage(e, status.getStatus()).toString())); + } + + private static boolean isClientError(Exception e) { + return e instanceof NullPointerException + // NPE is hard to differentiate but more likely caused by bad query + || e instanceof IllegalArgumentException; + 
} + +} \ No newline at end of file diff --git a/datasources/src/main/java/org/opensearch/sql/datasources/service/DataSourceLoaderCache.java b/datasources/src/main/java/org/opensearch/sql/datasources/service/DataSourceLoaderCache.java new file mode 100644 index 00000000000..3fe2954c129 --- /dev/null +++ b/datasources/src/main/java/org/opensearch/sql/datasources/service/DataSourceLoaderCache.java @@ -0,0 +1,20 @@ +package org.opensearch.sql.datasources.service; + +import org.opensearch.sql.datasource.model.DataSource; +import org.opensearch.sql.datasource.model.DataSourceMetadata; + +/** + * Interface for DataSourceLoaderCache which provides methods for + * fetch, loading and invalidating DataSource cache. + */ +public interface DataSourceLoaderCache { + + /** + * Returns cached datasource object or loads a new one if not present. + * + * @param dataSourceMetadata {@link DataSourceMetadata}. + * @return {@link DataSource} + */ + DataSource getOrLoadDataSource(DataSourceMetadata dataSourceMetadata); + +} diff --git a/datasources/src/main/java/org/opensearch/sql/datasources/service/DataSourceLoaderCacheImpl.java b/datasources/src/main/java/org/opensearch/sql/datasources/service/DataSourceLoaderCacheImpl.java new file mode 100644 index 00000000000..ba9520fc0cd --- /dev/null +++ b/datasources/src/main/java/org/opensearch/sql/datasources/service/DataSourceLoaderCacheImpl.java @@ -0,0 +1,50 @@ +package org.opensearch.sql.datasources.service; + +import com.google.common.cache.Cache; +import com.google.common.cache.CacheBuilder; +import java.util.Map; +import java.util.Set; +import java.util.concurrent.TimeUnit; +import java.util.stream.Collectors; +import org.opensearch.sql.datasource.model.DataSource; +import org.opensearch.sql.datasource.model.DataSourceMetadata; +import org.opensearch.sql.datasource.model.DataSourceType; +import org.opensearch.sql.storage.DataSourceFactory; + +/** + * Default implementation of DataSourceLoaderCache. 
This implementation + * utilizes Google Guava Cache {@link Cache} for caching DataSource objects + * against {@link DataSourceMetadata}. Expires the cache objects every 24 hrs after + * the last access. + */ +public class DataSourceLoaderCacheImpl implements DataSourceLoaderCache { + private final Map dataSourceFactoryMap; + private final Cache dataSourceCache; + + /** + * DataSourceLoaderCacheImpl constructor. + * + * @param dataSourceFactorySet set of {@link DataSourceFactory}. + */ + public DataSourceLoaderCacheImpl(Set dataSourceFactorySet) { + this.dataSourceFactoryMap = dataSourceFactorySet.stream() + .collect(Collectors.toMap(DataSourceFactory::getDataSourceType, f -> f)); + this.dataSourceCache = CacheBuilder.newBuilder() + .maximumSize(1000) + .expireAfterAccess(24, TimeUnit.HOURS) + .build(); + } + + @Override + public DataSource getOrLoadDataSource(DataSourceMetadata dataSourceMetadata) { + DataSource dataSource = this.dataSourceCache.getIfPresent(dataSourceMetadata); + if (dataSource == null) { + dataSource = this.dataSourceFactoryMap.get(dataSourceMetadata.getConnector()) + .createDataSource(dataSourceMetadata); + this.dataSourceCache.put(dataSourceMetadata, dataSource); + return dataSource; + } + return dataSource; + } + +} diff --git a/core/src/main/java/org/opensearch/sql/datasource/DataSourceMetadataStorage.java b/datasources/src/main/java/org/opensearch/sql/datasources/service/DataSourceMetadataStorage.java similarity index 95% rename from core/src/main/java/org/opensearch/sql/datasource/DataSourceMetadataStorage.java rename to datasources/src/main/java/org/opensearch/sql/datasources/service/DataSourceMetadataStorage.java index 85ffd0a1b3a..b54af3195e0 100644 --- a/core/src/main/java/org/opensearch/sql/datasource/DataSourceMetadataStorage.java +++ b/datasources/src/main/java/org/opensearch/sql/datasources/service/DataSourceMetadataStorage.java @@ -5,11 +5,10 @@ * */ -package org.opensearch.sql.datasource; +package 
org.opensearch.sql.datasources.service; import java.util.List; import java.util.Optional; -import javax.xml.crypto.Data; import org.opensearch.sql.datasource.model.DataSource; import org.opensearch.sql.datasource.model.DataSourceMetadata; diff --git a/core/src/main/java/org/opensearch/sql/datasource/DataSourceServiceImpl.java b/datasources/src/main/java/org/opensearch/sql/datasources/service/DataSourceServiceImpl.java similarity index 60% rename from core/src/main/java/org/opensearch/sql/datasource/DataSourceServiceImpl.java rename to datasources/src/main/java/org/opensearch/sql/datasources/service/DataSourceServiceImpl.java index 51bad94af8c..86afa90c2ba 100644 --- a/core/src/main/java/org/opensearch/sql/datasource/DataSourceServiceImpl.java +++ b/datasources/src/main/java/org/opensearch/sql/datasources/service/DataSourceServiceImpl.java @@ -3,24 +3,24 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.sql.datasource; +package org.opensearch.sql.datasources.service; import static org.opensearch.sql.analysis.DataSourceSchemaIdentifierNameResolver.DEFAULT_DATASOURCE_NAME; import com.google.common.base.Preconditions; import com.google.common.base.Strings; +import java.util.HashMap; import java.util.HashSet; import java.util.List; -import java.util.Map; import java.util.Objects; import java.util.Optional; import java.util.Set; -import java.util.concurrent.ConcurrentHashMap; -import java.util.stream.Collectors; import org.opensearch.sql.common.utils.StringUtils; +import org.opensearch.sql.datasource.DataSourceService; import org.opensearch.sql.datasource.model.DataSource; import org.opensearch.sql.datasource.model.DataSourceMetadata; -import org.opensearch.sql.datasource.model.DataSourceType; +import org.opensearch.sql.datasources.auth.DataSourceUserAuthorizationHelper; +import org.opensearch.sql.datasources.exceptions.DataSourceNotFoundException; import org.opensearch.sql.storage.DataSourceFactory; /** @@ -35,9 +35,7 @@ public class 
DataSourceServiceImpl implements DataSourceService { private static String DATASOURCE_NAME_REGEX = "[@*A-Za-z]+?[*a-zA-Z_\\-0-9]*"; - private final ConcurrentHashMap dataSourceMap; - - private final Map dataSourceFactoryMap; + private final DataSourceLoaderCache dataSourceLoaderCache; private final DataSourceMetadataStorage dataSourceMetadataStorage; @@ -50,36 +48,48 @@ public DataSourceServiceImpl(Set dataSourceFactories, DataSourceMetadataStorage dataSourceMetadataStorage, DataSourceUserAuthorizationHelper dataSourceUserAuthorizationHelper) { - dataSourceFactoryMap = - dataSourceFactories.stream() - .collect(Collectors.toMap(DataSourceFactory::getDataSourceType, f -> f)); - dataSourceMap = new ConcurrentHashMap<>(); this.dataSourceMetadataStorage = dataSourceMetadataStorage; this.dataSourceUserAuthorizationHelper = dataSourceUserAuthorizationHelper; + this.dataSourceLoaderCache = new DataSourceLoaderCacheImpl(dataSourceFactories); } @Override - public Set getDataSourceMetadataSet() { + public Set getDataSourceMetadata(boolean isDefaultDataSourceRequired) { List dataSourceMetadataList = this.dataSourceMetadataStorage.getDataSourceMetadata(); Set dataSourceMetadataSet = new HashSet<>(dataSourceMetadataList); - dataSourceMetadataSet.add(DataSourceMetadata.defaultOpenSearchDataSourceMetadata()); + if (isDefaultDataSourceRequired) { + dataSourceMetadataSet.add(DataSourceMetadata.defaultOpenSearchDataSourceMetadata()); + } + removeAuthInfo(dataSourceMetadataSet); return dataSourceMetadataSet; } + @Override + public DataSourceMetadata getDataSourceMetadata(String datasourceName) { + Optional dataSourceMetadataOptional + = getDataSourceMetadataFromName(datasourceName); + if (dataSourceMetadataOptional.isEmpty()) { + throw new IllegalArgumentException("DataSource with name: " + datasourceName + + " doesn't exist."); + } + removeAuthInfo(dataSourceMetadataOptional.get()); + return dataSourceMetadataOptional.get(); + } + @Override public DataSource getDataSource(String 
dataSourceName) { Optional - dataSourceMetadataOptional = getDataSourceMetadata(dataSourceName); + dataSourceMetadataOptional = getDataSourceMetadataFromName(dataSourceName); if (dataSourceMetadataOptional.isEmpty()) { - throw new IllegalArgumentException( + throw new DataSourceNotFoundException( String.format("DataSource with name %s doesn't exist.", dataSourceName)); } else { DataSourceMetadata dataSourceMetadata = dataSourceMetadataOptional.get(); this.dataSourceUserAuthorizationHelper .authorizeDataSource(dataSourceMetadata); - return getDataSourceFromMetadata(dataSourceMetadata); + return dataSourceLoaderCache.getOrLoadDataSource(dataSourceMetadata); } } @@ -87,20 +97,31 @@ public DataSource getDataSource(String dataSourceName) { public void createDataSource(DataSourceMetadata metadata) { validateDataSourceMetaData(metadata); if (!metadata.getName().equals(DEFAULT_DATASOURCE_NAME)) { + this.dataSourceLoaderCache.getOrLoadDataSource(metadata); this.dataSourceMetadataStorage.createDataSourceMetadata(metadata); } - dataSourceMap.put(metadata, - dataSourceFactoryMap.get(metadata.getConnector()).createDataSource(metadata)); } @Override public void updateDataSource(DataSourceMetadata dataSourceMetadata) { - throw new UnsupportedOperationException("will be supported in future"); + validateDataSourceMetaData(dataSourceMetadata); + if (!dataSourceMetadata.getName().equals(DEFAULT_DATASOURCE_NAME)) { + this.dataSourceLoaderCache.getOrLoadDataSource(dataSourceMetadata); + this.dataSourceMetadataStorage.updateDataSourceMetadata(dataSourceMetadata); + } else { + throw new UnsupportedOperationException( + "Not allowed to update default datasource :" + DEFAULT_DATASOURCE_NAME); + } } @Override public void deleteDataSource(String dataSourceName) { - throw new UnsupportedOperationException("will be supported in future"); + if (dataSourceName.equals(DEFAULT_DATASOURCE_NAME)) { + throw new UnsupportedOperationException( + "Not allowed to delete default datasource :" + 
DEFAULT_DATASOURCE_NAME); + } else { + this.dataSourceMetadataStorage.deleteDataSourceMetadata(dataSourceName); + } } @Override @@ -130,7 +151,7 @@ private void validateDataSourceMetaData(DataSourceMetadata metadata) { + " Properties are required parameters."); } - private Optional getDataSourceMetadata(String dataSourceName) { + private Optional getDataSourceMetadataFromName(String dataSourceName) { if (dataSourceName.equals(DEFAULT_DATASOURCE_NAME)) { return Optional.of(DataSourceMetadata.defaultOpenSearchDataSourceMetadata()); } else { @@ -138,19 +159,19 @@ private Optional getDataSourceMetadata(String dataSourceName } } - private DataSource getDataSourceFromMetadata(DataSourceMetadata dataSourceMetadata) { - if (!dataSourceMap.containsKey(dataSourceMetadata)) { - clearDataSource(dataSourceMetadata); - dataSourceMap.put(dataSourceMetadata, - dataSourceFactoryMap.get(dataSourceMetadata.getConnector()) - .createDataSource(dataSourceMetadata)); - } - return dataSourceMap.get(dataSourceMetadata); - } - private void clearDataSource(DataSourceMetadata dataSourceMetadata) { - dataSourceMap.entrySet() - .removeIf(entry -> entry.getKey().getName().equals(dataSourceMetadata.getName())); + // It is advised to avoid sending any kind credential + // info in api response from security point of view. 
+ private void removeAuthInfo(Set dataSourceMetadataSet) { + dataSourceMetadataSet.forEach(this::removeAuthInfo); } + private void removeAuthInfo(DataSourceMetadata dataSourceMetadata) { + HashMap safeProperties + = new HashMap<>(dataSourceMetadata.getProperties()); + safeProperties + .entrySet() + .removeIf(entry -> entry.getKey().contains("auth")); + dataSourceMetadata.setProperties(safeProperties); + } } diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/datasource/DataSourceSettings.java b/datasources/src/main/java/org/opensearch/sql/datasources/settings/DataSourceSettings.java similarity index 92% rename from plugin/src/main/java/org/opensearch/sql/plugin/datasource/DataSourceSettings.java rename to datasources/src/main/java/org/opensearch/sql/datasources/settings/DataSourceSettings.java index a451ad30be9..0dc18f409db 100644 --- a/plugin/src/main/java/org/opensearch/sql/plugin/datasource/DataSourceSettings.java +++ b/datasources/src/main/java/org/opensearch/sql/datasources/settings/DataSourceSettings.java @@ -3,7 +3,7 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.sql.plugin.datasource; +package org.opensearch.sql.datasources.settings; import java.io.InputStream; import org.opensearch.common.settings.SecureSetting; diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/datasource/OpenSearchDataSourceMetadataStorage.java b/datasources/src/main/java/org/opensearch/sql/datasources/storage/OpenSearchDataSourceMetadataStorage.java similarity index 63% rename from plugin/src/main/java/org/opensearch/sql/plugin/datasource/OpenSearchDataSourceMetadataStorage.java rename to datasources/src/main/java/org/opensearch/sql/datasources/storage/OpenSearchDataSourceMetadataStorage.java index b3c433f7e6a..f76e1ba9dca 100644 --- a/plugin/src/main/java/org/opensearch/sql/plugin/datasource/OpenSearchDataSourceMetadataStorage.java +++ b/datasources/src/main/java/org/opensearch/sql/datasources/storage/OpenSearchDataSourceMetadataStorage.java @@ 
-5,13 +5,12 @@ * */ -package org.opensearch.sql.plugin.datasource; +package org.opensearch.sql.datasources.storage; import java.io.IOException; import java.io.InputStream; import java.nio.charset.StandardCharsets; import java.util.ArrayList; -import java.util.Arrays; import java.util.Collections; import java.util.List; import java.util.Map; @@ -20,31 +19,42 @@ import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; import org.opensearch.action.ActionFuture; +import org.opensearch.action.DocWriteRequest; import org.opensearch.action.DocWriteResponse; import org.opensearch.action.admin.indices.create.CreateIndexRequest; import org.opensearch.action.admin.indices.create.CreateIndexResponse; +import org.opensearch.action.delete.DeleteRequest; +import org.opensearch.action.delete.DeleteResponse; import org.opensearch.action.index.IndexRequest; import org.opensearch.action.index.IndexResponse; import org.opensearch.action.search.SearchRequest; import org.opensearch.action.search.SearchResponse; +import org.opensearch.action.support.WriteRequest; +import org.opensearch.action.update.UpdateRequest; +import org.opensearch.action.update.UpdateResponse; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.util.concurrent.ThreadContext; import org.opensearch.common.xcontent.XContentType; +import org.opensearch.index.engine.DocumentMissingException; +import org.opensearch.index.engine.VersionConflictEngineException; import org.opensearch.index.query.QueryBuilder; import org.opensearch.index.query.QueryBuilders; import org.opensearch.search.SearchHit; import org.opensearch.search.builder.SearchSourceBuilder; -import org.opensearch.sql.common.encryptor.Encryptor; -import org.opensearch.sql.datasource.DataSourceMetadataStorage; import org.opensearch.sql.datasource.model.DataSourceMetadata; -import org.opensearch.sql.datasource.model.auth.AuthenticationType; -import 
org.opensearch.sql.plugin.utils.XContentParserUtils; +import org.opensearch.sql.datasources.auth.AuthenticationType; +import org.opensearch.sql.datasources.encryptor.Encryptor; +import org.opensearch.sql.datasources.exceptions.DataSourceNotFoundException; +import org.opensearch.sql.datasources.service.DataSourceMetadataStorage; +import org.opensearch.sql.datasources.utils.XContentParserUtils; public class OpenSearchDataSourceMetadataStorage implements DataSourceMetadataStorage { public static final String DATASOURCE_INDEX_NAME = ".ql-datasources"; private static final String DATASOURCE_INDEX_MAPPING_FILE_NAME = "datasources-index-mapping.yml"; + + private static final Integer DATASOURCE_QUERY_RESULT_SIZE = 10000; private static final String DATASOURCE_INDEX_SETTINGS_FILE_NAME = "datasources-index-settings.yml"; private static final Logger LOG = LogManager.getLogger(); @@ -81,6 +91,7 @@ public List getDataSourceMetadata() { public Optional getDataSourceMetadata(String datasourceName) { if (!this.clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) { createDataSourcesIndex(); + return Optional.empty(); } return searchInDataSourcesIndex(QueryBuilders.termQuery("name", datasourceName)) .stream() @@ -96,28 +107,77 @@ public void createDataSourceMetadata(DataSourceMetadata dataSourceMetadata) { } IndexRequest indexRequest = new IndexRequest(DATASOURCE_INDEX_NAME); indexRequest.id(dataSourceMetadata.getName()); + indexRequest.opType(DocWriteRequest.OpType.CREATE); + indexRequest.setRefreshPolicy(WriteRequest.RefreshPolicy.IMMEDIATE); ActionFuture indexResponseActionFuture; + IndexResponse indexResponse; try (ThreadContext.StoredContext storedContext = client.threadPool().getThreadContext() .stashContext()) { indexRequest.source(XContentParserUtils.convertToXContent(dataSourceMetadata)); indexResponseActionFuture = client.index(indexRequest); + indexResponse = indexResponseActionFuture.actionGet(); + } catch (VersionConflictEngineException exception) { + 
throw new IllegalArgumentException("A datasource already exists with name: " + + dataSourceMetadata.getName()); } catch (Exception e) { throw new RuntimeException(e); } - IndexResponse indexResponse = indexResponseActionFuture.actionGet(); + if (indexResponse.getResult().equals(DocWriteResponse.Result.CREATED)) { LOG.debug("DatasourceMetadata : {} successfully created", dataSourceMetadata.getName()); + } else { + throw new RuntimeException("Saving dataSource metadata information failed with result : " + + indexResponse.getResult().getLowercase()); } } @Override public void updateDataSourceMetadata(DataSourceMetadata dataSourceMetadata) { - throw new UnsupportedOperationException("will be supported in future."); + encryptDecryptAuthenticationData(dataSourceMetadata, true); + UpdateRequest updateRequest + = new UpdateRequest(DATASOURCE_INDEX_NAME, dataSourceMetadata.getName()); + UpdateResponse updateResponse; + try (ThreadContext.StoredContext storedContext = client.threadPool().getThreadContext() + .stashContext()) { + updateRequest.doc(XContentParserUtils.convertToXContent(dataSourceMetadata)); + updateRequest.setRefreshPolicy(WriteRequest.RefreshPolicy.IMMEDIATE); + ActionFuture updateResponseActionFuture + = client.update(updateRequest); + updateResponse = updateResponseActionFuture.actionGet(); + } catch (DocumentMissingException exception) { + throw new DataSourceNotFoundException("Datasource with name: " + + dataSourceMetadata.getName() + " doesn't exist"); + } catch (Exception e) { + throw new RuntimeException(e); + } + + if (updateResponse.getResult().equals(DocWriteResponse.Result.UPDATED)) { + LOG.debug("DatasourceMetadata : {} successfully updated", dataSourceMetadata.getName()); + } else { + throw new RuntimeException("Saving dataSource metadata information failed with result : " + + updateResponse.getResult().getLowercase()); + } } @Override public void deleteDataSourceMetadata(String datasourceName) { - throw new UnsupportedOperationException("will be 
supported in future."); + DeleteRequest deleteRequest = new DeleteRequest(DATASOURCE_INDEX_NAME); + deleteRequest.id(datasourceName); + ActionFuture deleteResponseActionFuture; + try (ThreadContext.StoredContext storedContext = client.threadPool().getThreadContext() + .stashContext()) { + deleteResponseActionFuture = client.delete(deleteRequest); + } + DeleteResponse deleteResponse = deleteResponseActionFuture.actionGet(); + if (deleteResponse.getResult().equals(DocWriteResponse.Result.DELETED)) { + LOG.debug("DatasourceMetadata : {} successfully deleted", datasourceName); + } else if (deleteResponse.getResult().equals(DocWriteResponse.Result.NOT_FOUND)) { + throw new DataSourceNotFoundException("Datasource with name: " + + datasourceName + " doesn't exist"); + } else { + throw new RuntimeException("Deleting dataSource metadata information failed with result : " + + deleteResponse.getResult().getLowercase()); + } } private void createDataSourcesIndex() { @@ -127,13 +187,12 @@ private void createDataSourcesIndex() { InputStream settingsFileStream = OpenSearchDataSourceMetadataStorage.class.getClassLoader() .getResourceAsStream(DATASOURCE_INDEX_SETTINGS_FILE_NAME); CreateIndexRequest createIndexRequest = new CreateIndexRequest(DATASOURCE_INDEX_NAME); - createIndexRequest - .mapping(IOUtils.toString(mappingFileStream, StandardCharsets.UTF_8), + createIndexRequest.mapping(IOUtils.toString(mappingFileStream, StandardCharsets.UTF_8), XContentType.YAML) .settings(IOUtils.toString(settingsFileStream, StandardCharsets.UTF_8), XContentType.YAML); ActionFuture createIndexResponseActionFuture; - try (ThreadContext.StoredContext storedContext = client.threadPool().getThreadContext() + try (ThreadContext.StoredContext ignored = client.threadPool().getThreadContext() .stashContext()) { createIndexResponseActionFuture = client.admin().indices().create(createIndexRequest); } @@ -141,12 +200,12 @@ private void createDataSourcesIndex() { if (createIndexResponse.isAcknowledged()) { 
LOG.info("Index: {} creation Acknowledged", DATASOURCE_INDEX_NAME); } else { - throw new IllegalStateException( - String.format("Index: %s creation failed", DATASOURCE_INDEX_NAME)); + throw new RuntimeException( + "Index creation is not acknowledged."); } } catch (Throwable e) { throw new RuntimeException( - "Internal server error while creating" + DATASOURCE_INDEX_NAME + " index" + "Internal server error while creating" + DATASOURCE_INDEX_NAME + " index:: " + e.getMessage()); } } @@ -156,20 +215,21 @@ private List searchInDataSourcesIndex(QueryBuilder query) { searchRequest.indices(DATASOURCE_INDEX_NAME); SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder(); searchSourceBuilder.query(query); + searchSourceBuilder.size(DATASOURCE_QUERY_RESULT_SIZE); searchRequest.source(searchSourceBuilder); ActionFuture searchResponseActionFuture; - try (ThreadContext.StoredContext storedContext = client.threadPool().getThreadContext() + try (ThreadContext.StoredContext ignored = client.threadPool().getThreadContext() .stashContext()) { searchResponseActionFuture = client.search(searchRequest); } SearchResponse searchResponse = searchResponseActionFuture.actionGet(); if (searchResponse.status().getStatus() != 200) { - throw new RuntimeException( - "Internal server error while fetching datasource metadata information"); + throw new RuntimeException("Fetching dataSource metadata information failed with status : " + + searchResponse.status()); } else { List list = new ArrayList<>(); - for (SearchHit documentFields : searchResponse.getHits().getHits()) { - String sourceAsString = documentFields.getSourceAsString(); + for (SearchHit searchHit : searchResponse.getHits().getHits()) { + String sourceAsString = searchHit.getSourceAsString(); DataSourceMetadata dataSourceMetadata; try { dataSourceMetadata = XContentParserUtils.toDataSourceMetadata(sourceAsString); @@ -182,6 +242,7 @@ private List searchInDataSourcesIndex(QueryBuilder query) { } } + 
@SuppressWarnings("missingswitchdefault") private DataSourceMetadata encryptDecryptAuthenticationData(DataSourceMetadata dataSourceMetadata, Boolean isEncryption) { Map propertiesMap = dataSourceMetadata.getProperties(); @@ -198,8 +259,6 @@ private DataSourceMetadata encryptDecryptAuthenticationData(DataSourceMetadata d case AWSSIGV4AUTH: handleSigV4PropertiesEncryptionDecryption(propertiesMap, isEncryption); break; - default: - break; } } return dataSourceMetadata; @@ -207,14 +266,16 @@ private DataSourceMetadata encryptDecryptAuthenticationData(DataSourceMetadata d private void handleBasicAuthPropertiesEncryptionDecryption(Map propertiesMap, Boolean isEncryption) { - Optional usernameKey = propertiesMap.keySet().stream() + ArrayList list = new ArrayList<>(); + propertiesMap.keySet().stream() .filter(s -> s.endsWith("auth.username")) - .findFirst(); - Optional passwordKey = propertiesMap.keySet().stream() + .findFirst() + .ifPresent(list::add); + propertiesMap.keySet().stream() .filter(s -> s.endsWith("auth.password")) - .findFirst(); - encryptOrDecrypt(propertiesMap, isEncryption, - Arrays.asList(usernameKey.get(), passwordKey.get())); + .findFirst() + .ifPresent(list::add); + encryptOrDecrypt(propertiesMap, isEncryption, list); } private void encryptOrDecrypt(Map propertiesMap, Boolean isEncryption, @@ -232,13 +293,16 @@ private void encryptOrDecrypt(Map propertiesMap, Boolean isEncry private void handleSigV4PropertiesEncryptionDecryption(Map propertiesMap, Boolean isEncryption) { - Optional accessKey = propertiesMap.keySet().stream() + ArrayList list = new ArrayList<>(); + propertiesMap.keySet().stream() .filter(s -> s.endsWith("auth.access_key")) - .findFirst(); - Optional secretKey = propertiesMap.keySet().stream() + .findFirst() + .ifPresent(list::add); + propertiesMap.keySet().stream() .filter(s -> s.endsWith("auth.secret_key")) - .findFirst(); - encryptOrDecrypt(propertiesMap, isEncryption, Arrays.asList(accessKey.get(), secretKey.get())); + .findFirst() + 
.ifPresent(list::add); + encryptOrDecrypt(propertiesMap, isEncryption, list); } } \ No newline at end of file diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/transport/datasource/TransportCreateDataSourceAction.java b/datasources/src/main/java/org/opensearch/sql/datasources/transport/TransportCreateDataSourceAction.java similarity index 57% rename from plugin/src/main/java/org/opensearch/sql/plugin/transport/datasource/TransportCreateDataSourceAction.java rename to datasources/src/main/java/org/opensearch/sql/datasources/transport/TransportCreateDataSourceAction.java index 006837c2560..4d8c51fac70 100644 --- a/plugin/src/main/java/org/opensearch/sql/plugin/transport/datasource/TransportCreateDataSourceAction.java +++ b/datasources/src/main/java/org/opensearch/sql/datasources/transport/TransportCreateDataSourceAction.java @@ -5,69 +5,56 @@ * */ -package org.opensearch.sql.plugin.transport.datasource; +package org.opensearch.sql.datasources.transport; -import org.apache.logging.log4j.LogManager; -import org.apache.logging.log4j.Logger; import org.opensearch.action.ActionListener; import org.opensearch.action.ActionType; import org.opensearch.action.support.ActionFilters; import org.opensearch.action.support.HandledTransportAction; -import org.opensearch.client.Client; -import org.opensearch.client.node.NodeClient; import org.opensearch.common.inject.Inject; import org.opensearch.sql.datasource.DataSourceService; -import org.opensearch.sql.datasource.DataSourceServiceImpl; import org.opensearch.sql.datasource.model.DataSourceMetadata; -import org.opensearch.sql.legacy.metrics.MetricName; -import org.opensearch.sql.legacy.metrics.Metrics; -import org.opensearch.sql.plugin.model.CreateDataSourceActionRequest; -import org.opensearch.sql.plugin.model.CreateDataSourceActionResponse; +import org.opensearch.sql.datasources.model.transport.CreateDataSourceActionRequest; +import org.opensearch.sql.datasources.model.transport.CreateDataSourceActionResponse; +import 
org.opensearch.sql.datasources.service.DataSourceServiceImpl; import org.opensearch.tasks.Task; import org.opensearch.transport.TransportService; public class TransportCreateDataSourceAction extends HandledTransportAction { - - private static final Logger LOG = LogManager.getLogger(); - public static final String NAME = "cluster:admin/opensearch/datasources/create"; + public static final String NAME = "cluster:admin/opensearch/ql/datasources/create"; public static final ActionType ACTION_TYPE = new ActionType<>(NAME, CreateDataSourceActionResponse::new); private DataSourceService dataSourceService; - private Client client; /** * TransportCreateDataSourceAction action for creating datasource. * * @param transportService transportService. * @param actionFilters actionFilters. - * @param client client. * @param dataSourceService dataSourceService. */ @Inject public TransportCreateDataSourceAction(TransportService transportService, ActionFilters actionFilters, - NodeClient client, DataSourceServiceImpl dataSourceService) { super(TransportCreateDataSourceAction.NAME, transportService, actionFilters, CreateDataSourceActionRequest::new); this.dataSourceService = dataSourceService; - this.client = client; } @Override protected void doExecute(Task task, CreateDataSourceActionRequest request, ActionListener actionListener) { - - Metrics.getInstance().getNumericalMetric(MetricName.DATASOURCE_REQ_COUNT).increment(); - actionListener.onResponse(execute(request.getDataSourceMetadata())); - } - - private CreateDataSourceActionResponse execute(DataSourceMetadata dataSourceMetadata) { - dataSourceService.createDataSource(dataSourceMetadata); - return new CreateDataSourceActionResponse("Created DataSource with name " - + dataSourceMetadata.getName()); + try { + DataSourceMetadata dataSourceMetadata = request.getDataSourceMetadata(); + dataSourceService.createDataSource(dataSourceMetadata); + actionListener.onResponse(new CreateDataSourceActionResponse("Created DataSource with name " 
+ + dataSourceMetadata.getName())); + } catch (Exception e) { + actionListener.onFailure(e); + } } } \ No newline at end of file diff --git a/datasources/src/main/java/org/opensearch/sql/datasources/transport/TransportDeleteDataSourceAction.java b/datasources/src/main/java/org/opensearch/sql/datasources/transport/TransportDeleteDataSourceAction.java new file mode 100644 index 00000000000..1d109ca7fc7 --- /dev/null +++ b/datasources/src/main/java/org/opensearch/sql/datasources/transport/TransportDeleteDataSourceAction.java @@ -0,0 +1,59 @@ +/* + * + * * Copyright OpenSearch Contributors + * * SPDX-License-Identifier: Apache-2.0 + * + */ + +package org.opensearch.sql.datasources.transport; + +import org.opensearch.action.ActionListener; +import org.opensearch.action.ActionType; +import org.opensearch.action.support.ActionFilters; +import org.opensearch.action.support.HandledTransportAction; +import org.opensearch.common.inject.Inject; +import org.opensearch.sql.datasource.DataSourceService; +import org.opensearch.sql.datasources.model.transport.DeleteDataSourceActionRequest; +import org.opensearch.sql.datasources.model.transport.DeleteDataSourceActionResponse; +import org.opensearch.sql.datasources.service.DataSourceServiceImpl; +import org.opensearch.tasks.Task; +import org.opensearch.transport.TransportService; + +public class TransportDeleteDataSourceAction + extends HandledTransportAction { + + public static final String NAME = "cluster:admin/opensearch/ql/datasources/delete"; + public static final ActionType + ACTION_TYPE = new ActionType<>(NAME, DeleteDataSourceActionResponse::new); + + private DataSourceService dataSourceService; + + /** + * TransportDeleteDataSourceAction action for deleting datasource. + * + * @param transportService transportService. + * @param actionFilters actionFilters. + * @param dataSourceService dataSourceService. 
+ */ + @Inject + public TransportDeleteDataSourceAction(TransportService transportService, + ActionFilters actionFilters, + DataSourceServiceImpl dataSourceService) { + super(TransportDeleteDataSourceAction.NAME, transportService, actionFilters, + DeleteDataSourceActionRequest::new); + this.dataSourceService = dataSourceService; + } + + @Override + protected void doExecute(Task task, DeleteDataSourceActionRequest request, + ActionListener actionListener) { + try { + dataSourceService.deleteDataSource(request.getDataSourceName()); + actionListener.onResponse(new DeleteDataSourceActionResponse("Deleted DataSource with name " + + request.getDataSourceName())); + } catch (Exception e) { + actionListener.onFailure(e); + } + } + +} \ No newline at end of file diff --git a/datasources/src/main/java/org/opensearch/sql/datasources/transport/TransportGetDataSourceAction.java b/datasources/src/main/java/org/opensearch/sql/datasources/transport/TransportGetDataSourceAction.java new file mode 100644 index 00000000000..33d08f7cd29 --- /dev/null +++ b/datasources/src/main/java/org/opensearch/sql/datasources/transport/TransportGetDataSourceAction.java @@ -0,0 +1,95 @@ +/* + * + * * Copyright OpenSearch Contributors + * * SPDX-License-Identifier: Apache-2.0 + * + */ + +package org.opensearch.sql.datasources.transport; + +import java.util.Set; +import org.opensearch.action.ActionListener; +import org.opensearch.action.ActionType; +import org.opensearch.action.support.ActionFilters; +import org.opensearch.action.support.HandledTransportAction; +import org.opensearch.common.inject.Inject; +import org.opensearch.sql.datasource.DataSourceService; +import org.opensearch.sql.datasource.model.DataSourceMetadata; +import org.opensearch.sql.datasources.model.transport.GetDataSourceActionRequest; +import org.opensearch.sql.datasources.model.transport.GetDataSourceActionResponse; +import org.opensearch.sql.datasources.service.DataSourceServiceImpl; +import 
org.opensearch.sql.protocol.response.format.JsonResponseFormatter; +import org.opensearch.tasks.Task; +import org.opensearch.transport.TransportService; + +public class TransportGetDataSourceAction + extends HandledTransportAction<GetDataSourceActionRequest, GetDataSourceActionResponse> { + + public static final String NAME = "cluster:admin/opensearch/ql/datasources/read"; + public static final ActionType<GetDataSourceActionResponse> + ACTION_TYPE = new ActionType<>(NAME, GetDataSourceActionResponse::new); + + private DataSourceService dataSourceService; + + /** + * TransportGetDataSourceAction action for getting datasource. + * + * @param transportService transportService. + * @param actionFilters actionFilters. + * @param dataSourceService dataSourceService. + */ + @Inject + public TransportGetDataSourceAction(TransportService transportService, + ActionFilters actionFilters, + DataSourceServiceImpl dataSourceService) { + super(TransportGetDataSourceAction.NAME, transportService, actionFilters, + GetDataSourceActionRequest::new); + this.dataSourceService = dataSourceService; + } + + @Override + protected void doExecute(Task task, GetDataSourceActionRequest request, + ActionListener<GetDataSourceActionResponse> actionListener) { + try { + String responseContent; + if (request.getDataSourceName() == null) { + responseContent = handleGetAllDataSourcesRequest(); + + } else { + responseContent = handleSingleDataSourceRequest(request.getDataSourceName()); + } + actionListener.onResponse(new GetDataSourceActionResponse(responseContent)); + } catch (Exception e) { + actionListener.onFailure(e); + } + } + + private String handleGetAllDataSourcesRequest() { + String responseContent; + Set<DataSourceMetadata> dataSourceMetadataSet = + dataSourceService.getDataSourceMetadata(false); + responseContent = new JsonResponseFormatter<Set<DataSourceMetadata>>( + JsonResponseFormatter.Style.PRETTY) { + @Override + protected Object buildJsonObject(Set<DataSourceMetadata> response) { + return response; + } + }.format(dataSourceMetadataSet); + return responseContent; + } + + private String handleSingleDataSourceRequest(String datasourceName) { + String
responseContent; + DataSourceMetadata dataSourceMetadata + = dataSourceService + .getDataSourceMetadata(datasourceName); + responseContent = new JsonResponseFormatter<DataSourceMetadata>( + JsonResponseFormatter.Style.PRETTY) { + @Override + protected Object buildJsonObject(DataSourceMetadata response) { + return response; + } + }.format(dataSourceMetadata); + return responseContent; + } +} \ No newline at end of file diff --git a/datasources/src/main/java/org/opensearch/sql/datasources/transport/TransportUpdateDataSourceAction.java b/datasources/src/main/java/org/opensearch/sql/datasources/transport/TransportUpdateDataSourceAction.java new file mode 100644 index 00000000000..4aece69e5b2 --- /dev/null +++ b/datasources/src/main/java/org/opensearch/sql/datasources/transport/TransportUpdateDataSourceAction.java @@ -0,0 +1,59 @@ +/* + * + * * Copyright OpenSearch Contributors + * * SPDX-License-Identifier: Apache-2.0 + * + */ + +package org.opensearch.sql.datasources.transport; + +import org.opensearch.action.ActionListener; +import org.opensearch.action.ActionType; +import org.opensearch.action.support.ActionFilters; +import org.opensearch.action.support.HandledTransportAction; +import org.opensearch.common.inject.Inject; +import org.opensearch.sql.datasource.DataSourceService; +import org.opensearch.sql.datasources.model.transport.UpdateDataSourceActionRequest; +import org.opensearch.sql.datasources.model.transport.UpdateDataSourceActionResponse; +import org.opensearch.sql.datasources.service.DataSourceServiceImpl; +import org.opensearch.tasks.Task; +import org.opensearch.transport.TransportService; + +public class TransportUpdateDataSourceAction + extends HandledTransportAction<UpdateDataSourceActionRequest, UpdateDataSourceActionResponse> { + + public static final String NAME = "cluster:admin/opensearch/ql/datasources/update"; + public static final ActionType<UpdateDataSourceActionResponse> + ACTION_TYPE = new ActionType<>(NAME, UpdateDataSourceActionResponse::new); + + private DataSourceService dataSourceService; + + /** + * TransportUpdateDataSourceAction action for updating
datasource. + * + * @param transportService transportService. + * @param actionFilters actionFilters. + * @param dataSourceService dataSourceService. + */ + @Inject + public TransportUpdateDataSourceAction(TransportService transportService, + ActionFilters actionFilters, + DataSourceServiceImpl dataSourceService) { + super(TransportUpdateDataSourceAction.NAME, transportService, actionFilters, + UpdateDataSourceActionRequest::new); + this.dataSourceService = dataSourceService; + } + + @Override + protected void doExecute(Task task, UpdateDataSourceActionRequest request, + ActionListener<UpdateDataSourceActionResponse> actionListener) { + try { + dataSourceService.updateDataSource(request.getDataSourceMetadata()); + actionListener.onResponse(new UpdateDataSourceActionResponse("Updated DataSource with name " + + request.getDataSourceMetadata().getName())); + } catch (Exception e) { + actionListener.onFailure(e); + } + } + +} \ No newline at end of file diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/utils/Scheduler.java b/datasources/src/main/java/org/opensearch/sql/datasources/utils/Scheduler.java similarity index 95% rename from plugin/src/main/java/org/opensearch/sql/plugin/utils/Scheduler.java rename to datasources/src/main/java/org/opensearch/sql/datasources/utils/Scheduler.java index a4a87b1b125..0bc597ed4ff 100644 --- a/plugin/src/main/java/org/opensearch/sql/plugin/utils/Scheduler.java +++ b/datasources/src/main/java/org/opensearch/sql/datasources/utils/Scheduler.java @@ -3,7 +3,7 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.sql.plugin.utils; +package org.opensearch.sql.datasources.utils; import java.util.Map; import lombok.experimental.UtilityClass; diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/utils/XContentParserUtils.java b/datasources/src/main/java/org/opensearch/sql/datasources/utils/XContentParserUtils.java similarity index 94% rename from plugin/src/main/java/org/opensearch/sql/plugin/utils/XContentParserUtils.java rename to
datasources/src/main/java/org/opensearch/sql/datasources/utils/XContentParserUtils.java index cc1310ffc3a..a8643a35f3a 100644 --- a/plugin/src/main/java/org/opensearch/sql/plugin/utils/XContentParserUtils.java +++ b/datasources/src/main/java/org/opensearch/sql/datasources/utils/XContentParserUtils.java @@ -3,7 +3,7 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.sql.plugin.utils; +package org.opensearch.sql.datasources.utils; import static org.opensearch.common.xcontent.XContentParserUtils.ensureExpectedToken; @@ -12,6 +12,7 @@ import java.util.HashMap; import java.util.List; import java.util.Map; +import lombok.experimental.UtilityClass; import org.opensearch.common.xcontent.XContentFactory; import org.opensearch.common.xcontent.XContentType; import org.opensearch.core.xcontent.DeprecationHandler; @@ -21,6 +22,10 @@ import org.opensearch.sql.datasource.model.DataSourceMetadata; import org.opensearch.sql.datasource.model.DataSourceType; +/** + * Utility class to serialize and deserialize objects in XContent.
+ */ +@UtilityClass public class XContentParserUtils { public static final String NAME_FIELD = "name"; public static final String CONNECTOR_FIELD = "connector"; @@ -69,7 +74,7 @@ public static DataSourceMetadata toDataSourceMetadata(XContentParser parser) thr } } if (name == null || connector == null) { - throw new IllegalArgumentException("Missing required fields"); + throw new IllegalArgumentException("name and connector are required fields."); } return new DataSourceMetadata(name, connector, allowedRoles, properties); } diff --git a/plugin/src/main/resources/datasources-index-mapping.yml b/datasources/src/main/resources/datasources-index-mapping.yml similarity index 100% rename from plugin/src/main/resources/datasources-index-mapping.yml rename to datasources/src/main/resources/datasources-index-mapping.yml diff --git a/plugin/src/main/resources/datasources-index-settings.yml b/datasources/src/main/resources/datasources-index-settings.yml similarity index 100% rename from plugin/src/main/resources/datasources-index-settings.yml rename to datasources/src/main/resources/datasources-index-settings.yml diff --git a/core/src/test/java/org/opensearch/sql/datasource/model/auth/AuthenticationTypeTest.java b/datasources/src/test/java/org/opensearch/sql/datasources/auth/AuthenticationTypeTest.java similarity index 93% rename from core/src/test/java/org/opensearch/sql/datasource/model/auth/AuthenticationTypeTest.java rename to datasources/src/test/java/org/opensearch/sql/datasources/auth/AuthenticationTypeTest.java index f9e4f3ce591..23bb4688e12 100644 --- a/core/src/test/java/org/opensearch/sql/datasource/model/auth/AuthenticationTypeTest.java +++ b/datasources/src/test/java/org/opensearch/sql/datasources/auth/AuthenticationTypeTest.java @@ -3,7 +3,7 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.sql.datasource.model.auth; +package org.opensearch.sql.datasources.auth; import static org.junit.jupiter.api.Assertions.assertEquals; diff --git 
a/plugin/src/test/java/org/opensearch/sql/plugin/datasource/DataSourceUserAuthorizationHelperImplTest.java b/datasources/src/test/java/org/opensearch/sql/datasources/auth/DataSourceUserAuthorizationHelperImplTest.java similarity index 61% rename from plugin/src/test/java/org/opensearch/sql/plugin/datasource/DataSourceUserAuthorizationHelperImplTest.java rename to datasources/src/test/java/org/opensearch/sql/datasources/auth/DataSourceUserAuthorizationHelperImplTest.java index fe57b06c6e0..552bd0edf97 100644 --- a/plugin/src/test/java/org/opensearch/sql/plugin/datasource/DataSourceUserAuthorizationHelperImplTest.java +++ b/datasources/src/test/java/org/opensearch/sql/datasources/auth/DataSourceUserAuthorizationHelperImplTest.java @@ -3,25 +3,25 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.sql.plugin.datasource; +package org.opensearch.sql.datasources.auth; -import static org.mockito.Mockito.when; import static org.opensearch.commons.ConfigConstants.OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT; import java.util.HashMap; import java.util.List; import org.junit.Assert; -import org.junit.Test; -import org.junit.runner.RunWith; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Answers; import org.mockito.InjectMocks; import org.mockito.Mock; -import org.mockito.junit.MockitoJUnitRunner; +import org.mockito.Mockito; +import org.mockito.junit.jupiter.MockitoExtension; import org.opensearch.client.Client; import org.opensearch.sql.datasource.model.DataSourceMetadata; import org.opensearch.sql.datasource.model.DataSourceType; -@RunWith(MockitoJUnitRunner.class) +@ExtendWith(MockitoExtension.class) public class DataSourceUserAuthorizationHelperImplTest { @Mock(answer = Answers.RETURNS_DEEP_STUBS) @@ -34,8 +34,8 @@ public class DataSourceUserAuthorizationHelperImplTest { @Test public void testAuthorizeDataSourceWithAllowedRoles() { String userString = 
"myuser|bckrole1,bckrol2|prometheus_access|myTenant"; - when(client.threadPool().getThreadContext() - .getTransient(OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT)) + Mockito.when(client.threadPool().getThreadContext() + .getTransient(OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT)) .thenReturn(userString); DataSourceMetadata dataSourceMetadata = dataSourceMetadata(); this.dataSourceUserAuthorizationHelper @@ -45,19 +45,41 @@ public void testAuthorizeDataSourceWithAllowedRoles() { @Test public void testAuthorizeDataSourceWithAdminRole() { String userString = "myuser|bckrole1,bckrol2|all_access|myTenant"; - when(client.threadPool().getThreadContext() - .getTransient(OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT)) + Mockito.when(client.threadPool().getThreadContext() + .getTransient(OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT)) .thenReturn(userString); DataSourceMetadata dataSourceMetadata = dataSourceMetadata(); this.dataSourceUserAuthorizationHelper .authorizeDataSource(dataSourceMetadata); } + @Test + public void testAuthorizeDataSourceWithNullUserString() { + Mockito.when(client.threadPool().getThreadContext() + .getTransient(OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT)) + .thenReturn(null); + DataSourceMetadata dataSourceMetadata = dataSourceMetadata(); + this.dataSourceUserAuthorizationHelper + .authorizeDataSource(dataSourceMetadata); + } + + @Test + public void testAuthorizeDataSourceWithDefaultDataSource() { + String userString = "myuser|bckrole1,bckrol2|role1|myTenant"; + Mockito.when(client.threadPool().getThreadContext() + .getTransient(OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT)) + .thenReturn(userString); + DataSourceMetadata dataSourceMetadata = + DataSourceMetadata.defaultOpenSearchDataSourceMetadata(); + this.dataSourceUserAuthorizationHelper + .authorizeDataSource(dataSourceMetadata); + } + @Test public void testAuthorizeDataSourceWithException() { String userString = "myuser|bckrole1,bckrol2|role1|myTenant"; - 
when(client.threadPool().getThreadContext() - .getTransient(OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT)) + Mockito.when(client.threadPool().getThreadContext() + .getTransient(OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT)) .thenReturn(userString); DataSourceMetadata dataSourceMetadata = dataSourceMetadata(); SecurityException securityException diff --git a/datasources/src/test/java/org/opensearch/sql/datasources/encryptor/EncryptorImplTest.java b/datasources/src/test/java/org/opensearch/sql/datasources/encryptor/EncryptorImplTest.java new file mode 100644 index 00000000000..22f5b092554 --- /dev/null +++ b/datasources/src/test/java/org/opensearch/sql/datasources/encryptor/EncryptorImplTest.java @@ -0,0 +1,87 @@ +/* + * + * * Copyright OpenSearch Contributors + * * SPDX-License-Identifier: Apache-2.0 + * + */ + +package org.opensearch.sql.datasources.encryptor; + +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertThrows; + +import com.amazonaws.encryptionsdk.exception.AwsCryptoException; +import com.amazonaws.encryptionsdk.exception.BadCiphertextException; +import org.junit.jupiter.api.Assertions; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.junit.jupiter.MockitoExtension; + + +@ExtendWith(MockitoExtension.class) +public class EncryptorImplTest { + + @Test + public void testEncryptAndDecrypt() { + String masterKey = "1234567890123456"; + String input = "This is a test input"; + Encryptor encryptor = new EncryptorImpl(masterKey); + + String encrypted = encryptor.encrypt(input); + String decrypted = encryptor.decrypt(encrypted); + + assertEquals(input, decrypted); + } + + @Test + public void testMasterKeySize() { + String input = "This is a test input"; + String masterKey8 = "12345678"; + Encryptor encryptor8 = new EncryptorImpl(masterKey8); + assertThrows(AwsCryptoException.class, () -> { + encryptor8.encrypt(input); + }); + + String 
masterKey16 = "1234567812345678"; + Encryptor encryptor16 = new EncryptorImpl(masterKey16); + String encrypted = encryptor16.encrypt(input); + Assertions.assertEquals(input, encryptor16.decrypt(encrypted)); + + String masterKey24 = "123456781234567812345678"; + Encryptor encryptor24 = new EncryptorImpl(masterKey24); + encrypted = encryptor24.encrypt(input); + Assertions.assertEquals(input, encryptor24.decrypt(encrypted)); + + String masterKey17 = "12345678123456781"; + Encryptor encryptor17 = new EncryptorImpl(masterKey17); + assertThrows(AwsCryptoException.class, () -> { + encryptor17.encrypt(input); + }); + } + + @Test + public void testInvalidBase64String() { + String encrypted = "invalidBase64String"; + Encryptor encryptor = new EncryptorImpl("randomMasterKey"); + + assertThrows(BadCiphertextException.class, () -> { + encryptor.decrypt(encrypted); + }); + } + + @Test + public void testDecryptWithDifferentKey() { + + String masterKeyOne = "1234567890123456"; + String masterKeyTwo = "1234567890123455"; + String input = "This is a test input"; + Encryptor encryptor1 = new EncryptorImpl(masterKeyOne); + Encryptor encryptor2 = new EncryptorImpl(masterKeyTwo); + + String encrypted = encryptor1.encrypt(input); + + assertThrows(Exception.class, () -> { + encryptor2.decrypt(encrypted); + }); + } +} \ No newline at end of file diff --git a/datasources/src/test/java/org/opensearch/sql/datasources/service/DataSourceLoaderCacheImplTest.java b/datasources/src/test/java/org/opensearch/sql/datasources/service/DataSourceLoaderCacheImplTest.java new file mode 100644 index 00000000000..bf656857b02 --- /dev/null +++ b/datasources/src/test/java/org/opensearch/sql/datasources/service/DataSourceLoaderCacheImplTest.java @@ -0,0 +1,85 @@ +package org.opensearch.sql.datasources.service; + +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.Mockito.lenient; +import static org.mockito.Mockito.times; +import static org.mockito.Mockito.verify; +import static 
org.mockito.Mockito.verifyNoMoreInteractions; +import static org.mockito.Mockito.when; + +import com.google.common.collect.ImmutableMap; +import java.util.Collections; +import java.util.List; +import org.junit.jupiter.api.Assertions; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.sql.datasource.model.DataSource; +import org.opensearch.sql.datasource.model.DataSourceMetadata; +import org.opensearch.sql.datasource.model.DataSourceType; +import org.opensearch.sql.storage.DataSourceFactory; +import org.opensearch.sql.storage.StorageEngine; + +@ExtendWith(MockitoExtension.class) +class DataSourceLoaderCacheImplTest { + + @Mock + private DataSourceFactory dataSourceFactory; + + @Mock + private StorageEngine storageEngine; + + @BeforeEach + public void setup() { + lenient() + .doAnswer( + invocation -> { + DataSourceMetadata metadata = invocation.getArgument(0); + return new DataSource(metadata.getName(), metadata.getConnector(), storageEngine); + }) + .when(dataSourceFactory) + .createDataSource(any()); + when(dataSourceFactory.getDataSourceType()).thenReturn(DataSourceType.OPENSEARCH); + } + + @Test + void testGetOrLoadDataSource() { + DataSourceLoaderCache dataSourceLoaderCache = + new DataSourceLoaderCacheImpl(Collections.singleton(dataSourceFactory)); + DataSourceMetadata dataSourceMetadata = new DataSourceMetadata(); + dataSourceMetadata.setName("testDS"); + dataSourceMetadata.setConnector(DataSourceType.OPENSEARCH); + dataSourceMetadata.setAllowedRoles(Collections.emptyList()); + dataSourceMetadata.setProperties(ImmutableMap.of()); + DataSource dataSource = dataSourceLoaderCache.getOrLoadDataSource(dataSourceMetadata); + verify(dataSourceFactory, times(1)).createDataSource(dataSourceMetadata); + Assertions.assertEquals(dataSource, + 
dataSourceLoaderCache.getOrLoadDataSource(dataSourceMetadata)); + verifyNoMoreInteractions(dataSourceFactory); + } + + @Test + void testGetOrLoadDataSourceWithMetadataUpdate() { + DataSourceLoaderCache dataSourceLoaderCache = + new DataSourceLoaderCacheImpl(Collections.singleton(dataSourceFactory)); + DataSourceMetadata dataSourceMetadata = getMetadata(); + dataSourceLoaderCache.getOrLoadDataSource(dataSourceMetadata); + dataSourceLoaderCache.getOrLoadDataSource(dataSourceMetadata); + dataSourceMetadata.setAllowedRoles(List.of("testDS_access")); + dataSourceLoaderCache.getOrLoadDataSource(dataSourceMetadata); + dataSourceLoaderCache.getOrLoadDataSource(dataSourceMetadata); + verify(dataSourceFactory, times(2)).createDataSource(dataSourceMetadata); + } + + private DataSourceMetadata getMetadata() { + DataSourceMetadata dataSourceMetadata = new DataSourceMetadata(); + dataSourceMetadata.setName("testDS"); + dataSourceMetadata.setConnector(DataSourceType.OPENSEARCH); + dataSourceMetadata.setAllowedRoles(Collections.emptyList()); + dataSourceMetadata.setProperties(ImmutableMap.of()); + return dataSourceMetadata; + } + +} diff --git a/core/src/test/java/org/opensearch/sql/datasource/DataSourceServiceImplTest.java b/datasources/src/test/java/org/opensearch/sql/datasources/service/DataSourceServiceImplTest.java similarity index 62% rename from core/src/test/java/org/opensearch/sql/datasource/DataSourceServiceImplTest.java rename to datasources/src/test/java/org/opensearch/sql/datasources/service/DataSourceServiceImplTest.java index 68a9475f76b..e1312ec582c 100644 --- a/core/src/test/java/org/opensearch/sql/datasource/DataSourceServiceImplTest.java +++ b/datasources/src/test/java/org/opensearch/sql/datasources/service/DataSourceServiceImplTest.java @@ -3,9 +3,10 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.sql.datasource; +package org.opensearch.sql.datasources.service; import static org.junit.jupiter.api.Assertions.assertEquals; +import static 
org.junit.jupiter.api.Assertions.assertFalse; import static org.junit.jupiter.api.Assertions.assertThrows; import static org.junit.jupiter.api.Assertions.assertTrue; import static org.mockito.ArgumentMatchers.any; @@ -15,13 +16,13 @@ import static org.mockito.Mockito.times; import static org.mockito.Mockito.verify; import static org.mockito.Mockito.verifyNoInteractions; -import static org.mockito.Mockito.verifyNoMoreInteractions; import static org.mockito.Mockito.when; import static org.opensearch.sql.analysis.DataSourceSchemaIdentifierNameResolver.DEFAULT_DATASOURCE_NAME; import com.google.common.collect.ImmutableMap; import java.util.ArrayList; import java.util.Collections; +import java.util.HashMap; import java.util.HashSet; import java.util.List; import java.util.Map; @@ -33,17 +34,18 @@ import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.sql.datasource.DataSourceService; import org.opensearch.sql.datasource.model.DataSource; import org.opensearch.sql.datasource.model.DataSourceMetadata; import org.opensearch.sql.datasource.model.DataSourceType; +import org.opensearch.sql.datasources.auth.DataSourceUserAuthorizationHelper; +import org.opensearch.sql.datasources.exceptions.DataSourceNotFoundException; import org.opensearch.sql.storage.DataSourceFactory; import org.opensearch.sql.storage.StorageEngine; @ExtendWith(MockitoExtension.class) class DataSourceServiceImplTest { - static final String NAME = "opensearch"; - @Mock private DataSourceFactory dataSourceFactory; @Mock @@ -91,9 +93,9 @@ void testGetDataSourceForDefaultOpenSearchDataSource() { void testGetDataSourceForNonExistingDataSource() { when(dataSourceMetadataStorage.getDataSourceMetadata("test")) .thenReturn(Optional.empty()); - IllegalArgumentException exception = + DataSourceNotFoundException exception = assertThrows( - IllegalArgumentException.class, + DataSourceNotFoundException.class, () -> 
dataSourceService.getDataSource("test")); assertEquals("DataSource with name test doesn't exist.", exception.getMessage()); @@ -130,7 +132,7 @@ void testGetDataSourceWithAuthorizationFailure() { SecurityException securityException = Assertions.assertThrows(SecurityException.class, - () -> dataSourceService.getDataSource("test")); + () -> dataSourceService.getDataSource("test")); Assertions.assertEquals("User is not authorized to access datasource test. " + "User should be mapped to any of the roles in [prometheus_access] for access.", securityException.getMessage()); @@ -158,7 +160,6 @@ void testCreateDataSourceSuccessCase() { assertEquals("testDS", dataSource.getName()); assertEquals(storageEngine, dataSource.getStorageEngine()); assertEquals(DataSourceType.OPENSEARCH, dataSource.getConnectorType()); - verifyNoMoreInteractions(dataSourceFactory); } @Test @@ -213,6 +214,32 @@ void testCreateDataSourceWithNullParameters() { @Test void testGetDataSourceMetadataSet() { + HashMap<String, String> properties = new HashMap<>(); + properties.put("prometheus.uri", "http://localhost:9200"); + properties.put("prometheus.auth.type", "basicauth"); + properties.put("prometheus.auth.username", "username"); + properties.put("prometheus.auth.password", "password"); + when(dataSourceMetadataStorage.getDataSourceMetadata()).thenReturn(new ArrayList<>() { + { + add(metadata("testDS", DataSourceType.PROMETHEUS, Collections.emptyList(), + properties)); + } + }); + Set<DataSourceMetadata> dataSourceMetadataSet + = dataSourceService.getDataSourceMetadata(false); + assertEquals(1, dataSourceMetadataSet.size()); + DataSourceMetadata dataSourceMetadata = dataSourceMetadataSet.iterator().next(); + assertTrue(dataSourceMetadata.getProperties().containsKey("prometheus.uri")); + assertFalse(dataSourceMetadata.getProperties().containsKey("prometheus.auth.type")); + assertFalse(dataSourceMetadata.getProperties().containsKey("prometheus.auth.username")); + 
assertFalse(dataSourceMetadata.getProperties().containsKey("prometheus.auth.password")); + assertFalse(dataSourceMetadataSet + .contains(DataSourceMetadata.defaultOpenSearchDataSourceMetadata())); + verify(dataSourceMetadataStorage, times(1)).getDataSourceMetadata(); + } + + @Test + void testGetDataSourceMetadataSetWithDefaultDatasource() { when(dataSourceMetadataStorage.getDataSourceMetadata()).thenReturn(new ArrayList<>() { { add(metadata("testDS", DataSourceType.PROMETHEUS, Collections.emptyList(), @@ -220,7 +247,7 @@ void testGetDataSourceMetadataSet() { } }); Set<DataSourceMetadata> dataSourceMetadataSet - = dataSourceService.getDataSourceMetadataSet(); + = dataSourceService.getDataSourceMetadata(true); assertEquals(2, dataSourceMetadataSet.size()); assertTrue(dataSourceMetadataSet .contains(DataSourceMetadata.defaultOpenSearchDataSourceMetadata())); @@ -228,17 +255,42 @@ } @Test - void testUpdateDatasource() { - assertThrows( - UnsupportedOperationException.class, - () -> dataSourceService.updateDataSource(new DataSourceMetadata())); + void testUpdateDataSourceSuccessCase() { + + DataSourceMetadata dataSourceMetadata = metadata("testDS", DataSourceType.OPENSEARCH, + Collections.emptyList(), ImmutableMap.of()); + dataSourceService.updateDataSource(dataSourceMetadata); + verify(dataSourceMetadataStorage, times(1)) + .updateDataSourceMetadata(dataSourceMetadata); + verify(dataSourceFactory, times(1)) + .createDataSource(dataSourceMetadata); + } + + @Test + void testUpdateDefaultDataSource() { + DataSourceMetadata dataSourceMetadata = metadata(DEFAULT_DATASOURCE_NAME, + DataSourceType.OPENSEARCH, Collections.emptyList(), ImmutableMap.of()); + UnsupportedOperationException unsupportedOperationException + = assertThrows(UnsupportedOperationException.class, + () -> dataSourceService.updateDataSource(dataSourceMetadata)); + assertEquals("Not allowed to update default datasource :" + DEFAULT_DATASOURCE_NAME, + 
unsupportedOperationException.getMessage()); } @Test void testDeleteDatasource() { - assertThrows( - UnsupportedOperationException.class, - () -> dataSourceService.deleteDataSource(NAME)); + dataSourceService.deleteDataSource("testDS"); + verify(dataSourceMetadataStorage, times(1)) + .deleteDataSourceMetadata("testDS"); + } + + @Test + void testDeleteDefaultDatasource() { + UnsupportedOperationException unsupportedOperationException + = assertThrows(UnsupportedOperationException.class, + () -> dataSourceService.deleteDataSource(DEFAULT_DATASOURCE_NAME)); + assertEquals("Not allowed to delete default datasource :" + DEFAULT_DATASOURCE_NAME, + unsupportedOperationException.getMessage()); } @Test @@ -268,4 +320,56 @@ DataSourceMetadata metadata(String name, DataSourceType type, dataSourceMetadata.setProperties(properties); return dataSourceMetadata; } + + @Test + void testRemovalOfAuthorizationInfo() { + HashMap<String, String> properties = new HashMap<>(); + properties.put("prometheus.uri", "https://localhost:9090"); + properties.put("prometheus.auth.type", "basicauth"); + properties.put("prometheus.auth.username", "username"); + properties.put("prometheus.auth.password", "password"); + DataSourceMetadata dataSourceMetadata = + new DataSourceMetadata("testDS", DataSourceType.PROMETHEUS, + Collections.singletonList("prometheus_access"), properties); + when(dataSourceMetadataStorage.getDataSourceMetadata("testDS")) + .thenReturn(Optional.of(dataSourceMetadata)); + + DataSourceMetadata dataSourceMetadata1 + = dataSourceService.getDataSourceMetadata("testDS"); + assertEquals("testDS", dataSourceMetadata1.getName()); + assertEquals(DataSourceType.PROMETHEUS, dataSourceMetadata1.getConnector()); + assertFalse(dataSourceMetadata1.getProperties().containsKey("prometheus.auth.type")); + assertFalse(dataSourceMetadata1.getProperties().containsKey("prometheus.auth.username")); + assertFalse(dataSourceMetadata1.getProperties().containsKey("prometheus.auth.password")); + } + + @Test + void
testGetDataSourceMetadataForNonExistingDataSource() { + when(dataSourceMetadataStorage.getDataSourceMetadata("testDS")) + .thenReturn(Optional.empty()); + IllegalArgumentException exception = assertThrows(IllegalArgumentException.class, + () -> dataSourceService.getDataSourceMetadata("testDS")); + assertEquals("DataSource with name: testDS doesn't exist.", exception.getMessage()); + } + + @Test + void testGetDataSourceMetadataForSpecificDataSourceName() { + HashMap<String, String> properties = new HashMap<>(); + properties.put("prometheus.uri", "http://localhost:9200"); + properties.put("prometheus.auth.type", "basicauth"); + properties.put("prometheus.auth.username", "username"); + properties.put("prometheus.auth.password", "password"); + when(dataSourceMetadataStorage.getDataSourceMetadata("testDS")) + .thenReturn(Optional.ofNullable( + metadata("testDS", DataSourceType.PROMETHEUS, Collections.emptyList(), + properties))); + DataSourceMetadata dataSourceMetadata + = this.dataSourceService.getDataSourceMetadata("testDS"); + assertTrue(dataSourceMetadata.getProperties().containsKey("prometheus.uri")); + assertFalse(dataSourceMetadata.getProperties().containsKey("prometheus.auth.type")); + assertFalse(dataSourceMetadata.getProperties().containsKey("prometheus.auth.username")); + assertFalse(dataSourceMetadata.getProperties().containsKey("prometheus.auth.password")); + verify(dataSourceMetadataStorage, times(1)).getDataSourceMetadata("testDS"); + } + } diff --git a/datasources/src/test/java/org/opensearch/sql/datasources/storage/OpenSearchDataSourceMetadataStorageTest.java b/datasources/src/test/java/org/opensearch/sql/datasources/storage/OpenSearchDataSourceMetadataStorageTest.java new file mode 100644 index 00000000000..5a9efaba673 --- /dev/null +++ b/datasources/src/test/java/org/opensearch/sql/datasources/storage/OpenSearchDataSourceMetadataStorageTest.java @@ -0,0 +1,670 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package
org.opensearch.sql.datasources.storage; + +import static org.opensearch.sql.datasources.storage.OpenSearchDataSourceMetadataStorage.DATASOURCE_INDEX_NAME; + +import com.fasterxml.jackson.core.JsonProcessingException; +import com.fasterxml.jackson.databind.ObjectMapper; +import java.util.Collections; +import java.util.HashMap; +import java.util.List; +import java.util.Map; +import java.util.Optional; +import lombok.SneakyThrows; +import org.apache.lucene.search.TotalHits; +import org.junit.jupiter.api.Assertions; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.Answers; +import org.mockito.ArgumentMatchers; +import org.mockito.InjectMocks; +import org.mockito.Mock; +import org.mockito.Mockito; +import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.action.ActionFuture; +import org.opensearch.action.DocWriteResponse; +import org.opensearch.action.admin.indices.create.CreateIndexResponse; +import org.opensearch.action.delete.DeleteResponse; +import org.opensearch.action.index.IndexResponse; +import org.opensearch.action.search.SearchResponse; +import org.opensearch.action.update.UpdateResponse; +import org.opensearch.client.Client; +import org.opensearch.cluster.service.ClusterService; +import org.opensearch.index.engine.DocumentMissingException; +import org.opensearch.index.engine.VersionConflictEngineException; +import org.opensearch.index.shard.ShardId; +import org.opensearch.rest.RestStatus; +import org.opensearch.search.SearchHit; +import org.opensearch.search.SearchHits; +import org.opensearch.sql.datasource.model.DataSourceMetadata; +import org.opensearch.sql.datasource.model.DataSourceType; +import org.opensearch.sql.datasources.encryptor.Encryptor; +import org.opensearch.sql.datasources.exceptions.DataSourceNotFoundException; + +@ExtendWith(MockitoExtension.class) +public class OpenSearchDataSourceMetadataStorageTest { + + private static final String TEST_DATASOURCE_INDEX_NAME = 
"testDS"; + + @Mock(answer = Answers.RETURNS_DEEP_STUBS) + private Client client; + @Mock(answer = Answers.RETURNS_DEEP_STUBS) + private ClusterService clusterService; + @Mock + private Encryptor encryptor; + @Mock(answer = Answers.RETURNS_DEEP_STUBS) + private SearchResponse searchResponse; + @Mock + private ActionFuture searchResponseActionFuture; + @Mock + private ActionFuture createIndexResponseActionFuture; + @Mock + private ActionFuture indexResponseActionFuture; + @Mock + private IndexResponse indexResponse; + @Mock + private ActionFuture updateResponseActionFuture; + @Mock + private UpdateResponse updateResponse; + @Mock + private ActionFuture deleteResponseActionFuture; + @Mock + private DeleteResponse deleteResponse; + @Mock + private SearchHit searchHit; + @InjectMocks + private OpenSearchDataSourceMetadataStorage openSearchDataSourceMetadataStorage; + + + @SneakyThrows + @Test + public void testGetDataSourceMetadata() { + Mockito.when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) + .thenReturn(true); + Mockito.when(client.search(ArgumentMatchers.any())).thenReturn(searchResponseActionFuture); + Mockito.when(searchResponseActionFuture.actionGet()).thenReturn(searchResponse); + Mockito.when(searchResponse.status()).thenReturn(RestStatus.OK); + Mockito.when(searchResponse.getHits()) + .thenReturn( + new SearchHits( + new SearchHit[] {searchHit}, + new TotalHits(21, TotalHits.Relation.EQUAL_TO), + 1.0F)); + Mockito.when(searchHit.getSourceAsString()) + .thenReturn(getBasicDataSourceMetadataString()); + Mockito.when(encryptor.decrypt("password")).thenReturn("password"); + Mockito.when(encryptor.decrypt("username")).thenReturn("username"); + + Optional dataSourceMetadataOptional + = openSearchDataSourceMetadataStorage.getDataSourceMetadata(TEST_DATASOURCE_INDEX_NAME); + + + Assertions.assertFalse(dataSourceMetadataOptional.isEmpty()); + DataSourceMetadata dataSourceMetadata = dataSourceMetadataOptional.get(); + 
Assertions.assertEquals(TEST_DATASOURCE_INDEX_NAME, dataSourceMetadata.getName()); + Assertions.assertEquals(DataSourceType.PROMETHEUS, dataSourceMetadata.getConnector()); + Assertions.assertEquals("password", + dataSourceMetadata.getProperties().get("prometheus.auth.password")); + Assertions.assertEquals("username", + dataSourceMetadata.getProperties().get("prometheus.auth.username")); + Assertions.assertEquals("basicauth", + dataSourceMetadata.getProperties().get("prometheus.auth.type")); + } + + @SneakyThrows + @Test + public void testGetDataSourceMetadataWith404SearchResponse() { + Mockito.when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) + .thenReturn(true); + Mockito.when(client.search(ArgumentMatchers.any())).thenReturn(searchResponseActionFuture); + Mockito.when(searchResponseActionFuture.actionGet()).thenReturn(searchResponse); + Mockito.when(searchResponse.status()).thenReturn(RestStatus.NOT_FOUND); + + RuntimeException runtimeException = Assertions.assertThrows(RuntimeException.class, + () -> openSearchDataSourceMetadataStorage.getDataSourceMetadata( + TEST_DATASOURCE_INDEX_NAME)); + Assertions.assertEquals( + "Fetching dataSource metadata information failed with status : NOT_FOUND", + runtimeException.getMessage()); + } + + @SneakyThrows + @Test + public void testGetDataSourceMetadataWithParsingFailed() { + Mockito.when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) + .thenReturn(true); + Mockito.when(client.search(ArgumentMatchers.any())).thenReturn(searchResponseActionFuture); + Mockito.when(searchResponseActionFuture.actionGet()).thenReturn(searchResponse); + Mockito.when(searchResponse.status()).thenReturn(RestStatus.OK); + Mockito.when(searchResponse.getHits()) + .thenReturn( + new SearchHits( + new SearchHit[] {searchHit}, + new TotalHits(21, TotalHits.Relation.EQUAL_TO), + 1.0F)); + Mockito.when(searchHit.getSourceAsString()) + .thenReturn("..testDs"); + + 
Assertions.assertThrows(RuntimeException.class, + () -> openSearchDataSourceMetadataStorage.getDataSourceMetadata( + TEST_DATASOURCE_INDEX_NAME)); + } + + @SneakyThrows + @Test + public void testGetDataSourceMetadataWithAWSSigV4() { + Mockito.when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) + .thenReturn(true); + Mockito.when(client.search(ArgumentMatchers.any())).thenReturn(searchResponseActionFuture); + Mockito.when(searchResponseActionFuture.actionGet()).thenReturn(searchResponse); + Mockito.when(searchResponse.status()).thenReturn(RestStatus.OK); + Mockito.when(searchResponse.getHits()) + .thenReturn( + new SearchHits( + new SearchHit[] {searchHit}, + new TotalHits(21, TotalHits.Relation.EQUAL_TO), + 1.0F)); + Mockito.when(searchHit.getSourceAsString()) + .thenReturn(getAWSSigv4DataSourceMetadataString()); + Mockito.when(encryptor.decrypt("secret_key")).thenReturn("secret_key"); + Mockito.when(encryptor.decrypt("access_key")).thenReturn("access_key"); + + Optional<DataSourceMetadata> dataSourceMetadataOptional + = openSearchDataSourceMetadataStorage.getDataSourceMetadata(TEST_DATASOURCE_INDEX_NAME); + + + Assertions.assertFalse(dataSourceMetadataOptional.isEmpty()); + DataSourceMetadata dataSourceMetadata = dataSourceMetadataOptional.get(); + Assertions.assertEquals(TEST_DATASOURCE_INDEX_NAME, dataSourceMetadata.getName()); + Assertions.assertEquals(DataSourceType.PROMETHEUS, dataSourceMetadata.getConnector()); + Assertions.assertEquals("secret_key", + dataSourceMetadata.getProperties().get("prometheus.auth.secret_key")); + Assertions.assertEquals("access_key", + dataSourceMetadata.getProperties().get("prometheus.auth.access_key")); + Assertions.assertEquals("awssigv4", + dataSourceMetadata.getProperties().get("prometheus.auth.type")); + } + + @SneakyThrows + @Test + public void testGetDataSourceMetadataWithBasicAuth() { + Mockito.when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) + .thenReturn(true); + 
Mockito.when(client.search(ArgumentMatchers.any())).thenReturn(searchResponseActionFuture); + Mockito.when(searchResponseActionFuture.actionGet()).thenReturn(searchResponse); + Mockito.when(searchResponse.status()).thenReturn(RestStatus.OK); + Mockito.when(searchResponse.getHits()) + .thenReturn( + new SearchHits( + new SearchHit[] {searchHit}, + new TotalHits(21, TotalHits.Relation.EQUAL_TO), + 1.0F)); + Mockito.when(searchHit.getSourceAsString()) + .thenReturn(getDataSourceMetadataStringWithBasicAuthentication()); + Mockito.when(encryptor.decrypt("username")).thenReturn("username"); + Mockito.when(encryptor.decrypt("password")).thenReturn("password"); + + Optional<DataSourceMetadata> dataSourceMetadataOptional + = openSearchDataSourceMetadataStorage.getDataSourceMetadata(TEST_DATASOURCE_INDEX_NAME); + + + Assertions.assertFalse(dataSourceMetadataOptional.isEmpty()); + DataSourceMetadata dataSourceMetadata = dataSourceMetadataOptional.get(); + Assertions.assertEquals(TEST_DATASOURCE_INDEX_NAME, dataSourceMetadata.getName()); + Assertions.assertEquals(DataSourceType.PROMETHEUS, dataSourceMetadata.getConnector()); + Assertions.assertEquals("username", + dataSourceMetadata.getProperties().get("prometheus.auth.username")); + Assertions.assertEquals("password", + dataSourceMetadata.getProperties().get("prometheus.auth.password")); + Assertions.assertEquals("basicauth", + dataSourceMetadata.getProperties().get("prometheus.auth.type")); + } + + + @SneakyThrows + @Test + public void testGetDataSourceMetadataList() { + Mockito.when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) + .thenReturn(true); + Mockito.when(client.search(ArgumentMatchers.any())).thenReturn(searchResponseActionFuture); + Mockito.when(searchResponseActionFuture.actionGet()).thenReturn(searchResponse); + Mockito.when(searchResponse.status()).thenReturn(RestStatus.OK); + Mockito.when(searchResponse.getHits()) + .thenReturn( + new SearchHits( + new SearchHit[] {searchHit}, + new TotalHits(21, 
TotalHits.Relation.EQUAL_TO), + 1.0F)); + Mockito.when(searchHit.getSourceAsString()) + .thenReturn(getDataSourceMetadataStringWithNoAuthentication()); + + List<DataSourceMetadata> dataSourceMetadataList + = openSearchDataSourceMetadataStorage.getDataSourceMetadata(); + + + Assertions.assertEquals(1, dataSourceMetadataList.size()); + DataSourceMetadata dataSourceMetadata = dataSourceMetadataList.get(0); + Assertions.assertEquals(TEST_DATASOURCE_INDEX_NAME, dataSourceMetadata.getName()); + Assertions.assertEquals(DataSourceType.PROMETHEUS, dataSourceMetadata.getConnector()); + } + + + @SneakyThrows + @Test + public void testGetDataSourceMetadataListWithNoIndex() { + Mockito.when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) + .thenReturn(Boolean.FALSE); + Mockito.when(client.admin().indices().create(ArgumentMatchers.any())) + .thenReturn(createIndexResponseActionFuture); + Mockito.when(createIndexResponseActionFuture.actionGet()) + .thenReturn(new CreateIndexResponse(true, true, DATASOURCE_INDEX_NAME)); + Mockito.when(client.index(ArgumentMatchers.any())).thenReturn(indexResponseActionFuture); + + List<DataSourceMetadata> dataSourceMetadataList + = openSearchDataSourceMetadataStorage.getDataSourceMetadata(); + + Assertions.assertEquals(0, dataSourceMetadataList.size()); + } + + @SneakyThrows + @Test + public void testGetDataSourceMetadataWithNoIndex() { + Mockito.when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) + .thenReturn(Boolean.FALSE); + Mockito.when(client.admin().indices().create(ArgumentMatchers.any())) + .thenReturn(createIndexResponseActionFuture); + Mockito.when(createIndexResponseActionFuture.actionGet()) + .thenReturn(new CreateIndexResponse(true, true, DATASOURCE_INDEX_NAME)); + Mockito.when(client.index(ArgumentMatchers.any())).thenReturn(indexResponseActionFuture); + + Optional<DataSourceMetadata> dataSourceMetadataOptional + = openSearchDataSourceMetadataStorage.getDataSourceMetadata(TEST_DATASOURCE_INDEX_NAME); + + 
Assertions.assertFalse(dataSourceMetadataOptional.isPresent()); + } + + @Test + public void testCreateDataSourceMetadata() { + + Mockito.when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) + .thenReturn(Boolean.FALSE); + Mockito.when(encryptor.encrypt("secret_key")).thenReturn("secret_key"); + Mockito.when(encryptor.encrypt("access_key")).thenReturn("access_key"); + Mockito.when(client.admin().indices().create(ArgumentMatchers.any())) + .thenReturn(createIndexResponseActionFuture); + Mockito.when(createIndexResponseActionFuture.actionGet()) + .thenReturn(new CreateIndexResponse(true, true, DATASOURCE_INDEX_NAME)); + Mockito.when(client.index(ArgumentMatchers.any())).thenReturn(indexResponseActionFuture); + Mockito.when(indexResponseActionFuture.actionGet()).thenReturn(indexResponse); + Mockito.when(indexResponse.getResult()).thenReturn(DocWriteResponse.Result.CREATED); + DataSourceMetadata dataSourceMetadata = getDataSourceMetadata(); + + this.openSearchDataSourceMetadataStorage.createDataSourceMetadata(dataSourceMetadata); + + Mockito.verify(encryptor, Mockito.times(1)).encrypt("secret_key"); + Mockito.verify(encryptor, Mockito.times(1)).encrypt("access_key"); + Mockito.verify(client.admin().indices(), Mockito.times(1)).create(ArgumentMatchers.any()); + Mockito.verify(client, Mockito.times(1)).index(ArgumentMatchers.any()); + Mockito.verify(client.threadPool().getThreadContext(), Mockito.times(2)).stashContext(); + + + } + + @Test + public void testCreateDataSourceMetadataWithOutCreatingIndex() { + Mockito.when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) + .thenReturn(Boolean.TRUE); + Mockito.when(encryptor.encrypt("secret_key")).thenReturn("secret_key"); + Mockito.when(encryptor.encrypt("access_key")).thenReturn("access_key"); + Mockito.when(client.index(ArgumentMatchers.any())).thenReturn(indexResponseActionFuture); + Mockito.when(indexResponseActionFuture.actionGet()).thenReturn(indexResponse); + 
Mockito.when(indexResponse.getResult()).thenReturn(DocWriteResponse.Result.CREATED); + DataSourceMetadata dataSourceMetadata = getDataSourceMetadata(); + + this.openSearchDataSourceMetadataStorage.createDataSourceMetadata(dataSourceMetadata); + + Mockito.verify(encryptor, Mockito.times(1)).encrypt("secret_key"); + Mockito.verify(encryptor, Mockito.times(1)).encrypt("access_key"); + Mockito.verify(client.admin().indices(), Mockito.times(0)).create(ArgumentMatchers.any()); + Mockito.verify(client, Mockito.times(1)).index(ArgumentMatchers.any()); + Mockito.verify(client.threadPool().getThreadContext(), Mockito.times(1)).stashContext(); + } + + + @Test + public void testCreateDataSourceMetadataFailedWithNotFoundResponse() { + + Mockito.when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) + .thenReturn(Boolean.FALSE); + Mockito.when(encryptor.encrypt("secret_key")).thenReturn("secret_key"); + Mockito.when(encryptor.encrypt("access_key")).thenReturn("access_key"); + Mockito.when(client.admin().indices().create(ArgumentMatchers.any())) + .thenReturn(createIndexResponseActionFuture); + Mockito.when(createIndexResponseActionFuture.actionGet()) + .thenReturn(new CreateIndexResponse(true, true, DATASOURCE_INDEX_NAME)); + Mockito.when(client.index(ArgumentMatchers.any())).thenReturn(indexResponseActionFuture); + Mockito.when(indexResponseActionFuture.actionGet()).thenReturn(indexResponse); + Mockito.when(indexResponse.getResult()).thenReturn(DocWriteResponse.Result.NOT_FOUND); + DataSourceMetadata dataSourceMetadata = getDataSourceMetadata(); + + RuntimeException runtimeException = Assertions.assertThrows(RuntimeException.class, + () -> this.openSearchDataSourceMetadataStorage.createDataSourceMetadata( + dataSourceMetadata)); + Assertions.assertEquals("Saving dataSource metadata information failed with result : not_found", + runtimeException.getMessage()); + + Mockito.verify(encryptor, Mockito.times(1)).encrypt("secret_key"); + Mockito.verify(encryptor, 
Mockito.times(1)).encrypt("access_key"); + Mockito.verify(client.admin().indices(), Mockito.times(1)).create(ArgumentMatchers.any()); + Mockito.verify(client, Mockito.times(1)).index(ArgumentMatchers.any()); + Mockito.verify(client.threadPool().getThreadContext(), Mockito.times(2)).stashContext(); + + + } + + @Test + public void testCreateDataSourceMetadataWithVersionConflict() { + + Mockito.when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) + .thenReturn(Boolean.FALSE); + Mockito.when(encryptor.encrypt("secret_key")).thenReturn("secret_key"); + Mockito.when(encryptor.encrypt("access_key")).thenReturn("access_key"); + Mockito.when(client.admin().indices().create(ArgumentMatchers.any())) + .thenReturn(createIndexResponseActionFuture); + Mockito.when(createIndexResponseActionFuture.actionGet()) + .thenReturn(new CreateIndexResponse(true, true, DATASOURCE_INDEX_NAME)); + Mockito.when(client.index(ArgumentMatchers.any())) + .thenThrow(VersionConflictEngineException.class); + DataSourceMetadata dataSourceMetadata = getDataSourceMetadata(); + IllegalArgumentException illegalArgumentException = + Assertions.assertThrows(IllegalArgumentException.class, + () -> this.openSearchDataSourceMetadataStorage.createDataSourceMetadata( + dataSourceMetadata)); + Assertions.assertEquals("A datasource already exists with name: testDS", + illegalArgumentException.getMessage()); + + + Mockito.verify(encryptor, Mockito.times(1)).encrypt("secret_key"); + Mockito.verify(encryptor, Mockito.times(1)).encrypt("access_key"); + Mockito.verify(client.admin().indices(), Mockito.times(1)).create(ArgumentMatchers.any()); + Mockito.verify(client, Mockito.times(1)).index(ArgumentMatchers.any()); + Mockito.verify(client.threadPool().getThreadContext(), Mockito.times(2)).stashContext(); + + + } + + @Test + public void testCreateDataSourceMetadataWithException() { + + Mockito.when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) + .thenReturn(Boolean.FALSE); 
+ Mockito.when(encryptor.encrypt("secret_key")).thenReturn("secret_key"); + Mockito.when(encryptor.encrypt("access_key")).thenReturn("access_key"); + Mockito.when(client.admin().indices().create(ArgumentMatchers.any())) + .thenReturn(createIndexResponseActionFuture); + Mockito.when(createIndexResponseActionFuture.actionGet()) + .thenReturn(new CreateIndexResponse(true, true, DATASOURCE_INDEX_NAME)); + Mockito.when(client.index(ArgumentMatchers.any())) + .thenThrow(new RuntimeException("error while indexing")); + DataSourceMetadata dataSourceMetadata = getDataSourceMetadata(); + + RuntimeException runtimeException = Assertions.assertThrows(RuntimeException.class, + () -> this.openSearchDataSourceMetadataStorage.createDataSourceMetadata( + dataSourceMetadata)); + Assertions.assertEquals("java.lang.RuntimeException: error while indexing", + runtimeException.getMessage()); + + Mockito.verify(encryptor, Mockito.times(1)).encrypt("secret_key"); + Mockito.verify(encryptor, Mockito.times(1)).encrypt("access_key"); + Mockito.verify(client.admin().indices(), Mockito.times(1)).create(ArgumentMatchers.any()); + Mockito.verify(client, Mockito.times(1)).index(ArgumentMatchers.any()); + Mockito.verify(client.threadPool().getThreadContext(), Mockito.times(2)).stashContext(); + + + } + + @Test + public void testCreateDataSourceMetadataWithIndexCreationFailed() { + + Mockito.when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) + .thenReturn(Boolean.FALSE); + Mockito.when(encryptor.encrypt("secret_key")).thenReturn("secret_key"); + Mockito.when(encryptor.encrypt("access_key")).thenReturn("access_key"); + Mockito.when(client.admin().indices().create(ArgumentMatchers.any())) + .thenReturn(createIndexResponseActionFuture); + Mockito.when(createIndexResponseActionFuture.actionGet()) + .thenReturn(new CreateIndexResponse(false, false, DATASOURCE_INDEX_NAME)); + DataSourceMetadata dataSourceMetadata = getDataSourceMetadata(); + + RuntimeException runtimeException = 
Assertions.assertThrows(RuntimeException.class, + () -> this.openSearchDataSourceMetadataStorage.createDataSourceMetadata( + dataSourceMetadata)); + Assertions.assertEquals( + "Internal server error while creating.ql-datasources index:: " + + "Index creation is not acknowledged.", + runtimeException.getMessage()); + + Mockito.verify(encryptor, Mockito.times(1)).encrypt("secret_key"); + Mockito.verify(encryptor, Mockito.times(1)).encrypt("access_key"); + Mockito.verify(client.admin().indices(), Mockito.times(1)).create(ArgumentMatchers.any()); + Mockito.verify(client.threadPool().getThreadContext(), Mockito.times(1)).stashContext(); + } + + @Test + public void testUpdateDataSourceMetadata() { + Mockito.when(encryptor.encrypt("secret_key")).thenReturn("secret_key"); + Mockito.when(encryptor.encrypt("access_key")).thenReturn("access_key"); + Mockito.when(client.update(ArgumentMatchers.any())).thenReturn(updateResponseActionFuture); + Mockito.when(updateResponseActionFuture.actionGet()).thenReturn(updateResponse); + Mockito.when(updateResponse.getResult()).thenReturn(DocWriteResponse.Result.UPDATED); + DataSourceMetadata dataSourceMetadata = getDataSourceMetadata(); + + this.openSearchDataSourceMetadataStorage.updateDataSourceMetadata(dataSourceMetadata); + + Mockito.verify(encryptor, Mockito.times(1)).encrypt("secret_key"); + Mockito.verify(encryptor, Mockito.times(1)).encrypt("access_key"); + Mockito.verify(client.admin().indices(), Mockito.times(0)).create(ArgumentMatchers.any()); + Mockito.verify(client, Mockito.times(1)).update(ArgumentMatchers.any()); + Mockito.verify(client.threadPool().getThreadContext(), Mockito.times(1)).stashContext(); + + } + + @Test + public void testUpdateDataSourceMetadataWithNotFoundResult() { + Mockito.when(encryptor.encrypt("secret_key")).thenReturn("secret_key"); + Mockito.when(encryptor.encrypt("access_key")).thenReturn("access_key"); + Mockito.when(client.update(ArgumentMatchers.any())).thenReturn(updateResponseActionFuture); + 
Mockito.when(updateResponseActionFuture.actionGet()).thenReturn(updateResponse); + Mockito.when(updateResponse.getResult()).thenReturn(DocWriteResponse.Result.NOT_FOUND); + DataSourceMetadata dataSourceMetadata = getDataSourceMetadata(); + + RuntimeException runtimeException = Assertions.assertThrows(RuntimeException.class, + () -> this.openSearchDataSourceMetadataStorage.updateDataSourceMetadata( + dataSourceMetadata)); + Assertions.assertEquals("Saving dataSource metadata information failed with result : not_found", + runtimeException.getMessage()); + + Mockito.verify(encryptor, Mockito.times(1)).encrypt("secret_key"); + Mockito.verify(encryptor, Mockito.times(1)).encrypt("access_key"); + Mockito.verify(client.admin().indices(), Mockito.times(0)).create(ArgumentMatchers.any()); + Mockito.verify(client, Mockito.times(1)).update(ArgumentMatchers.any()); + Mockito.verify(client.threadPool().getThreadContext(), Mockito.times(1)).stashContext(); + + } + + @Test + public void testUpdateDataSourceMetadataWithDocumentMissingException() { + Mockito.when(encryptor.encrypt("secret_key")).thenReturn("secret_key"); + Mockito.when(encryptor.encrypt("access_key")).thenReturn("access_key"); + Mockito.when(client.update(ArgumentMatchers.any())).thenThrow(new DocumentMissingException( + ShardId.fromString("[2][2]"), "testDS")); + DataSourceMetadata dataSourceMetadata = getDataSourceMetadata(); + dataSourceMetadata.setName("testDS"); + + + DataSourceNotFoundException dataSourceNotFoundException = + Assertions.assertThrows(DataSourceNotFoundException.class, + () -> this.openSearchDataSourceMetadataStorage.updateDataSourceMetadata( + dataSourceMetadata)); + Assertions.assertEquals("Datasource with name: testDS doesn't exist", + dataSourceNotFoundException.getMessage()); + + Mockito.verify(encryptor, Mockito.times(1)).encrypt("secret_key"); + Mockito.verify(encryptor, Mockito.times(1)).encrypt("access_key"); + Mockito.verify(client.admin().indices(), 
Mockito.times(0)).create(ArgumentMatchers.any()); + Mockito.verify(client, Mockito.times(1)).update(ArgumentMatchers.any()); + Mockito.verify(client.threadPool().getThreadContext(), Mockito.times(1)).stashContext(); + + } + + @Test + public void testUpdateDataSourceMetadataWithRuntimeException() { + Mockito.when(encryptor.encrypt("secret_key")).thenReturn("secret_key"); + Mockito.when(encryptor.encrypt("access_key")).thenReturn("access_key"); + Mockito.when(client.update(ArgumentMatchers.any())) + .thenThrow(new RuntimeException("error message")); + DataSourceMetadata dataSourceMetadata = getDataSourceMetadata(); + dataSourceMetadata.setName("testDS"); + + + RuntimeException runtimeException = Assertions.assertThrows(RuntimeException.class, + () -> this.openSearchDataSourceMetadataStorage.updateDataSourceMetadata( + dataSourceMetadata)); + Assertions.assertEquals("java.lang.RuntimeException: error message", + runtimeException.getMessage()); + + Mockito.verify(encryptor, Mockito.times(1)).encrypt("secret_key"); + Mockito.verify(encryptor, Mockito.times(1)).encrypt("access_key"); + Mockito.verify(client.admin().indices(), Mockito.times(0)).create(ArgumentMatchers.any()); + Mockito.verify(client, Mockito.times(1)).update(ArgumentMatchers.any()); + Mockito.verify(client.threadPool().getThreadContext(), Mockito.times(1)).stashContext(); + + } + + @Test + public void testDeleteDataSourceMetadata() { + Mockito.when(client.delete(ArgumentMatchers.any())).thenReturn(deleteResponseActionFuture); + Mockito.when(deleteResponseActionFuture.actionGet()).thenReturn(deleteResponse); + Mockito.when(deleteResponse.getResult()).thenReturn(DocWriteResponse.Result.DELETED); + + this.openSearchDataSourceMetadataStorage.deleteDataSourceMetadata("testDS"); + + Mockito.verifyNoInteractions(encryptor); + Mockito.verify(client.admin().indices(), Mockito.times(0)).create(ArgumentMatchers.any()); + Mockito.verify(client, Mockito.times(1)).delete(ArgumentMatchers.any()); + 
Mockito.verify(client.threadPool().getThreadContext(), Mockito.times(1)).stashContext(); + } + + @Test + public void testDeleteDataSourceMetadataWhichisAlreadyDeleted() { + Mockito.when(client.delete(ArgumentMatchers.any())).thenReturn(deleteResponseActionFuture); + Mockito.when(deleteResponseActionFuture.actionGet()).thenReturn(deleteResponse); + Mockito.when(deleteResponse.getResult()).thenReturn(DocWriteResponse.Result.NOT_FOUND); + + DataSourceNotFoundException dataSourceNotFoundException = + Assertions.assertThrows(DataSourceNotFoundException.class, + () -> this.openSearchDataSourceMetadataStorage.deleteDataSourceMetadata("testDS")); + Assertions.assertEquals("Datasource with name: testDS doesn't exist", + dataSourceNotFoundException.getMessage()); + + + Mockito.verifyNoInteractions(encryptor); + Mockito.verify(client.admin().indices(), Mockito.times(0)).create(ArgumentMatchers.any()); + Mockito.verify(client, Mockito.times(1)).delete(ArgumentMatchers.any()); + Mockito.verify(client.threadPool().getThreadContext(), Mockito.times(1)).stashContext(); + } + + @Test + public void testDeleteDataSourceMetadataWithUnexpectedResult() { + Mockito.when(client.delete(ArgumentMatchers.any())).thenReturn(deleteResponseActionFuture); + Mockito.when(deleteResponseActionFuture.actionGet()).thenReturn(deleteResponse); + Mockito.when(deleteResponse.getResult()).thenReturn(DocWriteResponse.Result.NOOP); + + RuntimeException runtimeException = Assertions.assertThrows(RuntimeException.class, + () -> this.openSearchDataSourceMetadataStorage.deleteDataSourceMetadata("testDS")); + Assertions.assertEquals("Deleting dataSource metadata information failed with result : noop", + runtimeException.getMessage()); + + Mockito.verifyNoInteractions(encryptor); + Mockito.verify(client.admin().indices(), Mockito.times(0)).create(ArgumentMatchers.any()); + Mockito.verify(client, Mockito.times(1)).delete(ArgumentMatchers.any()); + Mockito.verify(client.threadPool().getThreadContext(), 
Mockito.times(1)).stashContext(); + } + + private String getBasicDataSourceMetadataString() throws JsonProcessingException { + DataSourceMetadata dataSourceMetadata = new DataSourceMetadata(); + dataSourceMetadata.setName("testDS"); + dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS); + dataSourceMetadata.setAllowedRoles(Collections.singletonList("prometheus_access")); + Map<String, String> properties = new HashMap<>(); + properties.put("prometheus.auth.type", "basicauth"); + properties.put("prometheus.auth.username", "username"); + properties.put("prometheus.auth.uri", "https://localhost:9090"); + properties.put("prometheus.auth.password", "password"); + dataSourceMetadata.setProperties(properties); + ObjectMapper objectMapper = new ObjectMapper(); + return objectMapper.writeValueAsString(dataSourceMetadata); + } + + private String getAWSSigv4DataSourceMetadataString() throws JsonProcessingException { + DataSourceMetadata dataSourceMetadata = new DataSourceMetadata(); + dataSourceMetadata.setName("testDS"); + dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS); + dataSourceMetadata.setAllowedRoles(Collections.singletonList("prometheus_access")); + Map<String, String> properties = new HashMap<>(); + properties.put("prometheus.auth.type", "awssigv4"); + properties.put("prometheus.auth.secret_key", "secret_key"); + properties.put("prometheus.auth.uri", "https://localhost:9090"); + properties.put("prometheus.auth.access_key", "access_key"); + dataSourceMetadata.setProperties(properties); + ObjectMapper objectMapper = new ObjectMapper(); + return objectMapper.writeValueAsString(dataSourceMetadata); + } + + private String getDataSourceMetadataStringWithBasicAuthentication() + throws JsonProcessingException { + DataSourceMetadata dataSourceMetadata = new DataSourceMetadata(); + dataSourceMetadata.setName("testDS"); + dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS); + dataSourceMetadata.setAllowedRoles(Collections.singletonList("prometheus_access")); + Map<String, String> properties = new 
HashMap<>(); + properties.put("prometheus.auth.uri", "https://localhost:9090"); + properties.put("prometheus.auth.type", "basicauth"); + properties.put("prometheus.auth.username", "username"); + properties.put("prometheus.auth.password", "password"); + dataSourceMetadata.setProperties(properties); + ObjectMapper objectMapper = new ObjectMapper(); + return objectMapper.writeValueAsString(dataSourceMetadata); + } + + private String getDataSourceMetadataStringWithNoAuthentication() throws JsonProcessingException { + DataSourceMetadata dataSourceMetadata = new DataSourceMetadata(); + dataSourceMetadata.setName("testDS"); + dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS); + dataSourceMetadata.setAllowedRoles(Collections.singletonList("prometheus_access")); + Map<String, String> properties = new HashMap<>(); + properties.put("prometheus.auth.uri", "https://localhost:9090"); + dataSourceMetadata.setProperties(properties); + ObjectMapper objectMapper = new ObjectMapper(); + return objectMapper.writeValueAsString(dataSourceMetadata); + } + + private DataSourceMetadata getDataSourceMetadata() { + DataSourceMetadata dataSourceMetadata = new DataSourceMetadata(); + dataSourceMetadata.setName("testDS"); + dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS); + dataSourceMetadata.setAllowedRoles(Collections.singletonList("prometheus_access")); + Map<String, String> properties = new HashMap<>(); + properties.put("prometheus.auth.type", "awssigv4"); + properties.put("prometheus.auth.secret_key", "secret_key"); + properties.put("prometheus.auth.uri", "https://localhost:9090"); + properties.put("prometheus.auth.access_key", "access_key"); + dataSourceMetadata.setProperties(properties); + return dataSourceMetadata; + } + +} diff --git a/datasources/src/test/java/org/opensearch/sql/datasources/transport/TransportCreateDataSourceActionTest.java b/datasources/src/test/java/org/opensearch/sql/datasources/transport/TransportCreateDataSourceActionTest.java new file mode 100644 index 
00000000000..3dd5c212143 --- /dev/null +++ b/datasources/src/test/java/org/opensearch/sql/datasources/transport/TransportCreateDataSourceActionTest.java @@ -0,0 +1,86 @@ +package org.opensearch.sql.datasources.transport; + +import static org.mockito.Mockito.doThrow; +import static org.mockito.Mockito.times; +import static org.mockito.Mockito.verify; + +import java.util.HashSet; +import org.junit.jupiter.api.Assertions; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.ArgumentCaptor; +import org.mockito.Captor; +import org.mockito.Mock; +import org.mockito.Mockito; +import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.action.ActionListener; +import org.opensearch.action.support.ActionFilters; +import org.opensearch.sql.datasource.model.DataSourceMetadata; +import org.opensearch.sql.datasource.model.DataSourceType; +import org.opensearch.sql.datasources.model.transport.CreateDataSourceActionRequest; +import org.opensearch.sql.datasources.model.transport.CreateDataSourceActionResponse; +import org.opensearch.sql.datasources.service.DataSourceServiceImpl; +import org.opensearch.tasks.Task; +import org.opensearch.transport.TransportService; + +@ExtendWith(MockitoExtension.class) +public class TransportCreateDataSourceActionTest { + + @Mock + private TransportService transportService; + @Mock + private TransportCreateDataSourceAction action; + @Mock + private DataSourceServiceImpl dataSourceService; + @Mock + private Task task; + @Mock + private ActionListener<CreateDataSourceActionResponse> actionListener; + @Captor + private ArgumentCaptor<CreateDataSourceActionResponse> + createDataSourceActionResponseArgumentCaptor; + + @Captor + private ArgumentCaptor<Exception> exceptionArgumentCaptor; + + @BeforeEach + public void setUp() { + action = new TransportCreateDataSourceAction(transportService, + new ActionFilters(new HashSet<>()), dataSourceService); + } + + @Test + public void testDoExecute() { + DataSourceMetadata 
dataSourceMetadata = new DataSourceMetadata(); + dataSourceMetadata.setName("test_datasource"); + dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS); + CreateDataSourceActionRequest request = new CreateDataSourceActionRequest(dataSourceMetadata); + + action.doExecute(task, request, actionListener); + verify(dataSourceService, times(1)).createDataSource(dataSourceMetadata); + Mockito.verify(actionListener) + .onResponse(createDataSourceActionResponseArgumentCaptor.capture()); + CreateDataSourceActionResponse createDataSourceActionResponse + = createDataSourceActionResponseArgumentCaptor.getValue(); + Assertions.assertEquals("Created DataSource with name test_datasource", + createDataSourceActionResponse.getResult()); + } + + @Test + public void testDoExecuteWithException() { + DataSourceMetadata dataSourceMetadata = new DataSourceMetadata(); + dataSourceMetadata.setName("test_datasource"); + dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS); + doThrow(new RuntimeException("Error")).when(dataSourceService) + .createDataSource(dataSourceMetadata); + CreateDataSourceActionRequest request = new CreateDataSourceActionRequest(dataSourceMetadata); + action.doExecute(task, request, actionListener); + verify(dataSourceService, times(1)).createDataSource(dataSourceMetadata); + Mockito.verify(actionListener).onFailure(exceptionArgumentCaptor.capture()); + Exception exception = exceptionArgumentCaptor.getValue(); + Assertions.assertTrue(exception instanceof RuntimeException); + Assertions.assertEquals("Error", + exception.getMessage()); + } +} \ No newline at end of file diff --git a/datasources/src/test/java/org/opensearch/sql/datasources/transport/TransportDeleteDataSourceActionTest.java b/datasources/src/test/java/org/opensearch/sql/datasources/transport/TransportDeleteDataSourceActionTest.java new file mode 100644 index 00000000000..9beeb1a9a92 --- /dev/null +++ 
b/datasources/src/test/java/org/opensearch/sql/datasources/transport/TransportDeleteDataSourceActionTest.java @@ -0,0 +1,78 @@ +package org.opensearch.sql.datasources.transport; + +import static org.mockito.Mockito.doThrow; +import static org.mockito.Mockito.times; +import static org.mockito.Mockito.verify; + +import java.util.HashSet; +import org.junit.jupiter.api.Assertions; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.ArgumentCaptor; +import org.mockito.Captor; +import org.mockito.Mock; +import org.mockito.Mockito; +import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.action.ActionListener; +import org.opensearch.action.support.ActionFilters; +import org.opensearch.sql.datasources.model.transport.DeleteDataSourceActionRequest; +import org.opensearch.sql.datasources.model.transport.DeleteDataSourceActionResponse; +import org.opensearch.sql.datasources.service.DataSourceServiceImpl; +import org.opensearch.tasks.Task; +import org.opensearch.transport.TransportService; + +@ExtendWith(MockitoExtension.class) +public class TransportDeleteDataSourceActionTest { + + @Mock + private TransportService transportService; + @Mock + private TransportDeleteDataSourceAction action; + @Mock + private DataSourceServiceImpl dataSourceService; + @Mock + private Task task; + @Mock + private ActionListener actionListener; + + @Captor + private ArgumentCaptor + deleteDataSourceActionResponseArgumentCaptor; + @Captor + private ArgumentCaptor exceptionArgumentCaptor; + + + @BeforeEach + public void setUp() { + action = new TransportDeleteDataSourceAction(transportService, + new ActionFilters(new HashSet<>()), dataSourceService); + } + + @Test + public void testDoExecute() { + DeleteDataSourceActionRequest request = new DeleteDataSourceActionRequest("test_datasource"); + + action.doExecute(task, request, actionListener); + verify(dataSourceService, 
times(1)).deleteDataSource("test_datasource"); + Mockito.verify(actionListener) + .onResponse(deleteDataSourceActionResponseArgumentCaptor.capture()); + DeleteDataSourceActionResponse deleteDataSourceActionResponse + = deleteDataSourceActionResponseArgumentCaptor.getValue(); + Assertions.assertEquals("Deleted DataSource with name test_datasource", + deleteDataSourceActionResponse.getResult()); + } + + @Test + public void testDoExecuteWithException() { + doThrow(new RuntimeException("Error")).when(dataSourceService).deleteDataSource("testDS"); + DeleteDataSourceActionRequest request = new DeleteDataSourceActionRequest("testDS"); + action.doExecute(task, request, actionListener); + verify(dataSourceService, times(1)).deleteDataSource("testDS"); + Mockito.verify(actionListener).onFailure(exceptionArgumentCaptor.capture()); + Exception exception = exceptionArgumentCaptor.getValue(); + Assertions.assertTrue(exception instanceof RuntimeException); + Assertions.assertEquals("Error", + exception.getMessage()); + } +} \ No newline at end of file diff --git a/datasources/src/test/java/org/opensearch/sql/datasources/transport/TransportGetDataSourceActionTest.java b/datasources/src/test/java/org/opensearch/sql/datasources/transport/TransportGetDataSourceActionTest.java new file mode 100644 index 00000000000..d5506c0a458 --- /dev/null +++ b/datasources/src/test/java/org/opensearch/sql/datasources/transport/TransportGetDataSourceActionTest.java @@ -0,0 +1,137 @@ +package org.opensearch.sql.datasources.transport; + +import static org.mockito.Mockito.doThrow; +import static org.mockito.Mockito.times; +import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.when; + +import com.google.gson.Gson; +import com.google.gson.reflect.TypeToken; +import java.lang.reflect.Type; +import java.util.Collections; +import java.util.HashSet; +import java.util.Set; +import org.junit.jupiter.api.Assertions; +import org.junit.jupiter.api.BeforeEach; +import 
org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.ArgumentCaptor; +import org.mockito.Captor; +import org.mockito.Mock; +import org.mockito.Mockito; +import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.action.ActionListener; +import org.opensearch.action.support.ActionFilters; +import org.opensearch.sql.datasource.model.DataSourceMetadata; +import org.opensearch.sql.datasource.model.DataSourceType; +import org.opensearch.sql.datasources.model.transport.GetDataSourceActionRequest; +import org.opensearch.sql.datasources.model.transport.GetDataSourceActionResponse; +import org.opensearch.sql.datasources.service.DataSourceServiceImpl; +import org.opensearch.sql.protocol.response.format.JsonResponseFormatter; +import org.opensearch.tasks.Task; +import org.opensearch.transport.TransportService; + +@ExtendWith(MockitoExtension.class) +public class TransportGetDataSourceActionTest { + + @Mock + private TransportService transportService; + @Mock + private TransportGetDataSourceAction action; + @Mock + private DataSourceServiceImpl dataSourceService; + @Mock + private Task task; + @Mock + private ActionListener actionListener; + + @Captor + private ArgumentCaptor getDataSourceActionResponseArgumentCaptor; + + @Captor + private ArgumentCaptor exceptionArgumentCaptor; + + @BeforeEach + public void setUp() { + action = new TransportGetDataSourceAction(transportService, + new ActionFilters(new HashSet<>()), dataSourceService); + } + + @Test + public void testDoExecute() { + DataSourceMetadata dataSourceMetadata = new DataSourceMetadata(); + dataSourceMetadata.setName("test_datasource"); + dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS); + GetDataSourceActionRequest request = new GetDataSourceActionRequest("test_datasource"); + when(dataSourceService.getDataSourceMetadata("test_datasource")) + .thenReturn(dataSourceMetadata); + + action.doExecute(task, request, actionListener); + 
verify(dataSourceService, times(1)).getDataSourceMetadata("test_datasource"); + Mockito.verify(actionListener).onResponse(getDataSourceActionResponseArgumentCaptor.capture()); + GetDataSourceActionResponse getDataSourceActionResponse + = getDataSourceActionResponseArgumentCaptor.getValue(); + JsonResponseFormatter dataSourceMetadataJsonResponseFormatter = + new JsonResponseFormatter<>( + JsonResponseFormatter.Style.PRETTY) { + @Override + protected Object buildJsonObject(DataSourceMetadata response) { + return response; + } + }; + Assertions.assertEquals(dataSourceMetadataJsonResponseFormatter.format(dataSourceMetadata), + getDataSourceActionResponse.getResult()); + DataSourceMetadata result = + new Gson().fromJson(getDataSourceActionResponse.getResult(), DataSourceMetadata.class); + Assertions.assertEquals("test_datasource", result.getName()); + Assertions.assertEquals(DataSourceType.PROMETHEUS, result.getConnector()); + } + + @Test + public void testDoExecuteForGetAllDataSources() { + DataSourceMetadata dataSourceMetadata = new DataSourceMetadata(); + dataSourceMetadata.setName("test_datasource"); + dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS); + + GetDataSourceActionRequest request = new GetDataSourceActionRequest(); + when(dataSourceService.getDataSourceMetadata(false)) + .thenReturn(Collections.singleton(dataSourceMetadata)); + + action.doExecute(task, request, actionListener); + verify(dataSourceService, times(1)).getDataSourceMetadata(false); + Mockito.verify(actionListener).onResponse(getDataSourceActionResponseArgumentCaptor.capture()); + GetDataSourceActionResponse getDataSourceActionResponse + = getDataSourceActionResponseArgumentCaptor.getValue(); + JsonResponseFormatter> dataSourceMetadataJsonResponseFormatter = + new JsonResponseFormatter<>( + JsonResponseFormatter.Style.PRETTY) { + @Override + protected Object buildJsonObject(Set response) { + return response; + } + }; + Type setType = new TypeToken>() { + }.getType(); + 
Assertions.assertEquals( + dataSourceMetadataJsonResponseFormatter.format(Collections.singleton(dataSourceMetadata)), + getDataSourceActionResponse.getResult()); + Set result = + new Gson().fromJson(getDataSourceActionResponse.getResult(), setType); + DataSourceMetadata resultDataSource = result.iterator().next(); + Assertions.assertEquals("test_datasource", resultDataSource.getName()); + Assertions.assertEquals(DataSourceType.PROMETHEUS, resultDataSource.getConnector()); + } + + @Test + public void testDoExecuteWithException() { + doThrow(new RuntimeException("Error")).when(dataSourceService).getDataSourceMetadata("testDS"); + GetDataSourceActionRequest request = new GetDataSourceActionRequest("testDS"); + action.doExecute(task, request, actionListener); + verify(dataSourceService, times(1)).getDataSourceMetadata("testDS"); + Mockito.verify(actionListener).onFailure(exceptionArgumentCaptor.capture()); + Exception exception = exceptionArgumentCaptor.getValue(); + Assertions.assertTrue(exception instanceof RuntimeException); + Assertions.assertEquals("Error", + exception.getMessage()); + } +} \ No newline at end of file diff --git a/datasources/src/test/java/org/opensearch/sql/datasources/transport/TransportUpdateDataSourceActionTest.java b/datasources/src/test/java/org/opensearch/sql/datasources/transport/TransportUpdateDataSourceActionTest.java new file mode 100644 index 00000000000..fecab012d26 --- /dev/null +++ b/datasources/src/test/java/org/opensearch/sql/datasources/transport/TransportUpdateDataSourceActionTest.java @@ -0,0 +1,87 @@ +package org.opensearch.sql.datasources.transport; + +import static org.mockito.Mockito.doThrow; +import static org.mockito.Mockito.times; +import static org.mockito.Mockito.verify; + +import java.util.HashSet; +import org.junit.jupiter.api.Assertions; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.ArgumentCaptor; +import 
org.mockito.Captor; +import org.mockito.Mock; +import org.mockito.Mockito; +import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.action.ActionListener; +import org.opensearch.action.support.ActionFilters; +import org.opensearch.sql.datasource.model.DataSourceMetadata; +import org.opensearch.sql.datasource.model.DataSourceType; +import org.opensearch.sql.datasources.model.transport.UpdateDataSourceActionRequest; +import org.opensearch.sql.datasources.model.transport.UpdateDataSourceActionResponse; +import org.opensearch.sql.datasources.service.DataSourceServiceImpl; +import org.opensearch.tasks.Task; +import org.opensearch.transport.TransportService; + +@ExtendWith(MockitoExtension.class) +public class TransportUpdateDataSourceActionTest { + + @Mock + private TransportService transportService; + @Mock + private TransportUpdateDataSourceAction action; + @Mock + private DataSourceServiceImpl dataSourceService; + @Mock + private Task task; + @Mock + private ActionListener actionListener; + + @Captor + private ArgumentCaptor + updateDataSourceActionResponseArgumentCaptor; + + @Captor + private ArgumentCaptor exceptionArgumentCaptor; + + @BeforeEach + public void setUp() { + action = new TransportUpdateDataSourceAction(transportService, + new ActionFilters(new HashSet<>()), dataSourceService); + } + + @Test + public void testDoExecute() { + DataSourceMetadata dataSourceMetadata = new DataSourceMetadata(); + dataSourceMetadata.setName("test_datasource"); + dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS); + UpdateDataSourceActionRequest request = new UpdateDataSourceActionRequest(dataSourceMetadata); + + action.doExecute(task, request, actionListener); + verify(dataSourceService, times(1)).updateDataSource(dataSourceMetadata); + Mockito.verify(actionListener) + .onResponse(updateDataSourceActionResponseArgumentCaptor.capture()); + UpdateDataSourceActionResponse updateDataSourceActionResponse + = 
updateDataSourceActionResponseArgumentCaptor.getValue(); + Assertions.assertEquals("Updated DataSource with name test_datasource", + updateDataSourceActionResponse.getResult()); + } + + @Test + public void testDoExecuteWithException() { + DataSourceMetadata dataSourceMetadata = new DataSourceMetadata(); + dataSourceMetadata.setName("test_datasource"); + dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS); + doThrow(new RuntimeException("Error")).when(dataSourceService) + .updateDataSource(dataSourceMetadata); + UpdateDataSourceActionRequest request = new UpdateDataSourceActionRequest(dataSourceMetadata); + action.doExecute(task, request, actionListener); + verify(dataSourceService, times(1)).updateDataSource(dataSourceMetadata); + Mockito.verify(actionListener).onFailure(exceptionArgumentCaptor.capture()); + Exception exception = exceptionArgumentCaptor.getValue(); + Assertions.assertTrue(exception instanceof RuntimeException); + Assertions.assertEquals("Error", + exception.getMessage()); + } +} \ No newline at end of file diff --git a/plugin/src/test/java/org/opensearch/sql/plugin/utils/SchedulerTest.java b/datasources/src/test/java/org/opensearch/sql/datasources/utils/SchedulerTest.java similarity index 56% rename from plugin/src/test/java/org/opensearch/sql/plugin/utils/SchedulerTest.java rename to datasources/src/test/java/org/opensearch/sql/datasources/utils/SchedulerTest.java index c86f3341b1e..d091e77044b 100644 --- a/plugin/src/test/java/org/opensearch/sql/plugin/utils/SchedulerTest.java +++ b/datasources/src/test/java/org/opensearch/sql/datasources/utils/SchedulerTest.java @@ -3,22 +3,20 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.sql.plugin.utils; - -import static org.junit.Assert.assertTrue; -import static org.mockito.ArgumentMatchers.any; -import static org.mockito.Mockito.doAnswer; -import static org.mockito.Mockito.when; +package org.opensearch.sql.datasources.utils; import java.util.concurrent.atomic.AtomicBoolean; 
-import org.junit.Test;
-import org.junit.runner.RunWith;
+import org.junit.Assert;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.mockito.ArgumentMatchers;
 import org.mockito.Mock;
-import org.mockito.junit.MockitoJUnitRunner;
+import org.mockito.Mockito;
+import org.mockito.junit.jupiter.MockitoExtension;
 import org.opensearch.client.node.NodeClient;
 import org.opensearch.threadpool.ThreadPool;
 
-@RunWith(MockitoJUnitRunner.class)
+@ExtendWith(MockitoExtension.class)
 public class SchedulerTest {
 
   @Mock
@@ -29,19 +27,19 @@ public class SchedulerTest {
 
   @Test
   public void testSchedule() {
-    when(nodeClient.threadPool()).thenReturn(threadPool);
+    Mockito.when(nodeClient.threadPool()).thenReturn(threadPool);
 
-    doAnswer(
+    Mockito.doAnswer(
         invocation -> {
           Runnable task = invocation.getArgument(0);
           task.run();
           return null;
         })
         .when(threadPool)
-        .schedule(any(), any(), any());
+        .schedule(ArgumentMatchers.any(), ArgumentMatchers.any(), ArgumentMatchers.any());
 
     AtomicBoolean isRun = new AtomicBoolean(false);
 
     Scheduler.schedule(nodeClient, () -> isRun.set(true));
-    assertTrue(isRun.get());
+    Assert.assertTrue(isRun.get());
   }
 }
\ No newline at end of file
diff --git a/datasources/src/test/java/org/opensearch/sql/datasources/utils/XContentParserUtilsTest.java b/datasources/src/test/java/org/opensearch/sql/datasources/utils/XContentParserUtilsTest.java
new file mode 100644
index 00000000000..605d641bda6
--- /dev/null
+++ b/datasources/src/test/java/org/opensearch/sql/datasources/utils/XContentParserUtilsTest.java
@@ -0,0 +1,101 @@
+package org.opensearch.sql.datasources.utils;
+
+import static org.junit.jupiter.api.Assertions.assertThrows;
+
+import com.google.gson.Gson;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import lombok.SneakyThrows;
+import org.junit.jupiter.api.Assertions;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.mockito.junit.jupiter.MockitoExtension;
+import org.opensearch.common.bytes.BytesReference;
+import org.opensearch.core.xcontent.XContentBuilder;
+import org.opensearch.sql.datasource.model.DataSourceMetadata;
+import org.opensearch.sql.datasource.model.DataSourceType;
+
+@ExtendWith(MockitoExtension.class)
+public class XContentParserUtilsTest {
+
+  @SneakyThrows
+  @Test
+  public void testConvertToXContent() {
+    DataSourceMetadata dataSourceMetadata = new DataSourceMetadata();
+    dataSourceMetadata.setName("testDS");
+    dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS);
+    dataSourceMetadata.setAllowedRoles(List.of("prometheus_access"));
+    dataSourceMetadata.setProperties(Map.of("prometheus.uri", "https://localhost:9090"));
+
+    XContentBuilder contentBuilder = XContentParserUtils.convertToXContent(dataSourceMetadata);
+    String contentString = BytesReference.bytes(contentBuilder).utf8ToString();
+    Assertions.assertEquals("{\"name\":\"testDS\",\"connector\":\"PROMETHEUS\",\"allowedRoles\":[\"prometheus_access\"],\"properties\":{\"prometheus.uri\":\"https://localhost:9090\"}}",
+        contentString);
+  }
+
+  @SneakyThrows
+  @Test
+  public void testToDataSourceMetadataFromJson() {
+    DataSourceMetadata dataSourceMetadata = new DataSourceMetadata();
+    dataSourceMetadata.setName("testDS");
+    dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS);
+    dataSourceMetadata.setAllowedRoles(List.of("prometheus_access"));
+    dataSourceMetadata.setProperties(Map.of("prometheus.uri", "https://localhost:9090"));
+    Gson gson = new Gson();
+    String json = gson.toJson(dataSourceMetadata);
+
+    DataSourceMetadata retrievedMetadata = XContentParserUtils.toDataSourceMetadata(json);
+
+    Assertions.assertEquals(retrievedMetadata, dataSourceMetadata);
+    Assertions.assertEquals("prometheus_access", retrievedMetadata.getAllowedRoles().get(0));
+
+  }
+
+  @SneakyThrows
+  @Test
+  public void testToDataSourceMetadataFromJsonWithoutName() {
+    DataSourceMetadata dataSourceMetadata = new DataSourceMetadata();
+    dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS);
+    dataSourceMetadata.setAllowedRoles(List.of("prometheus_access"));
+    dataSourceMetadata.setProperties(Map.of("prometheus.uri", "https://localhost:9090"));
+    Gson gson = new Gson();
+    String json = gson.toJson(dataSourceMetadata);
+
+    IllegalArgumentException exception = assertThrows(IllegalArgumentException.class, () -> {
+      XContentParserUtils.toDataSourceMetadata(json);
+    });
+    Assertions.assertEquals("name and connector are required fields.", exception.getMessage());
+  }
+
+  @SneakyThrows
+  @Test
+  public void testToDataSourceMetadataFromJsonWithoutConnector() {
+    DataSourceMetadata dataSourceMetadata = new DataSourceMetadata();
+    dataSourceMetadata.setName("name");
+    dataSourceMetadata.setAllowedRoles(List.of("prometheus_access"));
+    dataSourceMetadata.setProperties(Map.of("prometheus.uri", "https://localhost:9090"));
+    Gson gson = new Gson();
+    String json = gson.toJson(dataSourceMetadata);
+
+    IllegalArgumentException exception = assertThrows(IllegalArgumentException.class, () -> {
+      XContentParserUtils.toDataSourceMetadata(json);
+    });
+    Assertions.assertEquals("name and connector are required fields.", exception.getMessage());
+  }
+
+  @SneakyThrows
+  @Test
+  public void testToDataSourceMetadataFromJsonUsingUnknownObject() {
+    HashMap<String, String> hashMap = new HashMap<>();
+    hashMap.put("test", "test");
+    Gson gson = new Gson();
+    String json = gson.toJson(hashMap);
+
+    IllegalArgumentException exception = assertThrows(IllegalArgumentException.class, () -> {
+      XContentParserUtils.toDataSourceMetadata(json);
+    });
+    Assertions.assertEquals("Unknown field: test", exception.getMessage());
+  }
+
+}
diff --git a/docs/user/beyond/partiql.rst b/docs/user/beyond/partiql.rst
index 6ad93ddeafc..76fec8405de 100644
--- a/docs/user/beyond/partiql.rst
+++ b/docs/user/beyond/partiql.rst
@@ -202,11 +202,11 @@ Selecting top level for object fields, object fields of array value and nested f
 
     os> SELECT city, accounts, projects FROM people;
     fetched rows / total rows = 1/1
-    +-----------------------------------------------------+-----------------------+----------------------------------------------------------------------------------------------------------------+
-    | city                                                | accounts              | projects                                                                                                       |
-    |-----------------------------------------------------+-----------------------+----------------------------------------------------------------------------------------------------------------|
-    | {'name': 'Seattle', 'location': {'latitude': 10.5}} | [{'id': 1},{'id': 2}] | [{'name': 'AWS Redshift Spectrum querying'},{'name': 'AWS Redshift security'},{'name': 'AWS Aurora security'}] |
-    +-----------------------------------------------------+-----------------------+----------------------------------------------------------------------------------------------------------------+
+    +-----------------------------------------------------+------------+----------------------------------------------------------------------------------------------------------------+
+    | city                                                | accounts   | projects                                                                                                       |
+    |-----------------------------------------------------+------------+----------------------------------------------------------------------------------------------------------------|
+    | {'name': 'Seattle', 'location': {'latitude': 10.5}} | {'id': 1}  | [{'name': 'AWS Redshift Spectrum querying'},{'name': 'AWS Redshift security'},{'name': 'AWS Aurora security'}] |
+    +-----------------------------------------------------+------------+----------------------------------------------------------------------------------------------------------------+
 
 Example 2: Selecting Deeper Levels
 ----------------------------------
diff --git a/docs/user/dql/basics.rst b/docs/user/dql/basics.rst
index 9762f239880..b7e8cf35a4d 100644
--- a/docs/user/dql/basics.rst
+++ b/docs/user/dql/basics.rst
@@ -155,6 +155,46 @@ Result set:
     | Nanette| Bates|
     +---------+--------+
 
+One can also provide meta-field name(s)
to retrieve reserved fields (beginning with underscore) from OpenSearch documents. Meta-fields are not output
+from wildcard calls (`SELECT *`) and must be explicitly included to be returned.
+
+SQL query::
+
+    POST /_plugins/_sql
+    {
+      "query" : "SELECT firstname, lastname, _id, _index, _sort FROM accounts"
+    }
+
+Explain::
+
+    {
+      "from" : 0,
+      "size" : 200,
+      "_source" : {
+        "includes" : [
+          "firstname",
+          "_id",
+          "_index",
+          "_sort",
+          "lastname"
+        ],
+        "excludes" : [ ]
+      }
+    }
+
+
+For example, this query produces::
+
+    os> SELECT firstname, lastname, _index, _sort FROM accounts;
+    fetched rows / total rows = 4/4
+    +-------------+------------+----------+---------+
+    | firstname   | lastname   | _index   | _sort   |
+    |-------------+------------+----------+---------|
+    | Amber       | Duke       | accounts | -2      |
+    | Hattie      | Bond       | accounts | -2      |
+    | Nanette     | Bates      | accounts | -2      |
+    | Dale        | Adams      | accounts | -2      |
+    +-------------+------------+----------+---------+
 
 Example 3: Using Field Alias
 ----------------------------
diff --git a/docs/user/dql/functions.rst b/docs/user/dql/functions.rst
index 6283bc70e1c..d117226c4d4 100644
--- a/docs/user/dql/functions.rst
+++ b/docs/user/dql/functions.rst
@@ -4221,6 +4221,48 @@ Another example to show how to set custom values for the optional parameters::
     | 1    | The House at Pooh Corner | Alan Alexander Milne |
     +------+--------------------------+----------------------+
 
+SCORE
+------------
+
+Description
+>>>>>>>>>>>
+
+``score(relevance_expression[, boost])``
+``score_query(relevance_expression[, boost])``
+``scorequery(relevance_expression[, boost])``
+
+The `SCORE()` function calculates the `_score` of any document matching the enclosed relevance-based expression.
+It takes one required argument, the relevance-based search expression, and an optional second argument, a
+floating-point boost applied to the score (the default value is 1.0).
+
+The `SCORE()` function sets `track_scores=true` for OpenSearch requests. Without it, `_score` fields may return `null` for some
+relevance-based search expressions.
+
+Please refer to the examples below:
+
+| ``score(query('Tags:taste OR Body:taste', ...), 2.0)``
+
+The `score_query` and `scorequery` functions are alternative names for the `score` function.
+
+Example boosting score::
+
+    os> select *, _score from books where score(query('title:Pooh House', default_operator='AND'), 2.0);
+    fetched rows / total rows = 1/1
+    +------+--------------------------+----------------------+-----------+
+    | id   | title                    | author               | _score    |
+    |------+--------------------------+----------------------+-----------|
+    | 1    | The House at Pooh Corner | Alan Alexander Milne | 1.5884793 |
+    +------+--------------------------+----------------------+-----------+
+
+    os> select *, _score from books where score(query('title:Pooh House', default_operator='AND'), 5.0) OR score(query('title:Winnie', default_operator='AND'), 1.5);
+    fetched rows / total rows = 2/2
+    +------+--------------------------+----------------------+-----------+
+    | id   | title                    | author               | _score    |
+    |------+--------------------------+----------------------+-----------|
+    | 1    | The House at Pooh Corner | Alan Alexander Milne | 3.9711983 |
+    | 2    | Winnie-the-Pooh          | Alan Alexander Milne | 1.1581701 |
+    +------+--------------------------+----------------------+-----------+
+
 HIGHLIGHT
 ------------
@@ -4299,6 +4341,29 @@ Another example to show how to set custom values for the optional parameters::
     | tEsT wIlDcArD sensitive cases             |
     +-------------------------------------------+
 
+NESTED
+------
+
+Description
+>>>>>>>>>>>
+
+``nested(field | [field, path])``
+
+The ``nested`` function maps to the ``nested`` query used in the search engine. It returns nested field types in documents that match the specified field(s).
+If the user does not provide the ``path`` parameter it will be generated dynamically. For example, the ``field`` ``user.office.cubicle`` would dynamically generate the path
+``user.office``.
+
+Example with ``field`` and ``path`` parameters::
+
+    os> SELECT nested(message.info, message) FROM nested;
+    fetched rows / total rows = 2/2
+    +---------------------------------+
+    | nested(message.info, message)   |
+    |---------------------------------|
+    | a                               |
+    | b                               |
+    +---------------------------------+
+
 System Functions
 ================
diff --git a/docs/user/dql/metadata.rst b/docs/user/dql/metadata.rst
index 22a635cf383..a02bcf096a5 100644
--- a/docs/user/dql/metadata.rst
+++ b/docs/user/dql/metadata.rst
@@ -35,7 +35,7 @@ Example 1: Show All Indices Information
 SQL query::
 
     os> SHOW TABLES LIKE '%'
-    fetched rows / total rows = 8/8
+    fetched rows / total rows = 9/9
     +----------------+---------------+-----------------+--------------+-----------+------------+--------------+-------------+-----------------------------+------------------+
     | TABLE_CAT      | TABLE_SCHEM   | TABLE_NAME      | TABLE_TYPE   | REMARKS   | TYPE_CAT   | TYPE_SCHEM   | TYPE_NAME   | SELF_REFERENCING_COL_NAME   | REF_GENERATION   |
     |----------------+---------------+-----------------+--------------+-----------+------------+--------------+-------------+-----------------------------+------------------|
@@ -44,6 +44,7 @@ SQL query::
     | docTestCluster | null          | accounts        | BASE TABLE   | null      | null       | null         | null        | null                        | null             |
     | docTestCluster | null          | apache          | BASE TABLE   | null      | null       | null         | null        | null                        | null             |
     | docTestCluster | null          | books           | BASE TABLE   | null      | null       | null         | null        | null                        | null             |
+    | docTestCluster | null          | nested          | BASE TABLE   | null      | null       | null         | null        | null                        | null             |
     | docTestCluster | null          | nyc_taxi        | BASE TABLE   | null      | null       | null         | null        | null                        | null             |
     | docTestCluster | null          | people          | BASE TABLE   | null      | null       | null         | null        | null                        | null             |
     | docTestCluster | null          | wildcard        | BASE TABLE   | null      | null       | null         | null        | null                        | null             |
diff --git a/doctest/test_data/nested_objects.json
b/doctest/test_data/nested_objects.json
new file mode 100644
index 00000000000..fc5f56b4c52
--- /dev/null
+++ b/doctest/test_data/nested_objects.json
@@ -0,0 +1,4 @@
+{"index":{"_id":"1"}}
+{"message":{"info":"a","author":"e","dayOfWeek":1},"comment":{"data":"ab","likes":3},"myNum":1,"someField":"b"}
+{"index":{"_id":"2"}}
+{"message":{"info":"b","author":"f","dayOfWeek":2},"comment":{"data":"aa","likes":2},"myNum":2,"someField":"a"}
diff --git a/doctest/test_docs.py b/doctest/test_docs.py
index c517b2756c3..1fedbdf49e8 100644
--- a/doctest/test_docs.py
+++ b/doctest/test_docs.py
@@ -27,6 +27,7 @@
 BOOKS = "books"
 APACHE = "apache"
 WILDCARD = "wildcard"
+NESTED = "nested"
 DATASOURCES = ".ql-datasources"
 
@@ -95,6 +96,7 @@ def set_up_test_indices(test):
     load_file("books.json", index_name=BOOKS)
     load_file("apache.json", index_name=APACHE)
     load_file("wildcard.json", index_name=WILDCARD)
+    load_file("nested_objects.json", index_name=NESTED)
     load_file("datasources.json", index_name=DATASOURCES)
 
@@ -124,7 +126,7 @@ def set_up(test):
 def tear_down(test):
     # drop leftover tables after each test
-    test_data_client.indices.delete(index=[ACCOUNTS, EMPLOYEES, PEOPLE, ACCOUNT2, NYC_TAXI, BOOKS, APACHE, WILDCARD], ignore_unavailable=True)
+    test_data_client.indices.delete(index=[ACCOUNTS, EMPLOYEES, PEOPLE, ACCOUNT2, NYC_TAXI, BOOKS, APACHE, WILDCARD, NESTED], ignore_unavailable=True)
 
 docsuite = partial(doctest.DocFileSuite,
diff --git a/doctest/test_mapping/nested_objects.json b/doctest/test_mapping/nested_objects.json
new file mode 100644
index 00000000000..4f0ed974332
--- /dev/null
+++ b/doctest/test_mapping/nested_objects.json
@@ -0,0 +1,47 @@
+{
+  "mappings": {
+    "properties": {
+      "message": {
+        "type": "nested",
+        "properties": {
+          "info": {
+            "type": "keyword",
+            "index": "true"
+          },
+          "author": {
+            "type": "keyword",
+            "fields": {
+              "keyword": {
+                "type": "keyword",
+                "ignore_above": 256
+              }
+            },
+            "index": "true"
+          },
+          "dayOfWeek": {
+            "type": "long"
+          }
+        }
+      },
+      "comment": {
+        "type": "nested",
+        "properties": {
+          "data": {
+            "type": "keyword",
+            "index": "true"
+          },
+          "likes": {
+            "type": "long"
+          }
+        }
+      },
+      "myNum": {
+        "type": "long"
+      },
+      "someField": {
+        "type": "keyword",
+        "index": "true"
+      }
+    }
+  }
+}
diff --git a/integ-test/src/test/java/org/opensearch/sql/datasource/DataSourceAPIsIT.java b/integ-test/src/test/java/org/opensearch/sql/datasource/DataSourceAPIsIT.java
index e324e976bae..d559207cc1b 100644
--- a/integ-test/src/test/java/org/opensearch/sql/datasource/DataSourceAPIsIT.java
+++ b/integ-test/src/test/java/org/opensearch/sql/datasource/DataSourceAPIsIT.java
@@ -1,49 +1,192 @@
-package org.opensearch.sql.datasource;/*
+/*
  * Copyright OpenSearch Contributors
  * SPDX-License-Identifier: Apache-2.0
  */
+package org.opensearch.sql.datasource;
 
+import static org.opensearch.sql.legacy.TestUtils.getResponseBody;
+
+import com.google.common.collect.ImmutableList;
+import com.google.common.collect.ImmutableMap;
 import com.google.gson.Gson;
-import java.io.IOException;
+import com.google.gson.JsonObject;
+import com.google.gson.reflect.TypeToken;
+import java.lang.reflect.Type;
 import java.util.ArrayList;
-import java.util.HashMap;
-import java.util.Map;
+import java.util.List;
+import lombok.SneakyThrows;
+import org.apache.commons.lang3.StringUtils;
 import org.junit.Assert;
 import org.junit.Test;
 import org.opensearch.client.Request;
 import org.opensearch.client.RequestOptions;
+import org.opensearch.client.Response;
+import org.opensearch.client.ResponseException;
 import org.opensearch.sql.datasource.model.DataSourceMetadata;
 import org.opensearch.sql.datasource.model.DataSourceType;
 import org.opensearch.sql.ppl.PPLIntegTestCase;
 
 public class DataSourceAPIsIT extends PPLIntegTestCase {
 
+  @SneakyThrows
+  @Test
+  public void createDataSourceAPITest() {
+    //create datasource
+    DataSourceMetadata createDSM =
+        new DataSourceMetadata("create_prometheus", DataSourceType.PROMETHEUS,
+            ImmutableList.of(),
ImmutableMap.of("prometheus.uri", "https://localhost:9090")); + Request createRequest = getCreateDataSourceRequest(createDSM); + Response response = client().performRequest(createRequest); + Assert.assertEquals(201, response.getStatusLine().getStatusCode()); + String createResponseString = getResponseBody(response); + Assert.assertEquals("Created DataSource with name create_prometheus", createResponseString); + //Datasource is not immediately created. so introducing a sleep of 2s. + Thread.sleep(2000); + + //get datasource to validate the creation. + Request getRequest = getFetchDataSourceRequest("create_prometheus"); + Response getResponse = client().performRequest(getRequest); + Assert.assertEquals(200, getResponse.getStatusLine().getStatusCode()); + String getResponseString = getResponseBody(getResponse); + DataSourceMetadata dataSourceMetadata = + new Gson().fromJson(getResponseString, DataSourceMetadata.class); + Assert.assertEquals("https://localhost:9090", + dataSourceMetadata.getProperties().get("prometheus.uri")); + } + + + @SneakyThrows + @Test + public void updateDataSourceAPITest() { + //create datasource + DataSourceMetadata createDSM = + new DataSourceMetadata("update_prometheus", DataSourceType.PROMETHEUS, + ImmutableList.of(), ImmutableMap.of("prometheus.uri", "https://localhost:9090")); + Request createRequest = getCreateDataSourceRequest(createDSM); + client().performRequest(createRequest); + //Datasource is not immediately created. so introducing a sleep of 2s. 
+ Thread.sleep(2000); + + //update datasource + DataSourceMetadata updateDSM = + new DataSourceMetadata("update_prometheus", DataSourceType.PROMETHEUS, + ImmutableList.of(), ImmutableMap.of("prometheus.uri", "https://randomtest:9090")); + Request updateRequest = getUpdateDataSourceRequest(updateDSM); + Response updateResponse = client().performRequest(updateRequest); + Assert.assertEquals(200, updateResponse.getStatusLine().getStatusCode()); + String updateResponseString = getResponseBody(updateResponse); + Assert.assertEquals("Updated DataSource with name update_prometheus", updateResponseString); + + //Datasource is not updated immediately, so introduce a sleep of 2s. + Thread.sleep(2000); + + //get datasource to validate the modification. + Request getRequest = getFetchDataSourceRequest("update_prometheus"); + Response getResponse = client().performRequest(getRequest); + Assert.assertEquals(200, getResponse.getStatusLine().getStatusCode()); + String getResponseString = getResponseBody(getResponse); + DataSourceMetadata dataSourceMetadata = + new Gson().fromJson(getResponseString, DataSourceMetadata.class); + Assert.assertEquals("https://randomtest:9090", + dataSourceMetadata.getProperties().get("prometheus.uri")); + } + + + @SneakyThrows + @Test + public void deleteDataSourceTest() { + + //create datasource for deletion + DataSourceMetadata createDSM = + new DataSourceMetadata("delete_prometheus", DataSourceType.PROMETHEUS, + ImmutableList.of(), ImmutableMap.of("prometheus.uri", "https://localhost:9090")); + Request createRequest = getCreateDataSourceRequest(createDSM); + client().performRequest(createRequest); + //Datasource is not created immediately, so introduce a sleep of 2s.
+ Thread.sleep(2000); + + //delete datasource + Request deleteRequest = getDeleteDataSourceRequest("delete_prometheus"); + Response deleteResponse = client().performRequest(deleteRequest); + Assert.assertEquals(204, deleteResponse.getStatusLine().getStatusCode()); + + //Datasource is not deleted immediately, so introduce a sleep of 2s. + Thread.sleep(2000); + + //get datasources to verify the deletion + final Request prometheusGetRequest = getFetchDataSourceRequest("delete_prometheus"); + ResponseException prometheusGetResponseException + = Assert.assertThrows(ResponseException.class, () -> client().performRequest(prometheusGetRequest)); + Assert.assertEquals(400, prometheusGetResponseException.getResponse().getStatusLine().getStatusCode()); + String prometheusGetResponseString = getResponseBody(prometheusGetResponseException.getResponse()); + JsonObject errorMessage = new Gson().fromJson(prometheusGetResponseString, JsonObject.class); + Assert.assertEquals("DataSource with name: delete_prometheus doesn't exist.", + errorMessage.get("error").getAsJsonObject().get("details").getAsString()); + + } + + @SneakyThrows @Test - public void createDataSourceTest() throws IOException { - Request request = getCreateDataSourceRequest(getDataSourceMetadataJsonString()); - String response = executeRequest(request); - Assert.assertEquals("Created DataSource with name prometheus1", response); + public void getAllDataSourceTest() { + //create datasource + DataSourceMetadata createDSM = + new DataSourceMetadata("get_all_prometheus", DataSourceType.PROMETHEUS, + ImmutableList.of(), ImmutableMap.of("prometheus.uri", "https://localhost:9090")); + Request createRequest = getCreateDataSourceRequest(createDSM); + client().performRequest(createRequest); + //Datasource is not created immediately, so introduce a sleep of 2s.
+ Thread.sleep(2000); + + Request getRequest = getFetchDataSourceRequest(null); + Response getResponse = client().performRequest(getRequest); + Assert.assertEquals(200, getResponse.getStatusLine().getStatusCode()); + String getResponseString = getResponseBody(getResponse); + Type listType = new TypeToken<List<DataSourceMetadata>>() {}.getType(); + List<DataSourceMetadata> dataSourceMetadataList = + new Gson().fromJson(getResponseString, listType); + Assert.assertTrue( + dataSourceMetadataList.stream().anyMatch(ds -> ds.getName().equals("get_all_prometheus"))); } - private Request getCreateDataSourceRequest(String dataSourceMetadataJson) { + + private Request getCreateDataSourceRequest(DataSourceMetadata dataSourceMetadata) { Request request = new Request("POST", "/_plugins/_query/_datasources"); - request.setJsonEntity(dataSourceMetadataJson); + request.setJsonEntity(new Gson().toJson(dataSourceMetadata)); + RequestOptions.Builder restOptionsBuilder = RequestOptions.DEFAULT.toBuilder(); + restOptionsBuilder.addHeader("Content-Type", "application/json"); + request.setOptions(restOptionsBuilder); + return request; + } + + private Request getUpdateDataSourceRequest(DataSourceMetadata dataSourceMetadata) { + Request request = new Request("PUT", "/_plugins/_query/_datasources"); + request.setJsonEntity(new Gson().toJson(dataSourceMetadata)); + RequestOptions.Builder restOptionsBuilder = RequestOptions.DEFAULT.toBuilder(); + restOptionsBuilder.addHeader("Content-Type", "application/json"); + request.setOptions(restOptionsBuilder); + return request; + } + + private Request getFetchDataSourceRequest(String name) { + Request request = new Request("GET", "/_plugins/_query/_datasources" + "/" + name); + if (StringUtils.isEmpty(name)) { + request = new Request("GET", "/_plugins/_query/_datasources"); + } RequestOptions.Builder restOptionsBuilder = RequestOptions.DEFAULT.toBuilder(); restOptionsBuilder.addHeader("Content-Type", "application/json"); request.setOptions(restOptionsBuilder); return request; } - private String
getDataSourceMetadataJsonString() { - DataSourceMetadata dataSourceMetadata = new DataSourceMetadata(); - dataSourceMetadata.setName("prometheus1"); - dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS); - dataSourceMetadata.setAllowedRoles(new ArrayList<>()); - Map<String, String> propertiesMap = new HashMap<>(); - propertiesMap.put("prometheus.uri", "http://localhost:9200"); - dataSourceMetadata.setProperties(propertiesMap); - return new Gson().toJson(dataSourceMetadata); + + private Request getDeleteDataSourceRequest(String name) { + Request request = new Request("DELETE", "/_plugins/_query/_datasources" + "/" + name); + RequestOptions.Builder restOptionsBuilder = RequestOptions.DEFAULT.toBuilder(); + restOptionsBuilder.addHeader("Content-Type", "application/json"); + request.setOptions(restOptionsBuilder); + return request; } } diff --git a/integ-test/src/test/java/org/opensearch/sql/legacy/CsvFormatResponseIT.java b/integ-test/src/test/java/org/opensearch/sql/legacy/CsvFormatResponseIT.java index d562794409c..aa3bf67f58d 100644 --- a/integ-test/src/test/java/org/opensearch/sql/legacy/CsvFormatResponseIT.java +++ b/integ-test/src/test/java/org/opensearch/sql/legacy/CsvFormatResponseIT.java @@ -99,7 +99,6 @@ public void specificPercentilesIntAndDouble() throws IOException { } } - @Ignore("only work for legacy engine") public void nestedObjectsAndArraysAreQuoted() throws IOException { final String query = String.format(Locale.ROOT, "SELECT * FROM %s WHERE _id = 5", TEST_INDEX_NESTED_TYPE); @@ -114,7 +113,6 @@ public void nestedObjectsAndArraysAreQuoted() throws IOException { Assert.assertThat(result, containsString(expectedMessage)); } - @Ignore("only work for legacy engine") public void arraysAreQuotedInFlatMode() throws IOException { setFlatOption(true); @@ -521,7 +519,7 @@ private void assertEquals(String expected, String actual, Double delta) { @Test public void includeScore() throws Exception { String query = String.format(Locale.ROOT, - "select age , firstname from
%s where age > 31 order by _score desc limit 2 ", + "select age, firstname, _score from %s where age > 31 order by _score desc limit 2 ", TEST_INDEX_ACCOUNT); CSVResult csvResult = executeCsvRequest(query, false, true, false); List<String> headers = csvResult.getHeaders(); @@ -575,10 +573,10 @@ public void twoCharsSeperator() throws Exception { } - @Ignore("only work for legacy engine") + @Ignore("tested in @see: org.opensearch.sql.sql.IdentifierIT.testMetafieldIdentifierTest") public void includeIdAndNotTypeOrScore() throws Exception { String query = String.format(Locale.ROOT, - "select age , firstname from %s where lastname = 'Marquez' ", TEST_INDEX_ACCOUNT); + "select age, firstname, _id from %s where lastname = 'Marquez' ", TEST_INDEX_ACCOUNT); CSVResult csvResult = executeCsvRequest(query, false, false, true); List<String> headers = csvResult.getHeaders(); Assert.assertEquals(3, headers.size()); diff --git a/integ-test/src/test/java/org/opensearch/sql/legacy/MethodQueryIT.java b/integ-test/src/test/java/org/opensearch/sql/legacy/MethodQueryIT.java index 680058d8449..027228a92b6 100644 --- a/integ-test/src/test/java/org/opensearch/sql/legacy/MethodQueryIT.java +++ b/integ-test/src/test/java/org/opensearch/sql/legacy/MethodQueryIT.java @@ -12,6 +12,7 @@ import java.io.IOException; import java.util.Locale; import org.junit.Assert; +import org.junit.Ignore; import org.junit.Test; /** @@ -111,13 +112,13 @@ public void matchQueryTest() throws IOException { * * @throws IOException */ - // todo @Test + @Ignore("score query no longer maps to constant_score in the V2 engine - @see org.opensearch.sql.sql.ScoreQueryIT") public void scoreQueryTest() throws IOException { final String result = explainQuery(String.format(Locale.ROOT, "select address from %s " + "where score(matchQuery(address, 'Lane'),100) " + - "or score(matchQuery(address,'Street'),0.5) order by _score desc limit 3", + "or score(matchQuery(address,'Street'),0.5) order by _score desc limit 3",
TestsConstants.TEST_INDEX_ACCOUNT)); Assert.assertThat(result, both(containsString("{\"constant_score\":" + @@ -176,6 +177,7 @@ public void wildcardQueryTest() throws IOException { * @throws IOException */ @Test + @Ignore("score query no longer handled by legacy engine - @see org.opensearch.sql.sql.ScoreQueryIT") public void matchPhraseQueryTest() throws IOException { final String result = explainQuery(String.format(Locale.ROOT, "select address from %s " + diff --git a/integ-test/src/test/java/org/opensearch/sql/legacy/ObjectFieldSelectIT.java b/integ-test/src/test/java/org/opensearch/sql/legacy/ObjectFieldSelectIT.java index bddaa227728..b1db21a2ff5 100644 --- a/integ-test/src/test/java/org/opensearch/sql/legacy/ObjectFieldSelectIT.java +++ b/integ-test/src/test/java/org/opensearch/sql/legacy/ObjectFieldSelectIT.java @@ -70,8 +70,7 @@ public void testSelectObjectInnerFields() { public void testSelectNestedFieldItself() { JSONObject response = new JSONObject(query("SELECT projects FROM %s")); - // Nested field is absent in OpenSearch Get Field Mapping response either hence "object" used - verifySchema(response, schema("projects", null, "object")); + verifySchema(response, schema("projects", null, "nested")); // Expect nested field itself is returned in a single cell verifyDataRows(response, diff --git a/integ-test/src/test/java/org/opensearch/sql/legacy/PrettyFormatResponseIT.java b/integ-test/src/test/java/org/opensearch/sql/legacy/PrettyFormatResponseIT.java index 226645ce857..200c300f3b5 100644 --- a/integ-test/src/test/java/org/opensearch/sql/legacy/PrettyFormatResponseIT.java +++ b/integ-test/src/test/java/org/opensearch/sql/legacy/PrettyFormatResponseIT.java @@ -126,10 +126,11 @@ public void selectWrongField() throws IOException { } @Test + @Ignore("_score tested in V2 engine - @see org.opensearch.sql.sql.ScoreQueryIT") public void selectScore() throws IOException { JSONObject response = executeQuery( - String.format(Locale.ROOT, "SELECT _score FROM %s WHERE 
balance > 30000", - TestsConstants.TEST_INDEX_ACCOUNT)); + String.format(Locale.ROOT, "SELECT _score FROM %s WHERE SCORE(match_phrase(phrase, 'brown fox'))", + TestsConstants.TEST_INDEX_PHRASE)); List<String> fields = Collections.singletonList("_score"); assertContainsColumns(getSchema(response), fields); @@ -195,7 +196,7 @@ public void selectNestedFields() throws IOException { String.format(Locale.ROOT, "SELECT nested(message.info), someField FROM %s", TestsConstants.TEST_INDEX_NESTED_TYPE)); - List<String> fields = Arrays.asList("message.info", "someField"); + List<String> fields = Arrays.asList("nested(message.info)", "someField"); assertContainsColumns(getSchema(response), fields); assertContainsData(getDataRows(response), fields); diff --git a/integ-test/src/test/java/org/opensearch/sql/legacy/SQLIntegTestCase.java b/integ-test/src/test/java/org/opensearch/sql/legacy/SQLIntegTestCase.java index dbd37835a7a..c2b5d64f523 100644 --- a/integ-test/src/test/java/org/opensearch/sql/legacy/SQLIntegTestCase.java +++ b/integ-test/src/test/java/org/opensearch/sql/legacy/SQLIntegTestCase.java @@ -516,6 +516,10 @@ public enum Index { "nestedType", getNestedTypeIndexMapping(), "src/test/resources/nested_objects.json"), + NESTED_WITHOUT_ARRAYS(TestsConstants.TEST_INDEX_NESTED_TYPE_WITHOUT_ARRAYS, + "nestedTypeWithoutArrays", + getNestedTypeIndexMapping(), + "src/test/resources/nested_objects_without_arrays.json"), NESTED_WITH_QUOTES(TestsConstants.TEST_INDEX_NESTED_WITH_QUOTES, "nestedType", getNestedTypeIndexMapping(), @@ -604,11 +608,18 @@ public enum Index { "wildcard", getMappingFile("wildcard_index_mappings.json"), "src/test/resources/wildcard.json"), - DATASOURCES(TestsConstants.DATASOURCES, "datasource", getMappingFile("datasources_index_mappings.json"), - "src/test/resources/datasources.json"); + "src/test/resources/datasources.json"), + MULTI_NESTED(TestsConstants.TEST_INDEX_MULTI_NESTED_TYPE, + "multi_nested", + getMappingFile("multi_nested.json"), +
"src/test/resources/multi_nested_objects.json"), + NESTED_WITH_NULLS(TestsConstants.TEST_INDEX_NESTED_WITH_NULLS, + "multi_nested", + getNestedTypeIndexMapping(), + "src/test/resources/nested_with_nulls.json"); private final String name; private final String type; diff --git a/integ-test/src/test/java/org/opensearch/sql/legacy/TestsConstants.java b/integ-test/src/test/java/org/opensearch/sql/legacy/TestsConstants.java index e46993cd172..c3af98b7945 100644 --- a/integ-test/src/test/java/org/opensearch/sql/legacy/TestsConstants.java +++ b/integ-test/src/test/java/org/opensearch/sql/legacy/TestsConstants.java @@ -31,6 +31,8 @@ public class TestsConstants { public final static String TEST_INDEX_LOCATION = TEST_INDEX + "_location"; public final static String TEST_INDEX_LOCATION2 = TEST_INDEX + "_location2"; public final static String TEST_INDEX_NESTED_TYPE = TEST_INDEX + "_nested_type"; + public final static String TEST_INDEX_NESTED_TYPE_WITHOUT_ARRAYS = + TEST_INDEX + "_nested_type_without_arrays"; public final static String TEST_INDEX_NESTED_SIMPLE = TEST_INDEX + "_nested_simple"; public final static String TEST_INDEX_NESTED_WITH_QUOTES = TEST_INDEX + "_nested_type_with_quotes"; @@ -55,6 +57,8 @@ public class TestsConstants { public final static String TEST_INDEX_NULL_MISSING = TEST_INDEX + "_null_missing"; public final static String TEST_INDEX_CALCS = TEST_INDEX + "_calcs"; public final static String TEST_INDEX_WILDCARD = TEST_INDEX + "_wildcard"; + public final static String TEST_INDEX_MULTI_NESTED_TYPE = TEST_INDEX + "_multi_nested"; + public final static String TEST_INDEX_NESTED_WITH_NULLS = TEST_INDEX + "_nested_with_nulls"; public final static String DATASOURCES = ".ql-datasources"; public final static String DATE_FORMAT = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"; diff --git a/integ-test/src/test/java/org/opensearch/sql/ppl/StandaloneIT.java b/integ-test/src/test/java/org/opensearch/sql/ppl/StandaloneIT.java index 74613ee5b16..595fd8acd5f 100644 --- 
a/integ-test/src/test/java/org/opensearch/sql/ppl/StandaloneIT.java +++ b/integ-test/src/test/java/org/opensearch/sql/ppl/StandaloneIT.java @@ -29,6 +29,15 @@ import org.opensearch.common.inject.Singleton; import org.opensearch.sql.analysis.Analyzer; import org.opensearch.sql.analysis.ExpressionAnalyzer; +import org.opensearch.sql.common.response.ResponseListener; +import org.opensearch.sql.common.setting.Settings; +import org.opensearch.sql.datasources.service.DataSourceMetadataStorage; +import org.opensearch.sql.datasource.DataSourceService; +import org.opensearch.sql.datasources.service.DataSourceServiceImpl; +import org.opensearch.sql.datasources.auth.DataSourceUserAuthorizationHelper; +import org.opensearch.sql.datasource.model.DataSourceMetadata; +import org.opensearch.sql.executor.ExecutionEngine; +import org.opensearch.sql.executor.ExecutionEngine.QueryResponse; import org.opensearch.sql.executor.QueryManager; import org.opensearch.sql.executor.QueryService; import org.opensearch.sql.executor.execution.QueryPlanFactory; @@ -47,15 +56,6 @@ import org.opensearch.sql.sql.antlr.SQLSyntaxParser; import org.opensearch.sql.storage.StorageEngine; import org.opensearch.sql.util.ExecuteOnCallerThreadQueryManager; -import org.opensearch.sql.common.response.ResponseListener; -import org.opensearch.sql.common.setting.Settings; -import org.opensearch.sql.datasource.DataSourceMetadataStorage; -import org.opensearch.sql.datasource.DataSourceService; -import org.opensearch.sql.datasource.DataSourceServiceImpl; -import org.opensearch.sql.datasource.DataSourceUserAuthorizationHelper; -import org.opensearch.sql.datasource.model.DataSourceMetadata; -import org.opensearch.sql.executor.ExecutionEngine; -import org.opensearch.sql.executor.ExecutionEngine.QueryResponse; import org.opensearch.sql.opensearch.client.OpenSearchClient; import org.opensearch.sql.opensearch.client.OpenSearchRestClient; import org.opensearch.sql.opensearch.security.SecurityAccess; diff --git 
a/integ-test/src/test/java/org/opensearch/sql/sql/IdentifierIT.java b/integ-test/src/test/java/org/opensearch/sql/sql/IdentifierIT.java index 591364ea190..d5c194968d9 100644 --- a/integ-test/src/test/java/org/opensearch/sql/sql/IdentifierIT.java +++ b/integ-test/src/test/java/org/opensearch/sql/sql/IdentifierIT.java @@ -64,6 +64,65 @@ public void testMultipleQueriesWithSpecialIndexNames() throws IOException { queryAndAssertTheDoc("SELECT * FROM test.two"); } + @Test + public void testDoubleUnderscoreIdentifierTest() throws IOException { + new Index("test.twounderscores") + .addDoc("{\"__age\": 30}"); + final JSONObject result = new JSONObject(executeQuery("SELECT __age FROM test.twounderscores", "jdbc")); + + verifySchema(result, + schema("__age", null, "long")); + verifyDataRows(result, rows(30)); + } + + @Test + public void testMetafieldIdentifierTest() throws IOException { + // create an index, but the contents don't matter + String id = "12345"; + String index = "test.metafields"; + new Index(index).addDoc("{\"age\": 30}", id); + + // Execute using field metadata values + final JSONObject result = new JSONObject(executeQuery( + "SELECT *, _id, _index, _score, _maxscore, _sort " + + "FROM " + index, + "jdbc")); + + // Verify that the metadata values are returned when requested + verifySchema(result, + schema("age", null, "long"), + schema("_id", null, "keyword"), + schema("_index", null, "keyword"), + schema("_score", null, "float"), + schema("_maxscore", null, "float"), + schema("_sort", null, "long")); + verifyDataRows(result, rows(30, id, index, 1.0, 1.0, -2)); + } + + @Test + public void testMetafieldIdentifierWithAliasTest() throws IOException { + // create an index, but the contents don't matter + String id = "99999"; + String index = "test.aliasmetafields"; + new Index(index).addDoc("{\"age\": 30}", id); + + // Execute using field metadata values + final JSONObject result = new JSONObject(executeQuery( + "SELECT _id AS A, _index AS B, _score AS C,
_maxscore AS D, _sort AS E " + + "FROM " + index + " " + + "WHERE _id = \\\"" + id + "\\\"", + "jdbc")); + + // Verify that the metadata values are returned when requested + verifySchema(result, + schema("_id", "A", "keyword"), + schema("_index", "B", "keyword"), + schema("_score", "C", "float"), + schema("_maxscore", "D", "float"), + schema("_sort", "E", "long")); + verifyDataRows(result, rows(id, index, null, null, -2)); + } + private void createIndexWithOneDoc(String... indexNames) throws IOException { for (String indexName : indexNames) { new Index(indexName).addDoc("{\"age\": 30}"); @@ -98,6 +157,12 @@ void addDoc(String doc) { indexDoc.setJsonEntity(doc); performRequest(client(), indexDoc); } + + void addDoc(String doc, String id) { + Request indexDoc = new Request("POST", String.format("/%s/_doc/%s?refresh=true", indexName, id)); + indexDoc.setJsonEntity(doc); + performRequest(client(), indexDoc); + } } } diff --git a/integ-test/src/test/java/org/opensearch/sql/sql/MatchIT.java b/integ-test/src/test/java/org/opensearch/sql/sql/MatchIT.java index 28573fdd10f..9885ddfa330 100644 --- a/integ-test/src/test/java/org/opensearch/sql/sql/MatchIT.java +++ b/integ-test/src/test/java/org/opensearch/sql/sql/MatchIT.java @@ -5,6 +5,7 @@ package org.opensearch.sql.sql; +import static org.hamcrest.Matchers.containsString; import static org.opensearch.sql.legacy.TestsConstants.TEST_INDEX_ACCOUNT; import static org.opensearch.sql.util.MatcherUtils.rows; import static org.opensearch.sql.util.MatcherUtils.schema; @@ -12,9 +13,12 @@ import static org.opensearch.sql.util.MatcherUtils.verifySchema; import java.io.IOException; +import java.util.Locale; import org.json.JSONObject; +import org.junit.Assert; import org.junit.Test; import org.opensearch.sql.legacy.SQLIntegTestCase; +import org.opensearch.sql.legacy.TestsConstants; import org.opensearch.sql.legacy.utils.StringUtils; public class MatchIT extends SQLIntegTestCase { @@ -147,4 +151,14 @@ public void 
match_alternate_syntaxes_return_the_same_results() throws IOExceptio assertEquals(result1.getInt("total"), result2.getInt("total")); assertEquals(result1.getInt("total"), result3.getInt("total")); } + + @Test + public void matchPhraseQueryTest() throws IOException { + final String result = explainQuery(String.format(Locale.ROOT, + "select address from %s " + + "where address= matchPhrase('671 Bristol Street') order by _score desc limit 3", + TestsConstants.TEST_INDEX_ACCOUNT)); + Assert.assertThat(result, + containsString("{\\\"match_phrase\\\":{\\\"address\\\":{\\\"query\\\":\\\"671 Bristol Street\\\"")); + } } diff --git a/integ-test/src/test/java/org/opensearch/sql/sql/NestedIT.java b/integ-test/src/test/java/org/opensearch/sql/sql/NestedIT.java new file mode 100644 index 00000000000..a7b5d46234f --- /dev/null +++ b/integ-test/src/test/java/org/opensearch/sql/sql/NestedIT.java @@ -0,0 +1,260 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.sql; + +import static org.opensearch.sql.legacy.TestsConstants.TEST_INDEX_MULTI_NESTED_TYPE; +import static org.opensearch.sql.legacy.TestsConstants.TEST_INDEX_NESTED_TYPE; +import static org.opensearch.sql.legacy.TestsConstants.TEST_INDEX_NESTED_TYPE_WITHOUT_ARRAYS; +import static org.opensearch.sql.legacy.TestsConstants.TEST_INDEX_NESTED_WITH_NULLS; +import static org.opensearch.sql.util.MatcherUtils.rows; +import static org.opensearch.sql.util.MatcherUtils.schema; +import static org.opensearch.sql.util.MatcherUtils.verifyDataRows; +import static org.opensearch.sql.util.MatcherUtils.verifySchema; + +import java.io.IOException; +import java.util.List; +import java.util.Map; + +import org.json.JSONArray; +import org.json.JSONObject; +import org.junit.Test; +import org.junit.jupiter.api.Disabled; +import org.opensearch.sql.legacy.SQLIntegTestCase; + +public class NestedIT extends SQLIntegTestCase { + @Override + public void init() throws IOException { + 
loadIndex(Index.MULTI_NESTED); + loadIndex(Index.NESTED); + loadIndex(Index.NESTED_WITHOUT_ARRAYS); + loadIndex(Index.EMPLOYEE_NESTED); + loadIndex(Index.NESTED_WITH_NULLS); + } + + @Test + public void nested_function_with_array_of_nested_field_test() { + String query = "SELECT nested(message.info), nested(comment.data) FROM " + TEST_INDEX_NESTED_TYPE; + JSONObject result = executeJdbcRequest(query); + + assertEquals(6, result.getInt("total")); + verifyDataRows(result, + rows("c", "ab"), + rows("a", "ab"), + rows("b", "aa"), + rows("c", "aa"), + rows("a", "ab"), + rows("zz", new JSONArray(List.of("aa", "bb")))); + } + + @Test + public void nested_function_in_select_test() { + String query = "SELECT nested(message.info), nested(comment.data), " + + "nested(message.dayOfWeek) FROM " + + TEST_INDEX_NESTED_TYPE_WITHOUT_ARRAYS; + JSONObject result = executeJdbcRequest(query); + + assertEquals(5, result.getInt("total")); + verifySchema(result, + schema("nested(message.info)", null, "keyword"), + schema("nested(comment.data)", null, "keyword"), + schema("nested(message.dayOfWeek)", null, "long")); + verifyDataRows(result, + rows("a", "ab", 1), + rows("b", "aa", 2), + rows("c", "aa", 1), + rows("c", "ab", 4), + rows("zz", "bb", 6)); + } + + // Has to be tested with JSON format when https://github.com/opensearch-project/sql/issues/1317 + // gets resolved + @Disabled // TODO fix me when aggregation is supported + public void nested_function_in_an_aggregate_function_in_select_test() { + String query = "SELECT sum(nested(message.dayOfWeek)) FROM " + + TEST_INDEX_NESTED_TYPE_WITHOUT_ARRAYS; + JSONObject result = executeJdbcRequest(query); + verifyDataRows(result, rows(14)); + } + + // TODO Enable me when nested aggregation is supported + @Disabled + public void nested_function_with_arrays_in_an_aggregate_function_in_select_test() { + String query = "SELECT sum(nested(message.dayOfWeek)) FROM " + + TEST_INDEX_NESTED_TYPE; + JSONObject result = executeJdbcRequest(query); + 
verifyDataRows(result, rows(19)); + } + + // TODO not currently supported by legacy, should we add implementation in AstBuilder? + @Disabled + public void nested_function_in_a_function_in_select_test() { + String query = "SELECT upper(nested(message.info)) FROM " + + TEST_INDEX_NESTED_TYPE_WITHOUT_ARRAYS; + JSONObject result = executeJdbcRequest(query); + + verifyDataRows(result, + rows("A"), + rows("B"), + rows("C"), + rows("C"), + rows("ZZ")); + } + + @Test + public void nested_function_with_array_of_multi_nested_field_test() { + String query = "SELECT nested(message.author.name) FROM " + TEST_INDEX_MULTI_NESTED_TYPE; + JSONObject result = executeJdbcRequest(query); + + assertEquals(6, result.getInt("total")); + verifyDataRows(result, + rows("e"), + rows("f"), + rows("g"), + rows("h"), + rows("p"), + rows("yy")); + } + + @Test + public void nested_function_with_null_and_missing_fields_test() { + String query = "SELECT nested(message.info), nested(comment.data) FROM " + + TEST_INDEX_NESTED_WITH_NULLS; + JSONObject result = executeJdbcRequest(query); + + assertEquals(10, result.getInt("total")); + verifyDataRows(result, + rows(null, "hh"), + rows("b", "aa"), + rows("c", "aa"), + rows("c", "ab"), + rows("a", "ab"), + rows("zz", new JSONArray(List.of("aa", "bb"))), + rows("zz", new JSONArray(List.of("aa", "bb"))), + rows(null, "ee"), + rows("a", "ab"), + rows("rr", new JSONArray(List.of("asdf", "sdfg")))); + } + + @Test + public void nested_function_multiple_fields_with_matched_and_mismatched_paths_test() { + String query = + "SELECT nested(message.author), nested(message.dayOfWeek), nested(message.info), nested(comment.data), " + + "nested(comment.likes) FROM " + TEST_INDEX_NESTED_TYPE; + JSONObject result = executeJdbcRequest(query); + + assertEquals(6, result.getInt("total")); + verifyDataRows(result, + rows("e", 1, "a", "ab", 3), + rows("f", 2, "b", "aa", 2), + rows("g", 1, "c", "aa", 3), + rows("h", 4, "c", "ab", 1), + rows("i", 5, "a", "ab", 1), + rows("zz", 6, 
"zz", new JSONArray(List.of("aa", "bb")), 10)); + } + + @Test + public void nested_function_mixed_with_non_nested_type_test() { + String query = + "SELECT nested(message.info), someField FROM " + TEST_INDEX_NESTED_TYPE; + JSONObject result = executeJdbcRequest(query); + + assertEquals(6, result.getInt("total")); + verifyDataRows(result, + rows("a", "b"), + rows("b", "a"), + rows("c", "a"), + rows("c", "b"), + rows("a", "b"), + rows("zz", "a")); + } + + @Test + public void nested_function_mixed_with_non_nested_types_test() { + String query = + "SELECT nested(message.info), office, office.west FROM " + TEST_INDEX_MULTI_NESTED_TYPE; + JSONObject result = executeJdbcRequest(query); + + assertEquals(6, result.getInt("total")); + verifyDataRows(result, + rows("a", + new JSONObject(Map.of("south", 3, "west", "ab")), "ab"), + rows("b", + new JSONObject(Map.of("south", 5, "west", "ff")), "ff"), + rows("c", + new JSONObject(Map.of("south", 3, "west", "ll")), "ll"), + rows("d", null, null), + rows("i", null, null), + rows("zz", null, null)); + } + + @Test + public void nested_function_with_relevance_query() { + String query = + "SELECT nested(message.info), highlight(someField) FROM " + + TEST_INDEX_NESTED_TYPE + " WHERE match(someField, 'b')"; + JSONObject result = executeJdbcRequest(query); + + assertEquals(3, result.getInt("total")); + verifyDataRows(result, + rows("a", new JSONArray(List.of("b"))), + rows("c", new JSONArray(List.of("b"))), + rows("a", new JSONArray(List.of("b")))); + } + + @Test + public void nested_with_non_nested_type_test() { + String query = "SELECT nested(someField) FROM " + TEST_INDEX_NESTED_TYPE; + + Exception exception = assertThrows(RuntimeException.class, + () -> executeJdbcRequest(query)); + assertTrue(exception.getMessage().contains( + "{\n" + + " \"error\": {\n" + + " \"reason\": \"Invalid SQL query\",\n" + + " \"details\": \"Illegal nested field name: someField\",\n" + + " \"type\": \"IllegalArgumentException\"\n" + + " },\n" + + " 
\"status\": 400\n" + + "}" + )); + } + + @Test + public void nested_missing_path() { + String query = "SELECT nested(message.invalid) FROM " + TEST_INDEX_MULTI_NESTED_TYPE; + + Exception exception = assertThrows(RuntimeException.class, + () -> executeJdbcRequest(query)); + assertTrue(exception.getMessage().contains("" + + "{\n" + + " \"error\": {\n" + + " \"reason\": \"Invalid SQL query\",\n" + + " \"details\": \"can't resolve Symbol(namespace=FIELD_NAME, name=message.invalid) in type env\",\n" + + " \"type\": \"SemanticCheckException\"\n" + + " },\n" + + " \"status\": 400\n" + + "}" + )); + } + + @Test + public void nested_missing_path_argument() { + String query = "SELECT nested(message.author.name, invalid) FROM " + TEST_INDEX_MULTI_NESTED_TYPE; + + Exception exception = assertThrows(RuntimeException.class, + () -> executeJdbcRequest(query)); + assertTrue(exception.getMessage().contains("" + + "{\n" + + " \"error\": {\n" + + " \"reason\": \"Invalid SQL query\",\n" + + " \"details\": \"can't resolve Symbol(namespace=FIELD_NAME, name=invalid) in type env\",\n" + + " \"type\": \"SemanticCheckException\"\n" + + " },\n" + + " \"status\": 400\n" + + "}" + )); + } +} diff --git a/integ-test/src/test/java/org/opensearch/sql/sql/ScoreQueryIT.java b/integ-test/src/test/java/org/opensearch/sql/sql/ScoreQueryIT.java new file mode 100644 index 00000000000..03df7d0e29b --- /dev/null +++ b/integ-test/src/test/java/org/opensearch/sql/sql/ScoreQueryIT.java @@ -0,0 +1,142 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.sql.sql; + +import org.json.JSONObject; +import org.junit.Assert; +import org.junit.Test; +import org.opensearch.sql.legacy.SQLIntegTestCase; +import org.opensearch.sql.legacy.TestsConstants; + +import java.io.IOException; +import java.util.Locale; + +import static org.hamcrest.Matchers.containsString; +import static org.opensearch.sql.util.MatcherUtils.rows; +import static 
org.opensearch.sql.util.MatcherUtils.schema; +import static org.opensearch.sql.util.MatcherUtils.verifyDataRows; +import static org.opensearch.sql.util.MatcherUtils.verifySchema; + +public class ScoreQueryIT extends SQLIntegTestCase { + @Override + protected void init() throws Exception { + loadIndex(Index.ACCOUNT); + } + + /** + * "query" : { + * "from": 0, + * "size": 3, + * "timeout": "1m", + * "query": { + * "bool": { + * "should": [ + * { + * "match": { + * "address": { + * "query": "Lane", + * "operator": "OR", + * "prefix_length": 0, + * "max_expansions": 50, + * "fuzzy_transpositions": true, + * "lenient": false, + * "zero_terms_query": "NONE", + * "auto_generate_synonyms_phrase_query": true, + * "boost": 100.0 + * } + * } + * }, + * { + * "match": { + * "address": { + * "query": "Street", + * "operator": "OR", + * "prefix_length": 0, + * "max_expansions": 50, + * "fuzzy_transpositions": true, + * "lenient": false, + * "zero_terms_query": "NONE", + * "auto_generate_synonyms_phrase_query": true, + * "boost": 0.5 + * } + * } + * } + * ], + * "adjust_pure_negative": true, + * "boost": 1.0 + * } + * }, + * "_source": { + * "includes": [ + * "address" + * ], + * "excludes": [] + * }, + * "sort": [ + * { + * "_score": { + * "order": "desc" + * } + * } + * ], + * "track_scores": true + * } + * @throws IOException + */ + @Test + public void scoreQueryExplainTest() throws IOException { + final String result = explainQuery(String.format(Locale.ROOT, + "select address from %s " + + "where score(matchQuery(address, 'Douglass'), 100) " + + "or score(matchQuery(address, 'Hall'), 0.5) order by _score desc limit 2", + TestsConstants.TEST_INDEX_ACCOUNT)); + Assert.assertThat(result, containsString("\\\"match\\\":{\\\"address\\\":{\\\"query\\\":\\\"Douglass\\\"")); + Assert.assertThat(result, containsString("\\\"boost\\\":100.0")); + Assert.assertThat(result, containsString("\\\"match\\\":{\\\"address\\\":{\\\"query\\\":\\\"Hall\\\"")); + Assert.assertThat(result, 
containsString("\\\"boost\\\":0.5")); + Assert.assertThat(result, containsString("\\\"sort\\\":[{\\\"_score\\\"")); + Assert.assertThat(result, containsString("\\\"track_scores\\\":true")); + } + + @Test + public void scoreQueryTest() throws IOException { + final JSONObject result = new JSONObject(executeQuery(String.format(Locale.ROOT, + "select address, _score from %s " + + "where score(matchQuery(address, 'Douglass'), 100) " + + "or score(matchQuery(address, 'Hall'), 0.5) order by _score desc limit 2", + TestsConstants.TEST_INDEX_ACCOUNT), "jdbc")); + verifySchema(result, + schema("address", null, "text"), + schema("_score", null, "float")); + verifyDataRows(result, + rows("154 Douglass Street", 650.1515), + rows("565 Hall Street", 3.2507575)); + } + + @Test + public void scoreQueryDefaultBoostExplainTest() throws IOException { + final String result = explainQuery(String.format(Locale.ROOT, + "select address from %s " + + "where score(matchQuery(address, 'Lane')) order by _score desc limit 2", + TestsConstants.TEST_INDEX_ACCOUNT)); + Assert.assertThat(result, containsString("\\\"match\\\":{\\\"address\\\":{\\\"query\\\":\\\"Lane\\\"")); + Assert.assertThat(result, containsString("\\\"boost\\\":1.0")); + Assert.assertThat(result, containsString("\\\"sort\\\":[{\\\"_score\\\"")); + Assert.assertThat(result, containsString("\\\"track_scores\\\":true")); + } + + @Test + public void scoreQueryDefaultBoostQueryTest() throws IOException { + final JSONObject result = new JSONObject(executeQuery(String.format(Locale.ROOT, + "select address, _score from %s " + + "where score(matchQuery(address, 'Powell')) order by _score desc limit 2", + TestsConstants.TEST_INDEX_ACCOUNT), "jdbc")); + verifySchema(result, + schema("address", null, "text"), + schema("_score", null, "float")); + verifyDataRows(result, rows("305 Powell Street", 6.501515)); + } +} diff --git a/integ-test/src/test/java/org/opensearch/sql/sql/StandalonePaginationIT.java 
b/integ-test/src/test/java/org/opensearch/sql/sql/StandalonePaginationIT.java index 9ed91c4e38e..0095bec7cac 100644 --- a/integ-test/src/test/java/org/opensearch/sql/sql/StandalonePaginationIT.java +++ b/integ-test/src/test/java/org/opensearch/sql/sql/StandalonePaginationIT.java @@ -30,7 +30,7 @@ import org.opensearch.sql.common.setting.Settings; import org.opensearch.sql.data.type.ExprCoreType; import org.opensearch.sql.datasource.DataSourceService; -import org.opensearch.sql.datasource.DataSourceServiceImpl; +import org.opensearch.sql.datasources.service.DataSourceServiceImpl; import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.executor.pagination.PlanSerializer; import org.opensearch.sql.executor.QueryService; diff --git a/integ-test/src/test/resources/indexDefinitions/multi_nested.json b/integ-test/src/test/resources/indexDefinitions/multi_nested.json new file mode 100644 index 00000000000..d2da21d24c5 --- /dev/null +++ b/integ-test/src/test/resources/indexDefinitions/multi_nested.json @@ -0,0 +1,42 @@ +{ + "mappings": { + "properties": { + "message": { + "type": "nested", + "properties": { + "info": { + "type": "keyword" + }, + "author": { + "type": "nested", + "properties": { + "name": { + "type": "keyword", + "fields": { + "keyword": { + "type": "keyword", + "ignore_above": 256 + } + } + }, + "address": { + "type": "nested", + "properties": { + "street": { + "type": "keyword" + }, + "number": { + "type": "integer" + } + } + } + } + }, + "dayOfWeek": { + "type": "long" + } + } + } + } + } +} diff --git a/integ-test/src/test/resources/multi_nested_objects.json b/integ-test/src/test/resources/multi_nested_objects.json new file mode 100644 index 00000000000..c8cf91162a3 --- /dev/null +++ b/integ-test/src/test/resources/multi_nested_objects.json @@ -0,0 +1,10 @@ +{"index":{"_id":"1"}} +{"message":{"info":"a","author":{"name": "e", "address": {"street": "bc", "number": 1}},"dayOfWeek":1},"office":{"west":"ab","south":3}} 
+{"index":{"_id":"2"}} +{"message":{"info":"b","author":{"name": "f", "address": {"street": "ab", "number": 2}},"dayOfWeek":2},"office":{"west":"ff","south":5}} +{"index":{"_id":"3"}} +{"message":{"info":"c","author":{"name": "g", "address": {"street": "sk", "number": 3}},"dayOfWeek":1},"office":{"west":"ll","south":3}} +{"index":{"_id":"4"}} +{"message":[{"info":"d","author":{"name": "h", "address": {"street": "mb", "number": 4}},"dayOfWeek":4},{"info":"i","author":{"name": "p", "address": {"street": "on", "number": 5}},"dayOfWeek":5}]} +{"index":{"_id":"5"}} +{"message": [{"info":"zz","author":{"name": "yy", "address": {"street": "qc", "number": 6}},"dayOfWeek":6}]} diff --git a/integ-test/src/test/resources/nested_objects_without_arrays.json b/integ-test/src/test/resources/nested_objects_without_arrays.json new file mode 100644 index 00000000000..626e63e0798 --- /dev/null +++ b/integ-test/src/test/resources/nested_objects_without_arrays.json @@ -0,0 +1,10 @@ +{"index":{"_id":"1"}} +{"message":{"info":"a","author":"e","dayOfWeek":1},"comment":{"data":"ab","likes":3},"myNum":1,"someField":"b"} +{"index":{"_id":"2"}} +{"message":{"info":"b","author":"f","dayOfWeek":2},"comment":{"data":"aa","likes":2},"myNum":2,"someField":"a"} +{"index":{"_id":"3"}} +{"message":{"info":"c","author":"g","dayOfWeek":1},"comment":{"data":"aa","likes":3},"myNum":3,"someField":"a"} +{"index":{"_id":"4"}} +{"message":{"info":"c","author":"h","dayOfWeek":4},"comment":{"data":"ab","likes":1},"myNum":4,"someField":"b"} +{"index":{"_id":"5"}} +{"message": {"info":"zz","author":"zz","dayOfWeek":6},"comment":{"data":"bb","likes":10},"myNum":3,"someField":"a"} diff --git a/integ-test/src/test/resources/nested_with_nulls.json b/integ-test/src/test/resources/nested_with_nulls.json new file mode 100644 index 00000000000..b02a8ab1100 --- /dev/null +++ b/integ-test/src/test/resources/nested_with_nulls.json @@ -0,0 +1,24 @@ +{"index":{"_id":"1"}} 
+{"message":{"author":"e","dayOfWeek":5},"comment":{"data":"hh","likes":5},"myNum":7,"someField":"a"} +{"index":{"_id":"2"}} +{"message":{"info":"b","author":"f","dayOfWeek":2},"comment":{"data":"aa","likes":2},"myNum":2,"someField":"a"} +{"index":{"_id":"3"}} +{"message":{"info":"c","author":"g","dayOfWeek":1},"comment":{"data":"aa","likes":3},"myNum":3,"someField":"a"} +{"index":{"_id":"4"}} +{"message":[{"info":"c","author":"h","dayOfWeek":4},{"info":"a","author":"i","dayOfWeek":5}],"comment":{"data":"ab","likes":1},"myNum":4,"someField":"b"} +{"index":{"_id":"5"}} +{"message": [{"info":"zz","author":"zz","dayOfWeek":6}],"comment":{"data":["aa","bb"],"likes":10},"myNum":[3,4],"someField":"a"} +{"index":{"_id":"7"}} +{"message":[{"info":"zz", "author":"z\"z", "dayOfWeek":6}], "comment":{"data":["aa","bb"], "likes":10}, "myNum":[3,4], "someField":"a"} +{"index":{"_id":"8"}} +{"message":{"info":null,"author":"e","dayOfWeek":7},"comment":{"data":"ee","likes":6},"myNum":6,"someField":"a"} +{"index":{"_id":"9"}} +{"message":{"info":"a","author":"e","dayOfWeek":1},"comment":{"data":"ab","likes":3},"myNum":1,"someField":"b"} +{"index":{"_id":"10"}} +{"message":[{"info":"rr", "author":"this \"value\" contains quotes", "dayOfWeek":3}], "comment":{"data":["asdf","sdfg"], "likes":56}, "myNum":[1,2,4], "someField":"ert"} +{"index":{"_id":"11"}} +{"comment":{"data":"jj","likes":1},"myNum":8,"someField":"a"} +{"index":{"_id":"12"}} +{"message":null,"comment":{"data":"kk","likes":0},"myNum":9,"someField":"a"} +{"index":{"_id":"13"}} +{} diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/data/value/OpenSearchExprValueFactory.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/data/value/OpenSearchExprValueFactory.java index b06e2b9e089..0c4548a368d 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/data/value/OpenSearchExprValueFactory.java +++ 
b/opensearch/src/main/java/org/opensearch/sql/opensearch/data/value/OpenSearchExprValueFactory.java @@ -15,9 +15,10 @@ import static org.opensearch.sql.utils.DateTimeFormatters.DATE_TIME_FORMATTER; import com.fasterxml.jackson.core.JsonProcessingException; +import com.fasterxml.jackson.databind.JsonNode; import com.fasterxml.jackson.databind.ObjectMapper; import com.google.common.collect.ImmutableMap; -import java.io.Serializable; +import com.google.common.collect.Iterators; import java.time.Instant; import java.time.format.DateTimeParseException; import java.util.ArrayList; @@ -55,7 +56,7 @@ /** * Construct ExprValue from OpenSearch response. */ -public class OpenSearchExprValueFactory implements Serializable { +public class OpenSearchExprValueFactory { /** * The Mapping of Field and ExprType. */ @@ -235,15 +236,20 @@ private ExprValue parseStruct(Content content, String prefix) { */ private ExprValue parseArray(Content content, String prefix) { List result = new ArrayList<>(); - content.array().forEachRemaining(v -> { - // ExprCoreType.ARRAY does not indicate inner elements type. OpenSearch nested will be an - // array of structs, otherwise parseArray currently only supports array of strings. - if (v.isString()) { - result.add(parse(v, prefix, Optional.of(OpenSearchDataType.of(STRING)))); - } else { - result.add(parse(v, prefix, Optional.of(STRUCT))); - } - }); + // ExprCoreType.ARRAY does not indicate inner elements type. + if (Iterators.size(content.array()) == 1 && content.objectValue() instanceof JsonNode) { + result.add(parse(content, prefix, Optional.of(STRUCT))); + } else { + content.array().forEachRemaining(v -> { + // ExprCoreType.ARRAY does not indicate inner elements type. OpenSearch nested will be an + // array of structs, otherwise parseArray currently only supports array of strings. 
+ if (v.isString()) { + result.add(parse(v, prefix, Optional.of(OpenSearchDataType.of(STRING)))); + } else { + result.add(parse(v, prefix, Optional.of(STRUCT))); + } + }); + } return new ExprCollectionValue(result); } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtector.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtector.java index 4d6925f1aaf..c46b0231a21 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtector.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtector.java @@ -16,6 +16,7 @@ import org.opensearch.sql.planner.physical.EvalOperator; import org.opensearch.sql.planner.physical.FilterOperator; import org.opensearch.sql.planner.physical.LimitOperator; +import org.opensearch.sql.planner.physical.NestedOperator; import org.opensearch.sql.planner.physical.PaginateOperator; import org.opensearch.sql.planner.physical.PhysicalPlan; import org.opensearch.sql.planner.physical.ProjectOperator; @@ -94,6 +95,15 @@ public PhysicalPlan visitEval(EvalOperator node, Object context) { return new EvalOperator(visitInput(node.getInput(), context), node.getExpressionList()); } + @Override + public PhysicalPlan visitNested(NestedOperator node, Object context) { + return doProtect( + new NestedOperator( + visitInput(node.getInput(), context), node.getFields(), node.getGroupedPathsAndFields() + ) + ); + } + @Override public PhysicalPlan visitDedupe(DedupeOperator node, Object context) { return new DedupeOperator(visitInput(node.getInput(), context), node.getDedupeList(), diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequest.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequest.java index 1ad62076823..4789a50896a 100644 --- 
a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequest.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequest.java @@ -5,6 +5,7 @@ package org.opensearch.sql.opensearch.request; +import java.util.List; import java.util.function.Consumer; import java.util.function.Function; import lombok.EqualsAndHashCode; @@ -48,7 +49,7 @@ public OpenSearchResponse search(Function searchA .scroll(scrollTimeout)); // TODO if terminated_early - something went wrong, e.g. no scroll returned. - var response = new OpenSearchResponse(openSearchResponse, exprValueFactory); + var response = new OpenSearchResponse(openSearchResponse, exprValueFactory, List.of()); // on the last empty page, we should close the scroll scrollFinished = response.isEmpty(); responseScrollId = openSearchResponse.getScrollId(); diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java index 4ffbbee9b70..b1a6589acab 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilder.java @@ -85,4 +85,14 @@ public void pushDownProjects(Set projects) { public void pushTypeMapping(Map typeMapping) { throw new UnsupportedOperationException("Cursor requests don't support any push down"); } + + @Override + public void pushDownNested(List> nestedArgs) { + throw new UnsupportedOperationException("Cursor requests don't support any push down"); + } + + @Override + public void pushDownTrackedScore(boolean trackScores) { + throw new UnsupportedOperationException("Cursor requests don't support any push down"); + } } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java 
b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java index bef734ce476..25b7253ecaa 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilder.java @@ -101,4 +101,14 @@ public void pushDownProjects(Set projects) { public void pushTypeMapping(Map typeMapping) { exprValueFactory.extendTypeMapping(typeMapping); } + + @Override + public void pushDownNested(List> nestedArgs) { + throw new UnsupportedOperationException("Pagination does not support nested function"); + } + + @Override + public void pushDownTrackedScore(boolean trackScores) { + throw new UnsupportedOperationException("Pagination does not support score function"); + } } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchQueryRequest.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchQueryRequest.java index 0795ce7cdc7..63aeed02f03 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchQueryRequest.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchQueryRequest.java @@ -9,6 +9,8 @@ import static org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder.DEFAULT_QUERY_TIMEOUT; import com.google.common.annotations.VisibleForTesting; +import java.util.Arrays; +import java.util.List; import java.util.function.Consumer; import java.util.function.Function; import lombok.EqualsAndHashCode; @@ -19,6 +21,7 @@ import org.opensearch.action.search.SearchScrollRequest; import org.opensearch.search.SearchHits; import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.search.fetch.subphase.FetchSourceContext; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; import org.opensearch.sql.opensearch.response.OpenSearchResponse; @@ -90,11 +93,16 @@ public 
OpenSearchQueryRequest(IndexName indexName, SearchSourceBuilder sourceBui @Override public OpenSearchResponse search(Function searchAction, Function scrollAction) { + FetchSourceContext fetchSource = this.sourceBuilder.fetchSource(); + List includes = fetchSource != null && fetchSource.includes() != null + ? Arrays.asList(fetchSource.includes()) + : List.of(); if (searchDone) { - return new OpenSearchResponse(SearchHits.empty(), exprValueFactory); + return new OpenSearchResponse(SearchHits.empty(), exprValueFactory, includes); } else { searchDone = true; - return new OpenSearchResponse(searchAction.apply(searchRequest()), exprValueFactory); + return new OpenSearchResponse( + searchAction.apply(searchRequest()), exprValueFactory, includes); } } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilder.java index 6d5a8cf0054..f8d62ad7ce6 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilder.java @@ -6,6 +6,11 @@ package org.opensearch.sql.opensearch.request; +import static java.util.stream.Collectors.mapping; +import static java.util.stream.Collectors.toList; +import static org.opensearch.index.query.QueryBuilders.boolQuery; +import static org.opensearch.index.query.QueryBuilders.matchAllQuery; +import static org.opensearch.index.query.QueryBuilders.nestedQuery; import static org.opensearch.search.sort.FieldSortBuilder.DOC_FIELD_NAME; import static org.opensearch.search.sort.SortOrder.ASC; @@ -18,12 +23,16 @@ import lombok.Getter; import lombok.ToString; import org.apache.commons.lang3.tuple.Pair; +import org.apache.lucene.search.join.ScoreMode; import org.opensearch.common.unit.TimeValue; import org.opensearch.index.query.BoolQueryBuilder; +import 
org.opensearch.index.query.InnerHitBuilder; +import org.opensearch.index.query.NestedQueryBuilder; import org.opensearch.index.query.QueryBuilder; import org.opensearch.index.query.QueryBuilders; import org.opensearch.search.aggregations.AggregationBuilder; import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.search.fetch.subphase.FetchSourceContext; import org.opensearch.search.fetch.subphase.highlight.HighlightBuilder; import org.opensearch.search.sort.SortBuilder; import org.opensearch.search.sort.SortBuilders; @@ -35,6 +44,7 @@ import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; import org.opensearch.sql.opensearch.response.agg.OpenSearchAggregationResponseParser; +import org.opensearch.sql.planner.logical.LogicalNested; /** * OpenSearch search request builder. @@ -104,7 +114,8 @@ public OpenSearchRequestBuilder(OpenSearchRequest.IndexName indexName, this.sourceBuilder = new SearchSourceBuilder() .from(0) .size(querySize) - .timeout(DEFAULT_QUERY_TIMEOUT); + .timeout(DEFAULT_QUERY_TIMEOUT) + .trackScores(false); } /** @@ -190,6 +201,11 @@ public void pushDownLimit(Integer limit, Integer offset) { sourceBuilder.from(offset).size(limit); } + @Override + public void pushDownTrackedScore(boolean trackScores) { + sourceBuilder.trackScores(trackScores); + } + /** * Add highlight to DSL requests. * @param field name of the field to highlight @@ -246,4 +262,78 @@ private boolean isSortByDocOnly() { } return false; } + + /** + * Push down nested to sourceBuilder. + * @param nestedArgs : Nested arguments to push down. + */ + @Override + public void pushDownNested(List> nestedArgs) { + initBoolQueryFilter(); + groupFieldNamesByPath(nestedArgs).forEach( + (path, fieldNames) -> buildInnerHit( + fieldNames, createEmptyNestedQuery(path) + ) + ); + } + + /** + * Initialize bool query for push down. 
+ */ + private void initBoolQueryFilter() { + if (sourceBuilder.query() == null) { + sourceBuilder.query(QueryBuilders.boolQuery()); + } else { + sourceBuilder.query(QueryBuilders.boolQuery().must(sourceBuilder.query())); + } + + sourceBuilder.query(QueryBuilders.boolQuery().filter(sourceBuilder.query())); + } + + /** + * Map all field names in nested queries that use the same path. + * @param fields : Fields for nested queries. + * @return : Map of path and associated field names. + */ + private Map<String, List<String>> groupFieldNamesByPath( + List<Map<String, ReferenceExpression>> fields) { + // TODO filter out reverse nested when supported - .filter(not(isReverseNested())) + return fields.stream().collect( + Collectors.groupingBy( + m -> m.get("path").toString(), + mapping( + m -> m.get("field").toString(), + toList() + ) + ) + ); + } + + /** + * Build the inner hits portion of the nested query. + * @param paths : Field names to include in the inner hits fetch source. + * @param query : Current pushDown query. + */ + private void buildInnerHit(List<String> paths, NestedQueryBuilder query) { + query.innerHit(new InnerHitBuilder().setFetchSourceContext( + new FetchSourceContext(true, paths.toArray(new String[0]), null) + )); + } + + /** + * Create a nested query with match all filter to place inner hits. + */ + private NestedQueryBuilder createEmptyNestedQuery(String path) { + NestedQueryBuilder nestedQuery = nestedQuery(path, matchAllQuery(), ScoreMode.None); + ((BoolQueryBuilder) query().filter().get(0)).must(nestedQuery); + return nestedQuery; + } + + /** + * Return current query. + * @return : Current source builder query.
+ */ + private BoolQueryBuilder query() { + return (BoolQueryBuilder) sourceBuilder.query(); + } } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequest.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequest.java index 2e723c949c8..77c6a781fe9 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequest.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequest.java @@ -6,6 +6,8 @@ package org.opensearch.sql.opensearch.request; +import java.util.Arrays; +import java.util.List; import java.util.Objects; import java.util.function.Consumer; import java.util.function.Function; @@ -18,6 +20,7 @@ import org.opensearch.action.search.SearchScrollRequest; import org.opensearch.common.unit.TimeValue; import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.search.fetch.subphase.FetchSourceContext; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; import org.opensearch.sql.opensearch.response.OpenSearchResponse; @@ -79,8 +82,12 @@ public OpenSearchResponse search(Function searchA } else { openSearchResponse = searchAction.apply(searchRequest()); } + FetchSourceContext fetchSource = this.sourceBuilder.fetchSource(); + List includes = fetchSource != null && fetchSource.includes() != null + ? 
Arrays.asList(this.sourceBuilder.fetchSource().includes()) + : List.of(); - var response = new OpenSearchResponse(openSearchResponse, exprValueFactory); + var response = new OpenSearchResponse(openSearchResponse, exprValueFactory, includes); if (!(needClean = response.isEmpty())) { setScrollId(openSearchResponse.getScrollId()); } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilder.java index ce088359c21..59aa1949b6a 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/request/PushDownRequestBuilder.java @@ -41,4 +41,8 @@ void pushDownAggregation(Pair, void pushDownProjects(Set projects); void pushTypeMapping(Map typeMapping); + + void pushDownNested(List> nestedArgs); + + void pushDownTrackedScore(boolean trackScores); } \ No newline at end of file diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/response/OpenSearchResponse.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/response/OpenSearchResponse.java index 61d4459a862..af43be1a383 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/response/OpenSearchResponse.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/response/OpenSearchResponse.java @@ -6,16 +6,29 @@ package org.opensearch.sql.opensearch.response; +import static org.opensearch.sql.opensearch.storage.OpenSearchIndex.METADATAFIELD_TYPE_MAP; +import static org.opensearch.sql.opensearch.storage.OpenSearchIndex.METADATA_FIELD_ID; +import static org.opensearch.sql.opensearch.storage.OpenSearchIndex.METADATA_FIELD_INDEX; +import static org.opensearch.sql.opensearch.storage.OpenSearchIndex.METADATA_FIELD_MAXSCORE; +import static org.opensearch.sql.opensearch.storage.OpenSearchIndex.METADATA_FIELD_SCORE; +import static 
org.opensearch.sql.opensearch.storage.OpenSearchIndex.METADATA_FIELD_SORT; + import com.google.common.collect.ImmutableMap; import java.util.Arrays; +import java.util.HashSet; import java.util.Iterator; +import java.util.List; import java.util.Map; +import java.util.Set; import java.util.stream.Collectors; import lombok.EqualsAndHashCode; import lombok.ToString; import org.opensearch.action.search.SearchResponse; import org.opensearch.search.SearchHits; import org.opensearch.search.aggregations.Aggregations; +import org.opensearch.sql.data.model.ExprFloatValue; +import org.opensearch.sql.data.model.ExprLongValue; +import org.opensearch.sql.data.model.ExprStringValue; import org.opensearch.sql.data.model.ExprTupleValue; import org.opensearch.sql.data.model.ExprValue; import org.opensearch.sql.data.model.ExprValueUtils; @@ -38,6 +51,11 @@ public class OpenSearchResponse implements Iterable { */ private final Aggregations aggregations; + /** + * List of requested include fields. + */ + private final List includes; + /** * OpenSearchExprValueFactory used to build ExprValue from search result. */ @@ -48,19 +66,24 @@ public class OpenSearchResponse implements Iterable { * Constructor of OpenSearchResponse. */ public OpenSearchResponse(SearchResponse searchResponse, - OpenSearchExprValueFactory exprValueFactory) { + OpenSearchExprValueFactory exprValueFactory, + List includes) { this.hits = searchResponse.getHits(); this.aggregations = searchResponse.getAggregations(); this.exprValueFactory = exprValueFactory; + this.includes = includes; } /** * Constructor of OpenSearchResponse with SearchHits. 
*/ - public OpenSearchResponse(SearchHits hits, OpenSearchExprValueFactory exprValueFactory) { + public OpenSearchResponse(SearchHits hits, + OpenSearchExprValueFactory exprValueFactory, + List includes) { this.hits = hits; this.aggregations = null; this.exprValueFactory = exprValueFactory; + this.includes = includes; } /** @@ -96,14 +119,43 @@ public Iterator iterator() { return (ExprValue) ExprTupleValue.fromExprValueMap(builder.build()); }).iterator(); } else { + List metaDataFieldSet = includes.stream() + .filter(include -> METADATAFIELD_TYPE_MAP.containsKey(include)) + .collect(Collectors.toList()); + ExprFloatValue maxScore = Float.isNaN(hits.getMaxScore()) + ? null : new ExprFloatValue(hits.getMaxScore()); return Arrays.stream(hits.getHits()) .map(hit -> { - ExprValue docData = exprValueFactory.construct(hit.getSourceAsString()); - if (hit.getHighlightFields().isEmpty()) { - return docData; - } else { - ImmutableMap.Builder builder = new ImmutableMap.Builder<>(); + String source = hit.getSourceAsString(); + ExprValue docData = exprValueFactory.construct(source); + + ImmutableMap.Builder builder = new ImmutableMap.Builder<>(); + if (hit.getInnerHits() == null || hit.getInnerHits().isEmpty()) { builder.putAll(docData.tupleValue()); + } else { + Map rowSource = hit.getSourceAsMap(); + builder.putAll(ExprValueUtils.tupleValue(rowSource).tupleValue()); + } + + metaDataFieldSet.forEach(metaDataField -> { + if (metaDataField.equals(METADATA_FIELD_INDEX)) { + builder.put(METADATA_FIELD_INDEX, new ExprStringValue(hit.getIndex())); + } else if (metaDataField.equals(METADATA_FIELD_ID)) { + builder.put(METADATA_FIELD_ID, new ExprStringValue(hit.getId())); + } else if (metaDataField.equals(METADATA_FIELD_SCORE)) { + if (!Float.isNaN(hit.getScore())) { + builder.put(METADATA_FIELD_SCORE, new ExprFloatValue(hit.getScore())); + } + } else if (metaDataField.equals(METADATA_FIELD_MAXSCORE)) { + if (maxScore != null) { + builder.put(METADATA_FIELD_MAXSCORE, maxScore); + } + } 
else { // if (metaDataField.equals(METADATA_FIELD_SORT)) { + builder.put(METADATA_FIELD_SORT, new ExprLongValue(hit.getSeqNo())); + } + }); + + if (!hit.getHighlightFields().isEmpty()) { var hlBuilder = ImmutableMap.builder(); for (var es : hit.getHighlightFields().entrySet()) { hlBuilder.put(es.getKey(), ExprValueUtils.collectionValue( @@ -111,8 +163,8 @@ public Iterator iterator() { t -> (t.toString())).collect(Collectors.toList()))); } builder.put("_highlight", ExprTupleValue.fromExprValueMap(hlBuilder.build())); - return ExprTupleValue.fromExprValueMap(builder.build()); } + return (ExprValue) ExprTupleValue.fromExprValueMap(builder.build()); }).iterator(); } } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchIndex.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchIndex.java index 110d3d640f0..949b1e53ecb 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchIndex.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/OpenSearchIndex.java @@ -12,6 +12,7 @@ import java.util.Map; import lombok.RequiredArgsConstructor; import org.opensearch.sql.common.setting.Settings; +import org.opensearch.sql.data.type.ExprCoreType; import org.opensearch.sql.data.type.ExprType; import org.opensearch.sql.opensearch.client.OpenSearchClient; import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; @@ -39,6 +40,20 @@ /** OpenSearch table (index) implementation. 
*/ public class OpenSearchIndex implements Table { + public static final String METADATA_FIELD_ID = "_id"; + public static final String METADATA_FIELD_INDEX = "_index"; + public static final String METADATA_FIELD_SCORE = "_score"; + public static final String METADATA_FIELD_MAXSCORE = "_maxscore"; + public static final String METADATA_FIELD_SORT = "_sort"; + + public static final java.util.Map METADATAFIELD_TYPE_MAP = Map.of( + METADATA_FIELD_ID, ExprCoreType.STRING, + METADATA_FIELD_INDEX, ExprCoreType.STRING, + METADATA_FIELD_SCORE, ExprCoreType.FLOAT, + METADATA_FIELD_MAXSCORE, ExprCoreType.FLOAT, + METADATA_FIELD_SORT, ExprCoreType.LONG + ); + /** OpenSearch client connection. */ private final OpenSearchClient client; @@ -116,6 +131,11 @@ public Map getFieldTypes() { return cachedFieldTypes; } + @Override + public Map getReservedFieldTypes() { + return METADATAFIELD_TYPE_MAP; + } + /** * Get parsed mapping info. * @return A complete map between field names and their types. @@ -156,10 +176,11 @@ public LogicalPlan optimize(LogicalPlan plan) { @Override public TableScanBuilder createScanBuilder() { - var requestBuilder = new OpenSearchRequestBuilder(indexName, getMaxResultWindow(), - settings, new OpenSearchExprValueFactory(getFieldOpenSearchTypes())); - OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, requestBuilder); - + Map allFields = new HashMap<>(); + getReservedFieldTypes().forEach((k, v) -> allFields.put(k, OpenSearchDataType.of(v))); + allFields.putAll(getFieldOpenSearchTypes()); + OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, settings, indexName, + getMaxResultWindow(), new OpenSearchExprValueFactory(allFields)); return new OpenSearchIndexScanBuilder(indexScan); } diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScan.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScan.java index 27529bdffd2..2171fb564f3 100644 --- 
a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScan.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScan.java @@ -52,9 +52,30 @@ public class OpenSearchIndexScan extends TableScanOperator { /** Search response for current batch. */ private Iterator iterator; - public OpenSearchIndexScan(OpenSearchClient client, OpenSearchRequestBuilder builder) { + /** + * Constructor. + */ + public OpenSearchIndexScan(OpenSearchClient client, Settings settings, + String indexName, Integer maxResultWindow, + OpenSearchExprValueFactory exprValueFactory) { + this( + client, + settings, + new OpenSearchRequest.IndexName(indexName), + maxResultWindow, + exprValueFactory + ); + } + + /** + * Constructor. + */ + public OpenSearchIndexScan(OpenSearchClient client, Settings settings, + OpenSearchRequest.IndexName indexName, Integer maxResultWindow, + OpenSearchExprValueFactory exprValueFactory) { this.client = client; - this.requestBuilder = builder; + this.requestBuilder = new OpenSearchRequestBuilder( + indexName, maxResultWindow, settings, exprValueFactory); } @Override diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanBuilder.java index 41edbfc768a..024331d267b 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanBuilder.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanBuilder.java @@ -12,6 +12,7 @@ import org.opensearch.sql.planner.logical.LogicalFilter; import org.opensearch.sql.planner.logical.LogicalHighlight; import org.opensearch.sql.planner.logical.LogicalLimit; +import org.opensearch.sql.planner.logical.LogicalNested; import org.opensearch.sql.planner.logical.LogicalProject; import org.opensearch.sql.planner.logical.LogicalSort; import 
org.opensearch.sql.storage.TableScanOperator;
@@ -96,6 +97,11 @@ public boolean pushDownHighlight(LogicalHighlight highlight) {
     return delegate.pushDownHighlight(highlight);
   }

+  @Override
+  public boolean pushDownNested(LogicalNested nested) {
+    return delegate.pushDownNested(nested);
+  }
+
   private boolean sortByFieldsOnly(LogicalSort sort) {
     return sort.getSortList().stream()
         .map(sortItem -> sortItem.getRight() instanceof ReferenceExpression)
diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanQueryBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanQueryBuilder.java
index 7e6c169a88e..d9b4e6b4e02 100644
--- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanQueryBuilder.java
+++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanQueryBuilder.java
@@ -18,14 +18,17 @@
 import org.opensearch.sql.common.utils.StringUtils;
 import org.opensearch.sql.expression.Expression;
 import org.opensearch.sql.expression.ExpressionNodeVisitor;
+import org.opensearch.sql.expression.FunctionExpression;
 import org.opensearch.sql.expression.NamedExpression;
 import org.opensearch.sql.expression.ReferenceExpression;
+import org.opensearch.sql.expression.function.OpenSearchFunctions;
 import org.opensearch.sql.opensearch.storage.script.filter.FilterQueryBuilder;
 import org.opensearch.sql.opensearch.storage.script.sort.SortQueryBuilder;
 import org.opensearch.sql.opensearch.storage.serialization.DefaultExpressionSerializer;
 import org.opensearch.sql.planner.logical.LogicalFilter;
 import org.opensearch.sql.planner.logical.LogicalHighlight;
 import org.opensearch.sql.planner.logical.LogicalLimit;
+import org.opensearch.sql.planner.logical.LogicalNested;
 import org.opensearch.sql.planner.logical.LogicalProject;
 import org.opensearch.sql.planner.logical.LogicalSort;
 import org.opensearch.sql.storage.TableScanOperator;
@@
-60,8 +63,11 @@ public TableScanOperator build() {
   public boolean pushDownFilter(LogicalFilter filter) {
     FilterQueryBuilder queryBuilder = new FilterQueryBuilder(
         new DefaultExpressionSerializer());
-    QueryBuilder query = queryBuilder.build(filter.getCondition());
+    Expression queryCondition = filter.getCondition();
+    QueryBuilder query = queryBuilder.build(queryCondition);
     indexScan.getRequestBuilder().pushDownFilter(query);
+    indexScan.getRequestBuilder().pushDownTrackedScore(
+        trackScoresFromOpenSearchFunction(queryCondition));
     return true;
   }

@@ -98,6 +104,30 @@ public boolean pushDownHighlight(LogicalHighlight highlight) {
     return true;
   }

+  private boolean trackScoresFromOpenSearchFunction(Expression condition) {
+    if (condition instanceof OpenSearchFunctions.OpenSearchFunction
+        && ((OpenSearchFunctions.OpenSearchFunction) condition).isScoreTracked()) {
+      return true;
+    }
+    if (condition instanceof FunctionExpression) {
+      return ((FunctionExpression) condition).getArguments().stream()
+          .anyMatch(this::trackScoresFromOpenSearchFunction);
+    }
+    return false;
+  }
+
+  @Override
+  public boolean pushDownNested(LogicalNested nested) {
+    indexScan.getRequestBuilder().pushDownNested(nested.getFields());
+    indexScan.getRequestBuilder().pushDownProjects(
+        findReferenceExpressions(nested.getProjectList()));
+    // Return false intentionally to keep the original nested operator in the plan.
+    // Because we return false, the projects must be pushed down here: no matching
+    // push-down rule would fire for them otherwise.
+    // TODO: improve the LogicalPlanOptimizer pushdown API.
+    return false;
+  }
+
   /**
    * Find reference expression from expression.
    * @param expressions a list of expressions.
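The recursive walk in `trackScoresFromOpenSearchFunction` above can be exercised in isolation. The sketch below is illustrative only: `RelevanceFunction` and `CompoundExpression` are stand-ins for the plugin's `OpenSearchFunctions.OpenSearchFunction` and `FunctionExpression`, but the traversal mirrors the hunk above — score tracking is turned on for the whole request if any relevance function anywhere in the condition tree has it enabled.

```java
import java.util.List;

// Stand-in expression classes; names are illustrative, not the plugin's real types.
public class TrackScoresSketch {
  interface Expression {}

  // Plays the role of OpenSearchFunctions.OpenSearchFunction (a relevance query).
  static final class RelevanceFunction implements Expression {
    final boolean scoreTracked;
    RelevanceFunction(boolean scoreTracked) { this.scoreTracked = scoreTracked; }
  }

  // Plays the role of FunctionExpression (e.g. AND/OR over sub-conditions).
  static final class CompoundExpression implements Expression {
    final List<Expression> arguments;
    CompoundExpression(List<Expression> arguments) { this.arguments = arguments; }
  }

  // Same shape as trackScoresFromOpenSearchFunction: return true if any
  // relevance function in the condition tree has score tracking enabled.
  static boolean trackScores(Expression condition) {
    if (condition instanceof RelevanceFunction
        && ((RelevanceFunction) condition).scoreTracked) {
      return true;
    }
    if (condition instanceof CompoundExpression) {
      return ((CompoundExpression) condition).arguments.stream()
          .anyMatch(TrackScoresSketch::trackScores);
    }
    return false;
  }

  public static void main(String[] args) {
    Expression tracked = new CompoundExpression(List.of(
        new RelevanceFunction(false),
        new CompoundExpression(List.of(new RelevanceFunction(true)))));
    Expression untracked = new RelevanceFunction(false);
    System.out.println(trackScores(tracked));   // true
    System.out.println(trackScores(untracked)); // false
  }
}
```

Note that `anyMatch` short-circuits on the first tracked function, so deep condition trees are not fully traversed once a match is found.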
diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/sort/SortQueryBuilder.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/sort/SortQueryBuilder.java
index ab8f086dff9..1415fc22c60 100644
--- a/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/sort/SortQueryBuilder.java
+++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/storage/script/sort/SortQueryBuilder.java
@@ -49,6 +49,9 @@ public class SortQueryBuilder {
    */
   public SortBuilder build(Expression expression, Sort.SortOption option) {
     if (expression instanceof ReferenceExpression) {
+      if (((ReferenceExpression) expression).getAttr().equalsIgnoreCase("_score")) {
+        return SortBuilders.scoreSort().order(sortOrderMap.get(option.getSortOrder()));
+      }
       return fieldBuild((ReferenceExpression) expression, option);
     } else {
       throw new IllegalStateException("unsupported expression " + expression.getClass());
diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchNodeClientTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchNodeClientTest.java
index dc9d7a5b5ed..2ed02a61c28 100644
--- a/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchNodeClientTest.java
+++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchNodeClientTest.java
@@ -317,6 +317,7 @@ void search() {
             new TotalHits(1L, TotalHits.Relation.EQUAL_TO),
             1.0F));
     when(searchHit.getSourceAsString()).thenReturn("{\"id\", 1}");
+    when(searchHit.getInnerHits()).thenReturn(null);
     when(factory.construct(any())).thenReturn(exprTupleValue);

     // Mock second scroll request followed
@@ -333,7 +334,7 @@ void search() {
     Iterator<ExprValue> hits = response1.iterator();
     assertTrue(hits.hasNext());
-    assertEquals(exprTupleValue, hits.next());
+    assertEquals(exprTupleValue.tupleValue().get("id"), hits.next().tupleValue().get("id"));
     assertFalse(hits.hasNext());

     // Verify response for second scroll request
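The `_score` special case added to `SortQueryBuilder.build` above dispatches on the reference attribute name. A minimal self-contained sketch of that dispatch follows; the string results stand in for the real `SortBuilders.scoreSort()` / field-sort builders, which require the OpenSearch client library.

```java
import java.util.Locale;

// Illustrative stand-in for the dispatch added to SortQueryBuilder.build():
// a reference named "_score" (case-insensitive) sorts by relevance score,
// any other reference sorts by field.
public class SortDispatchSketch {

  enum Order { ASC, DESC }

  static String describeSort(String attr, Order order) {
    if (attr.equalsIgnoreCase("_score")) {
      return "scoreSort(" + order.name().toLowerCase(Locale.ROOT) + ")";
    }
    return "fieldSort(" + attr + ", " + order.name().toLowerCase(Locale.ROOT) + ")";
  }

  public static void main(String[] args) {
    System.out.println(describeSort("_SCORE", Order.DESC)); // scoreSort(desc)
    System.out.println(describeSort("age", Order.ASC));     // fieldSort(age, asc)
  }
}
```

The case-insensitive comparison matters because SQL identifiers may arrive in any case, while the OpenSearch metadata field is conventionally lowercase.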
diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchRestClientTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchRestClientTest.java
index 6abd17a6fbc..ea463405b92 100644
--- a/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchRestClientTest.java
+++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/client/OpenSearchRestClientTest.java
@@ -298,6 +298,7 @@ void search() throws IOException {
             new TotalHits(1L, TotalHits.Relation.EQUAL_TO),
             1.0F));
     when(searchHit.getSourceAsString()).thenReturn("{\"id\", 1}");
+    when(searchHit.getInnerHits()).thenReturn(null);
     when(factory.construct(any())).thenReturn(exprTupleValue);

     // Mock second scroll request followed
@@ -314,7 +315,7 @@ void search() throws IOException {
     Iterator<ExprValue> hits = response1.iterator();
     assertTrue(hits.hasNext());
-    assertEquals(exprTupleValue, hits.next());
+    assertEquals(exprTupleValue.tupleValue().get("id"), hits.next().tupleValue().get("id"));
     assertFalse(hits.hasNext());

     // Verify response for second scroll request
diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/data/value/OpenSearchExprValueFactoryTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/data/value/OpenSearchExprValueFactoryTest.java
index 1b9a8b7e65e..8f2c954f654 100644
--- a/opensearch/src/test/java/org/opensearch/sql/opensearch/data/value/OpenSearchExprValueFactoryTest.java
+++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/data/value/OpenSearchExprValueFactoryTest.java
@@ -34,6 +34,8 @@
 import static org.opensearch.sql.data.type.ExprCoreType.TIME;
 import static org.opensearch.sql.data.type.ExprCoreType.TIMESTAMP;

+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
 import com.google.common.collect.ImmutableList;
 import com.google.common.collect.ImmutableMap;
 import java.time.Instant;
@@ -94,6 +96,23 @@ public void constructNullValue() {
     assertTrue(new OpenSearchJsonContent(null).isNull());
   }

+  @Test
+  public void iterateArrayValue() throws JsonProcessingException {
+    ObjectMapper mapper = new ObjectMapper();
+    var arrayIt = new OpenSearchJsonContent(mapper.readTree("[\"zz\",\"bb\"]")).array();
+    assertEquals("zz", arrayIt.next().stringValue());
+    assertEquals("bb", arrayIt.next().stringValue());
+    assertFalse(arrayIt.hasNext());
+  }
+
+  @Test
+  public void iterateArrayValueWithOneElement() throws JsonProcessingException {
+    ObjectMapper mapper = new ObjectMapper();
+    var arrayIt = new OpenSearchJsonContent(mapper.readTree("[\"zz\"]")).array();
+    assertEquals("zz", arrayIt.next().stringValue());
+    assertFalse(arrayIt.hasNext());
+  }
+
   @Test
   public void constructNullArrayValue() {
     assertEquals(nullValue(), tupleValue("{\"intV\":[]}").get("intV"));
diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java
index 1f13470a439..ae7319a223b 100644
--- a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java
+++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java
@@ -180,9 +180,8 @@ void explain_successfully() {
     when(settings.getSettingValue(SQL_CURSOR_KEEP_ALIVE))
         .thenReturn(TimeValue.timeValueMinutes(1));
-    PhysicalPlan plan = new OpenSearchIndexScan(mock(OpenSearchClient.class),
-        new OpenSearchRequestBuilder("test", 10000, settings,
-            mock(OpenSearchExprValueFactory.class)));
+    PhysicalPlan plan = new OpenSearchIndexScan(mock(OpenSearchClient.class), settings,
+        "test", 10000, mock(OpenSearchExprValueFactory.class));

     AtomicReference<ExplainResponse> result = new AtomicReference<>();
     executor.explain(plan, new ResponseListener<ExplainResponse>() {
diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtectorTest.java
b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtectorTest.java index cf684b9409c..4a2353df9b4 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtectorTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtectorTest.java @@ -30,6 +30,7 @@ import java.util.HashMap; import java.util.List; import java.util.Map; +import java.util.Set; import org.apache.commons.lang3.tuple.ImmutablePair; import org.apache.commons.lang3.tuple.Pair; import org.junit.jupiter.api.BeforeEach; @@ -63,6 +64,7 @@ import org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder; import org.opensearch.sql.opensearch.setting.OpenSearchSettings; import org.opensearch.sql.opensearch.storage.scan.OpenSearchIndexScan; +import org.opensearch.sql.planner.physical.NestedOperator; import org.opensearch.sql.planner.physical.PaginateOperator; import org.opensearch.sql.planner.physical.PhysicalPlan; import org.opensearch.sql.planner.physical.PhysicalPlanDSL; @@ -110,10 +112,8 @@ public void testProtectIndexScan() { ImmutableMap.of(ref("name", STRING), ref("lastname", STRING)); Pair newEvalField = ImmutablePair.of(ref("name1", STRING), ref("name", STRING)); - Integer sortCount = 100; Pair sortField = ImmutablePair.of(DEFAULT_ASC, ref("name1", STRING)); - Integer size = 200; Integer limit = 10; Integer offset = 10; @@ -130,11 +130,10 @@ public void testProtectIndexScan() { PhysicalPlanDSL.agg( filter( resourceMonitor( - new OpenSearchIndexScan(client, - new OpenSearchRequestBuilder(indexName, - maxResultWindow, - settings, - exprValueFactory))), + new OpenSearchIndexScan(client, settings, + indexName, + maxResultWindow, + exprValueFactory)), filterExpr), aggregators, groupByExprs), @@ -160,11 +159,10 @@ public void testProtectIndexScan() { PhysicalPlanDSL.rename( PhysicalPlanDSL.agg( filter( - new OpenSearchIndexScan(client, - new 
OpenSearchRequestBuilder(indexName, - maxResultWindow, - settings, - exprValueFactory)), + new OpenSearchIndexScan(client, settings, + indexName, + maxResultWindow, + exprValueFactory), filterExpr), aggregators, groupByExprs), @@ -324,6 +322,21 @@ public void testVisitML() { executionProtector.visitML(mlOperator, null)); } + @Test + public void testVisitNested() { + Set args = Set.of("message.info"); + Map> groupedFieldsByPath = + Map.of("message", List.of("message.info")); + NestedOperator nestedOperator = + new NestedOperator( + values(emptyList()), + args, + groupedFieldsByPath); + + assertEquals(executionProtector.doProtect(nestedOperator), + executionProtector.visitNested(nestedOperator, values(emptyList()))); + } + @Test public void visitPaginate() { var paginate = new PaginateOperator(values(List.of()), 42); diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilderTest.java index 354d6e1d7a7..5cabe1930d0 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilderTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/ContinuePageRequestBuilderTest.java @@ -11,6 +11,7 @@ import static org.mockito.Mockito.mock; import static org.mockito.Mockito.when; +import java.util.List; import java.util.Map; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.DisplayNameGeneration; @@ -75,7 +76,11 @@ public void pushDown_not_supported() { () -> assertThrows(UnsupportedOperationException.class, () -> requestBuilder.pushDownProjects(mock())), () -> assertThrows(UnsupportedOperationException.class, - () -> requestBuilder.pushTypeMapping(mock())) + () -> requestBuilder.pushTypeMapping(mock())), + () -> assertThrows(UnsupportedOperationException.class, + () -> requestBuilder.pushDownNested(List.of())), + () -> 
assertThrows(UnsupportedOperationException.class, + () -> requestBuilder.pushDownTrackedScore(true)) ); } } diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilderTest.java index 9d4c0b8dbe4..ef850380d42 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilderTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/InitialPageRequestBuilderTest.java @@ -14,6 +14,7 @@ import static org.opensearch.sql.data.type.ExprCoreType.INTEGER; import static org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder.DEFAULT_QUERY_TIMEOUT; +import java.util.List; import java.util.Map; import java.util.Set; import org.junit.jupiter.api.BeforeEach; @@ -81,7 +82,11 @@ public void pushDown_not_supported() { () -> assertThrows(UnsupportedOperationException.class, () -> requestBuilder.pushDownLimit(1, 2)), () -> assertThrows(UnsupportedOperationException.class, - () -> requestBuilder.pushDownHighlight("", Map.of())) + () -> requestBuilder.pushDownHighlight("", Map.of())), + () -> assertThrows(UnsupportedOperationException.class, + () -> requestBuilder.pushDownNested(List.of())), + () -> assertThrows(UnsupportedOperationException.class, + () -> requestBuilder.pushDownTrackedScore(true)) ); } diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchQueryRequestTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchQueryRequestTest.java index c6a9a06a70d..adb2a16a844 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchQueryRequestTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchQueryRequestTest.java @@ -16,6 +16,7 @@ import static org.mockito.Mockito.when; import static 
org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder.DEFAULT_QUERY_TIMEOUT; +import java.util.Iterator; import java.util.function.Consumer; import java.util.function.Function; import org.junit.jupiter.api.Test; @@ -29,6 +30,8 @@ import org.opensearch.search.SearchHit; import org.opensearch.search.SearchHits; import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.search.fetch.subphase.FetchSourceContext; +import org.opensearch.sql.data.model.ExprValue; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; import org.opensearch.sql.opensearch.response.OpenSearchResponse; @@ -53,6 +56,12 @@ public class OpenSearchQueryRequestTest { @Mock private SearchHit searchHit; + @Mock + private SearchSourceBuilder sourceBuilder; + + @Mock + private FetchSourceContext fetchSourceContext; + @Mock private OpenSearchExprValueFactory factory; @@ -61,17 +70,69 @@ public class OpenSearchQueryRequestTest { @Test void search() { + OpenSearchQueryRequest request = new OpenSearchQueryRequest( + new OpenSearchRequest.IndexName("test"), + sourceBuilder, + factory + ); + + when(sourceBuilder.fetchSource()).thenReturn(fetchSourceContext); + when(fetchSourceContext.includes()).thenReturn(null); when(searchAction.apply(any())).thenReturn(searchResponse); when(searchResponse.getHits()).thenReturn(searchHits); when(searchHits.getHits()).thenReturn(new SearchHit[] {searchHit}); OpenSearchResponse searchResponse = request.search(searchAction, scrollAction); + verify(fetchSourceContext, times(1)).includes(); assertFalse(searchResponse.isEmpty()); searchResponse = request.search(searchAction, scrollAction); assertTrue(searchResponse.isEmpty()); verify(searchAction, times(1)).apply(any()); } + @Test + void search_withoutContext() { + OpenSearchQueryRequest request = new OpenSearchQueryRequest( + new OpenSearchRequest.IndexName("test"), + sourceBuilder, + factory + ); + + when(sourceBuilder.fetchSource()).thenReturn(null); + 
when(searchAction.apply(any())).thenReturn(searchResponse); + when(searchResponse.getHits()).thenReturn(searchHits); + when(searchHits.getHits()).thenReturn(new SearchHit[] {searchHit}); + + OpenSearchResponse searchResponse = request.search(searchAction, scrollAction); + verify(sourceBuilder, times(1)).fetchSource(); + assertFalse(searchResponse.isEmpty()); + } + + @Test + void search_withIncludes() { + OpenSearchQueryRequest request = new OpenSearchQueryRequest( + new OpenSearchRequest.IndexName("test"), + sourceBuilder, + factory + ); + + String[] includes = {"_id", "_index"}; + when(sourceBuilder.fetchSource()).thenReturn(fetchSourceContext); + when(fetchSourceContext.includes()).thenReturn(includes); + when(searchAction.apply(any())).thenReturn(searchResponse); + when(searchResponse.getHits()).thenReturn(searchHits); + when(searchHits.getHits()).thenReturn(new SearchHit[] {searchHit}); + + OpenSearchResponse searchResponse = request.search(searchAction, scrollAction); + verify(fetchSourceContext, times(2)).includes(); + assertFalse(searchResponse.isEmpty()); + + searchResponse = request.search(searchAction, scrollAction); + assertTrue(searchResponse.isEmpty()); + + verify(searchAction, times(1)).apply(any()); + } + @Test void clean() { request.clean(cleanAction); diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilderTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilderTest.java index 49283e61b95..94433c29b96 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilderTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchRequestBuilderTest.java @@ -9,15 +9,19 @@ import static org.junit.jupiter.api.Assertions.assertEquals; import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; +import static org.opensearch.index.query.QueryBuilders.matchAllQuery; +import static 
org.opensearch.index.query.QueryBuilders.nestedQuery; import static org.opensearch.search.sort.FieldSortBuilder.DOC_FIELD_NAME; import static org.opensearch.search.sort.SortOrder.ASC; import static org.opensearch.sql.data.type.ExprCoreType.INTEGER; +import static org.opensearch.sql.data.type.ExprCoreType.STRING; import java.util.Collections; import java.util.List; import java.util.Map; import java.util.Set; import org.apache.commons.lang3.tuple.Pair; +import org.apache.lucene.search.join.ScoreMode; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.DisplayNameGeneration; import org.junit.jupiter.api.DisplayNameGenerator; @@ -26,24 +30,28 @@ import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; import org.opensearch.common.unit.TimeValue; +import org.opensearch.index.query.InnerHitBuilder; +import org.opensearch.index.query.NestedQueryBuilder; import org.opensearch.index.query.QueryBuilder; import org.opensearch.index.query.QueryBuilders; import org.opensearch.search.aggregations.AggregationBuilder; import org.opensearch.search.aggregations.AggregationBuilders; import org.opensearch.search.aggregations.bucket.composite.TermsValuesSourceBuilder; import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.search.fetch.subphase.FetchSourceContext; import org.opensearch.search.sort.FieldSortBuilder; import org.opensearch.search.sort.ScoreSortBuilder; import org.opensearch.search.sort.SortBuilders; import org.opensearch.sql.common.setting.Settings; -import org.opensearch.sql.data.type.ExprType; import org.opensearch.sql.expression.DSL; +import org.opensearch.sql.expression.NamedExpression; import org.opensearch.sql.expression.ReferenceExpression; import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; import org.opensearch.sql.opensearch.response.agg.CompositeAggregationParser; import 
org.opensearch.sql.opensearch.response.agg.OpenSearchAggregationResponseParser; import org.opensearch.sql.opensearch.response.agg.SingleValueParser; +import org.opensearch.sql.planner.logical.LogicalNested; @ExtendWith(MockitoExtension.class) @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) @@ -77,6 +85,7 @@ void build_query_request() { Integer limit = 200; Integer offset = 0; requestBuilder.pushDownLimit(limit, offset); + requestBuilder.pushDownTrackedScore(true); assertEquals( new OpenSearchQueryRequest( @@ -84,7 +93,8 @@ void build_query_request() { new SearchSourceBuilder() .from(offset) .size(limit) - .timeout(DEFAULT_QUERY_TIMEOUT), + .timeout(DEFAULT_QUERY_TIMEOUT) + .trackScores(true), exprValueFactory), requestBuilder.build()); } @@ -219,6 +229,106 @@ void test_push_down_project() { requestBuilder.getSourceBuilder()); } + @Test + void test_push_down_nested() { + List> args = List.of( + Map.of( + "field", new ReferenceExpression("message.info", STRING), + "path", new ReferenceExpression("message", STRING) + ) + ); + + List projectList = + List.of( + new NamedExpression("message.info", DSL.nested(DSL.ref("message.info", STRING)), null) + ); + + LogicalNested nested = new LogicalNested(null, args, projectList); + requestBuilder.pushDownNested(nested.getFields()); + + NestedQueryBuilder nestedQuery = nestedQuery("message", matchAllQuery(), ScoreMode.None) + .innerHit(new InnerHitBuilder().setFetchSourceContext( + new FetchSourceContext(true, new String[]{"message.info"}, null))); + + assertEquals( + new SearchSourceBuilder() + .query(QueryBuilders.boolQuery().filter(QueryBuilders.boolQuery().must(nestedQuery))) + .from(DEFAULT_OFFSET) + .size(DEFAULT_LIMIT) + .timeout(DEFAULT_QUERY_TIMEOUT), + requestBuilder.getSourceBuilder()); + } + + @Test + void test_push_down_multiple_nested_with_same_path() { + List> args = List.of( + Map.of( + "field", new ReferenceExpression("message.info", STRING), + "path", new ReferenceExpression("message", 
STRING) + ), + Map.of( + "field", new ReferenceExpression("message.from", STRING), + "path", new ReferenceExpression("message", STRING) + ) + ); + List projectList = + List.of( + new NamedExpression("message.info", DSL.nested(DSL.ref("message.info", STRING)), null), + new NamedExpression("message.from", DSL.nested(DSL.ref("message.from", STRING)), null) + ); + + LogicalNested nested = new LogicalNested(null, args, projectList); + requestBuilder.pushDownNested(nested.getFields()); + + NestedQueryBuilder nestedQuery = nestedQuery("message", matchAllQuery(), ScoreMode.None) + .innerHit(new InnerHitBuilder().setFetchSourceContext( + new FetchSourceContext(true, new String[]{"message.info", "message.from"}, null))); + assertEquals( + new SearchSourceBuilder() + .query(QueryBuilders.boolQuery().filter(QueryBuilders.boolQuery().must(nestedQuery))) + .from(DEFAULT_OFFSET) + .size(DEFAULT_LIMIT) + .timeout(DEFAULT_QUERY_TIMEOUT), + requestBuilder.getSourceBuilder()); + } + + @Test + void test_push_down_nested_with_filter() { + List> args = List.of( + Map.of( + "field", new ReferenceExpression("message.info", STRING), + "path", new ReferenceExpression("message", STRING) + ) + ); + + List projectList = + List.of( + new NamedExpression("message.info", DSL.nested(DSL.ref("message.info", STRING)), null) + ); + + LogicalNested nested = new LogicalNested(null, args, projectList); + requestBuilder.getSourceBuilder().query(QueryBuilders.rangeQuery("myNum").gt(3)); + requestBuilder.pushDownNested(nested.getFields()); + + NestedQueryBuilder nestedQuery = nestedQuery("message", matchAllQuery(), ScoreMode.None) + .innerHit(new InnerHitBuilder().setFetchSourceContext( + new FetchSourceContext(true, new String[]{"message.info"}, null))); + + assertEquals( + new SearchSourceBuilder() + .query( + QueryBuilders.boolQuery().filter( + QueryBuilders.boolQuery() + .must(QueryBuilders.rangeQuery("myNum").gt(3)) + .must(nestedQuery) + ) + ) + .from(DEFAULT_OFFSET) + .size(DEFAULT_LIMIT) + 
.timeout(DEFAULT_QUERY_TIMEOUT), + requestBuilder.getSourceBuilder()); + } + @Test void test_push_type_mapping() { Map typeMapping = Map.of("intA", OpenSearchDataType.of(INTEGER)); diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequestTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequestTest.java index 3ad6ad226db..d2cfc89d830 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequestTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/request/OpenSearchScrollRequestTest.java @@ -11,10 +11,14 @@ import static org.junit.jupiter.api.Assertions.assertNull; import static org.junit.jupiter.api.Assertions.assertTrue; import static org.junit.jupiter.api.Assertions.fail; +import static org.mockito.ArgumentMatchers.any; import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.times; +import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; import java.util.concurrent.atomic.AtomicBoolean; +import java.util.function.Function; import org.apache.lucene.search.TotalHits; import org.junit.jupiter.api.DisplayNameGeneration; import org.junit.jupiter.api.DisplayNameGenerator; @@ -30,12 +34,34 @@ import org.opensearch.search.SearchHit; import org.opensearch.search.SearchHits; import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.search.fetch.subphase.FetchSourceContext; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; +import org.opensearch.sql.opensearch.response.OpenSearchResponse; @ExtendWith(MockitoExtension.class) @DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) class OpenSearchScrollRequestTest { + @Mock + private Function searchAction; + + @Mock + private Function scrollAction; + + @Mock + private SearchResponse searchResponse; + + @Mock + private SearchHits searchHits; + + @Mock + private SearchHit searchHit; + + 
@Mock + private SearchSourceBuilder sourceBuilder; + + @Mock + private FetchSourceContext fetchSourceContext; @Mock private OpenSearchExprValueFactory factory; @@ -76,6 +102,66 @@ void scrollRequest() { request.scrollRequest()); } + @Test + void search() { + OpenSearchScrollRequest request = new OpenSearchScrollRequest( + new OpenSearchRequest.IndexName("test"), + TimeValue.timeValueMinutes(1), + sourceBuilder, + factory + ); + + String[] includes = {"_id", "_index"}; + when(sourceBuilder.fetchSource()).thenReturn(fetchSourceContext); + when(fetchSourceContext.includes()).thenReturn(includes); + when(searchAction.apply(any())).thenReturn(searchResponse); + when(searchResponse.getHits()).thenReturn(searchHits); + when(searchHits.getHits()).thenReturn(new SearchHit[] {searchHit}); + + OpenSearchResponse searchResponse = request.search(searchAction, scrollAction); + verify(fetchSourceContext, times(2)).includes(); + assertFalse(searchResponse.isEmpty()); + } + + @Test + void search_withoutContext() { + OpenSearchScrollRequest request = new OpenSearchScrollRequest( + new OpenSearchRequest.IndexName("test"), + TimeValue.timeValueMinutes(1), + sourceBuilder, + factory + ); + + when(sourceBuilder.fetchSource()).thenReturn(null); + when(searchAction.apply(any())).thenReturn(searchResponse); + when(searchResponse.getHits()).thenReturn(searchHits); + when(searchHits.getHits()).thenReturn(new SearchHit[] {searchHit}); + + OpenSearchResponse searchResponse = request.search(searchAction, scrollAction); + verify(sourceBuilder, times(1)).fetchSource(); + assertFalse(searchResponse.isEmpty()); + } + + @Test + void search_withoutIncludes() { + OpenSearchScrollRequest request = new OpenSearchScrollRequest( + new OpenSearchRequest.IndexName("test"), + TimeValue.timeValueMinutes(1), + sourceBuilder, + factory + ); + + when(sourceBuilder.fetchSource()).thenReturn(fetchSourceContext); + when(fetchSourceContext.includes()).thenReturn(null); + 
when(searchAction.apply(any())).thenReturn(searchResponse); + when(searchResponse.getHits()).thenReturn(searchHits); + when(searchHits.getHits()).thenReturn(new SearchHit[]{searchHit}); + + OpenSearchResponse searchResponse = request.search(searchAction, scrollAction); + verify(fetchSourceContext, times(1)).includes(); + assertFalse(searchResponse.isEmpty()); + } + @Test void toCursor() { request.setScrollId("scroll123"); diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/response/OpenSearchResponseTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/response/OpenSearchResponseTest.java index 2d1d6145f3f..8add6c8c856 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/response/OpenSearchResponseTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/response/OpenSearchResponseTest.java @@ -17,6 +17,7 @@ import com.google.common.collect.ImmutableMap; import java.util.Arrays; +import java.util.List; import java.util.Map; import java.util.stream.Collectors; import org.apache.lucene.search.TotalHits; @@ -31,7 +32,10 @@ import org.opensearch.search.SearchHits; import org.opensearch.search.aggregations.Aggregations; import org.opensearch.search.fetch.subphase.highlight.HighlightField; +import org.opensearch.sql.data.model.ExprFloatValue; import org.opensearch.sql.data.model.ExprIntegerValue; +import org.opensearch.sql.data.model.ExprLongValue; +import org.opensearch.sql.data.model.ExprStringValue; import org.opensearch.sql.data.model.ExprTupleValue; import org.opensearch.sql.data.model.ExprValue; import org.opensearch.sql.data.model.ExprValueUtils; @@ -56,6 +60,8 @@ class OpenSearchResponseTest { @Mock private Aggregations aggregations; + private List includes = List.of(); + @Mock private OpenSearchAggregationResponseParser parser; @@ -74,27 +80,27 @@ void isEmpty() { new TotalHits(2L, TotalHits.Relation.EQUAL_TO), 1.0F)); - var response = new OpenSearchResponse(searchResponse, factory); + var response = new 
OpenSearchResponse(searchResponse, factory, includes); assertFalse(response.isEmpty()); assertEquals(2L, response.getTotalHits()); when(searchResponse.getHits()).thenReturn(SearchHits.empty()); when(searchResponse.getAggregations()).thenReturn(null); - response = new OpenSearchResponse(searchResponse, factory); + response = new OpenSearchResponse(searchResponse, factory, includes); assertTrue(response.isEmpty()); assertEquals(0L, response.getTotalHits()); when(searchResponse.getHits()) .thenReturn(new SearchHits(null, new TotalHits(0, TotalHits.Relation.EQUAL_TO), 0)); - response = new OpenSearchResponse(searchResponse, factory); + response = new OpenSearchResponse(searchResponse, factory, includes); assertTrue(response.isEmpty()); assertEquals(0L, response.getTotalHits()); when(searchResponse.getHits()).thenReturn(SearchHits.empty()); when(searchResponse.getAggregations()).thenReturn(new Aggregations(emptyList())); - response = new OpenSearchResponse(searchResponse, factory); + response = new OpenSearchResponse(searchResponse, factory, includes); assertFalse(response.isEmpty()); assertEquals(0L, response.getTotalHits()); } @@ -110,28 +116,168 @@ void iterator() { when(searchHit1.getSourceAsString()).thenReturn("{\"id1\", 1}"); when(searchHit2.getSourceAsString()).thenReturn("{\"id1\", 2}"); + when(searchHit1.getInnerHits()).thenReturn(null); + when(searchHit2.getInnerHits()).thenReturn(null); when(factory.construct(any())).thenReturn(exprTupleValue1).thenReturn(exprTupleValue2); int i = 0; - var response = new OpenSearchResponse(searchResponse, factory); - for (ExprValue hit : response) { + for (ExprValue hit : new OpenSearchResponse(searchResponse, factory, List.of("id1"))) { if (i == 0) { - assertEquals(exprTupleValue1, hit); + assertEquals(exprTupleValue1.tupleValue().get("id"), hit.tupleValue().get("id")); } else if (i == 1) { - assertEquals(exprTupleValue2, hit); + assertEquals(exprTupleValue2.tupleValue().get("id"), hit.tupleValue().get("id")); + } else { + 
fail("More search hits returned than expected"); + } + i++; + } + } + + @Test + void iterator_metafields() { + + ExprTupleValue exprTupleHit = ExprTupleValue.fromExprValueMap(ImmutableMap.of( + "id1", new ExprIntegerValue(1) + )); + + when(searchResponse.getHits()) + .thenReturn( + new SearchHits( + new SearchHit[] {searchHit1}, + new TotalHits(1L, TotalHits.Relation.EQUAL_TO), + 3.75F)); + + when(searchHit1.getSourceAsString()).thenReturn("{\"id1\", 1}"); + when(searchHit1.getId()).thenReturn("testId"); + when(searchHit1.getIndex()).thenReturn("testIndex"); + when(searchHit1.getScore()).thenReturn(3.75F); + when(searchHit1.getSeqNo()).thenReturn(123456L); + + when(factory.construct(any())).thenReturn(exprTupleHit); + + ExprTupleValue exprTupleResponse = ExprTupleValue.fromExprValueMap(ImmutableMap.of( + "id1", new ExprIntegerValue(1), + "_index", new ExprStringValue("testIndex"), + "_id", new ExprStringValue("testId"), + "_sort", new ExprLongValue(123456L), + "_score", new ExprFloatValue(3.75F), + "_maxscore", new ExprFloatValue(3.75F) + )); + List includes = List.of("id1", "_index", "_id", "_sort", "_score", "_maxscore"); + int i = 0; + for (ExprValue hit : new OpenSearchResponse(searchResponse, factory, includes)) { + if (i == 0) { + assertEquals(exprTupleResponse, hit); } else { fail("More search hits returned than expected"); } i++; } - assertEquals(2L, response.getTotalHits()); + } + + @Test + void iterator_metafields_withoutIncludes() { + + ExprTupleValue exprTupleHit = ExprTupleValue.fromExprValueMap(ImmutableMap.of( + "id1", new ExprIntegerValue(1) + )); + + when(searchResponse.getHits()) + .thenReturn( + new SearchHits( + new SearchHit[] {searchHit1}, + new TotalHits(1L, TotalHits.Relation.EQUAL_TO), + 3.75F)); + + when(searchHit1.getSourceAsString()).thenReturn("{\"id1\", 1}"); + + when(factory.construct(any())).thenReturn(exprTupleHit); + + List includes = List.of("id1"); + ExprTupleValue exprTupleResponse = 
ExprTupleValue.fromExprValueMap(ImmutableMap.of( + "id1", new ExprIntegerValue(1) + )); + int i = 0; + for (ExprValue hit : new OpenSearchResponse(searchResponse, factory, includes)) { + if (i == 0) { + assertEquals(exprTupleResponse, hit); + } else { + fail("More search hits returned than expected"); + } + i++; + } + } + + @Test + void iterator_metafields_scoreNaN() { + + ExprTupleValue exprTupleHit = ExprTupleValue.fromExprValueMap(ImmutableMap.of( + "id1", new ExprIntegerValue(1) + )); + + when(searchResponse.getHits()) + .thenReturn( + new SearchHits( + new SearchHit[] {searchHit1}, + new TotalHits(1L, TotalHits.Relation.EQUAL_TO), + Float.NaN)); + + when(searchHit1.getSourceAsString()).thenReturn("{\"id1\", 1}"); + when(searchHit1.getId()).thenReturn("testId"); + when(searchHit1.getIndex()).thenReturn("testIndex"); + when(searchHit1.getScore()).thenReturn(Float.NaN); + when(searchHit1.getSeqNo()).thenReturn(123456L); + + when(factory.construct(any())).thenReturn(exprTupleHit); + + List includes = List.of("id1", "_index", "_id", "_sort", "_score", "_maxscore"); + ExprTupleValue exprTupleResponse = ExprTupleValue.fromExprValueMap(ImmutableMap.of( + "id1", new ExprIntegerValue(1), + "_index", new ExprStringValue("testIndex"), + "_id", new ExprStringValue("testId"), + "_sort", new ExprLongValue(123456L) + )); + int i = 0; + for (ExprValue hit : new OpenSearchResponse(searchResponse, factory, includes)) { + if (i == 0) { + assertEquals(exprTupleResponse, hit); + } else { + fail("More search hits returned than expected"); + } + i++; + } + } + + @Test + void iterator_with_inner_hits() { + when(searchResponse.getHits()) + .thenReturn( + new SearchHits( + new SearchHit[] {searchHit1}, + new TotalHits(2L, TotalHits.Relation.EQUAL_TO), + 1.0F)); + when(searchHit1.getSourceAsString()).thenReturn("{\"id1\", 1}"); + when(searchHit1.getSourceAsMap()).thenReturn(Map.of("id1", 1)); + when(searchHit1.getInnerHits()).thenReturn( + Map.of( + "innerHit", + new SearchHits( + new 
SearchHit[] {searchHit1}, + new TotalHits(2L, TotalHits.Relation.EQUAL_TO), + 1.0F))); + + when(factory.construct(any())).thenReturn(exprTupleValue1); + + for (ExprValue hit : new OpenSearchResponse(searchResponse, factory, includes)) { + assertEquals(exprTupleValue1, hit); + } } @Test void response_is_aggregation_when_aggregation_not_empty() { when(searchResponse.getAggregations()).thenReturn(aggregations); - OpenSearchResponse response = new OpenSearchResponse(searchResponse, factory); + OpenSearchResponse response = new OpenSearchResponse(searchResponse, factory, includes); assertTrue(response.isAggregationResponse()); } @@ -139,12 +285,14 @@ void response_is_aggregation_when_aggregation_not_empty() { void response_isnot_aggregation_when_aggregation_is_empty() { when(searchResponse.getAggregations()).thenReturn(null); - OpenSearchResponse response = new OpenSearchResponse(searchResponse, factory); + OpenSearchResponse response = new OpenSearchResponse(searchResponse, factory, includes); assertFalse(response.isAggregationResponse()); } @Test void aggregation_iterator() { + final List includes = List.of("id1", "id2"); + when(parser.parse(any())) .thenReturn(Arrays.asList(ImmutableMap.of("id1", 1), ImmutableMap.of("id2", 2))); when(searchResponse.getAggregations()).thenReturn(aggregations); @@ -154,7 +302,7 @@ void aggregation_iterator() { .thenReturn(new ExprIntegerValue(2)); int i = 0; - for (ExprValue hit : new OpenSearchResponse(searchResponse, factory)) { + for (ExprValue hit : new OpenSearchResponse(searchResponse, factory, includes)) { if (i == 0) { assertEquals(exprTupleValue1, hit); } else if (i == 1) { @@ -187,7 +335,7 @@ void highlight_iterator() { when(searchHit1.getHighlightFields()).thenReturn(highlightMap); when(factory.construct(any())).thenReturn(resultTuple); - for (ExprValue resultHit : new OpenSearchResponse(searchResponse, factory)) { + for (ExprValue resultHit : new OpenSearchResponse(searchResponse, factory, includes)) { var expected = 
ExprValueUtils.collectionValue( Arrays.stream(searchHit.getHighlightFields().get("highlights").getFragments()) .map(t -> (t.toString())).collect(Collectors.toList())); diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexTest.java index 7181bd5e565..2ff1de862b1 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/OpenSearchIndexTest.java @@ -191,6 +191,21 @@ void checkCacheUsedForFieldMappings() { hasEntry("name", OpenSearchDataType.of(STRING)))); } + @Test + void getReservedFieldTypes() { + Map fieldTypes = index.getReservedFieldTypes(); + assertThat( + fieldTypes, + allOf( + aMapWithSize(5), + hasEntry("_id", ExprCoreType.STRING), + hasEntry("_index", ExprCoreType.STRING), + hasEntry("_sort", ExprCoreType.LONG), + hasEntry("_score", ExprCoreType.FLOAT), + hasEntry("_maxscore", ExprCoreType.FLOAT) + )); + } + @Test void implementRelationOperatorOnly() { when(settings.getSettingValue(Settings.Key.QUERY_SIZE_LIMIT)).thenReturn(200); @@ -200,10 +215,8 @@ void implementRelationOperatorOnly() { LogicalPlan plan = index.createScanBuilder(); Integer maxResultWindow = index.getMaxResultWindow(); - OpenSearchRequestBuilder - builder = new OpenSearchRequestBuilder(indexName, maxResultWindow, - settings, exprValueFactory); - assertEquals(new OpenSearchIndexScan(client, builder), index.implement(plan)); + assertEquals(new OpenSearchIndexScan(client, settings, indexName, + maxResultWindow, exprValueFactory), index.implement(index.optimize(plan))); } @Test @@ -229,12 +242,8 @@ void implementRelationOperatorWithOptimization() { LogicalPlan plan = index.createScanBuilder(); Integer maxResultWindow = index.getMaxResultWindow(); - OpenSearchRequestBuilder - builder = new OpenSearchRequestBuilder(indexName, maxResultWindow, - settings, 
exprValueFactory); - assertEquals( - new OpenSearchIndexScan(client, builder), - index.implement(index.optimize(plan))); + assertEquals(new OpenSearchIndexScan(client, settings, indexName, + maxResultWindow, exprValueFactory), index.implement(plan)); } @Test @@ -283,10 +292,8 @@ void implementOtherLogicalOperators() { PhysicalPlanDSL.eval( PhysicalPlanDSL.remove( PhysicalPlanDSL.rename( - new OpenSearchIndexScan(client, - new OpenSearchRequestBuilder( - indexName, maxResultWindow, - settings, exprValueFactory)), + new OpenSearchIndexScan(client, settings, indexName, + maxResultWindow, exprValueFactory), mappings), exclude), newEvalField), diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanOptimizationTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanOptimizationTest.java index 5c125ebc65a..bde940a939e 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanOptimizationTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanOptimizationTest.java @@ -7,6 +7,7 @@ package org.opensearch.sql.opensearch.storage.scan; import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.mockito.ArgumentMatchers.eq; import static org.mockito.Mockito.reset; import static org.mockito.Mockito.times; import static org.mockito.Mockito.verify; @@ -22,6 +23,7 @@ import static org.opensearch.sql.planner.logical.LogicalPlanDSL.filter; import static org.opensearch.sql.planner.logical.LogicalPlanDSL.highlight; import static org.opensearch.sql.planner.logical.LogicalPlanDSL.limit; +import static org.opensearch.sql.planner.logical.LogicalPlanDSL.nested; import static org.opensearch.sql.planner.logical.LogicalPlanDSL.project; import static org.opensearch.sql.planner.logical.LogicalPlanDSL.relation; import static org.opensearch.sql.planner.logical.LogicalPlanDSL.sort; @@ -29,13 +31,16 @@ 
import static org.opensearch.sql.planner.optimizer.rule.read.TableScanPushDown.PUSH_DOWN_FILTER; import static org.opensearch.sql.planner.optimizer.rule.read.TableScanPushDown.PUSH_DOWN_HIGHLIGHT; import static org.opensearch.sql.planner.optimizer.rule.read.TableScanPushDown.PUSH_DOWN_LIMIT; +import static org.opensearch.sql.planner.optimizer.rule.read.TableScanPushDown.PUSH_DOWN_NESTED; import static org.opensearch.sql.planner.optimizer.rule.read.TableScanPushDown.PUSH_DOWN_PROJECT; import static org.opensearch.sql.planner.optimizer.rule.read.TableScanPushDown.PUSH_DOWN_SORT; import com.google.common.collect.ImmutableList; +import com.google.common.collect.ImmutableMap; import java.util.Arrays; import java.util.Collections; import java.util.HashSet; +import java.util.LinkedHashMap; import java.util.List; import java.util.Map; import java.util.stream.Collectors; @@ -48,6 +53,7 @@ import org.mockito.junit.jupiter.MockitoExtension; import org.opensearch.index.query.QueryBuilder; import org.opensearch.index.query.QueryBuilders; +import org.opensearch.index.query.SpanOrQueryBuilder; import org.opensearch.search.aggregations.AggregationBuilder; import org.opensearch.search.aggregations.AggregationBuilders; import org.opensearch.search.aggregations.bucket.composite.CompositeAggregationBuilder; @@ -57,16 +63,24 @@ import org.opensearch.search.sort.SortOrder; import org.opensearch.sql.ast.expression.Literal; import org.opensearch.sql.ast.tree.Sort.SortOption; +import org.opensearch.sql.data.model.ExprTupleValue; +import org.opensearch.sql.data.model.ExprValueUtils; +import org.opensearch.sql.data.type.ExprCoreType; import org.opensearch.sql.data.type.ExprType; import org.opensearch.sql.expression.DSL; +import org.opensearch.sql.expression.FunctionExpression; import org.opensearch.sql.expression.HighlightExpression; +import org.opensearch.sql.expression.NamedExpression; import org.opensearch.sql.expression.ReferenceExpression; +import 
org.opensearch.sql.expression.function.OpenSearchFunctions; import org.opensearch.sql.opensearch.data.type.OpenSearchDataType; import org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder; import org.opensearch.sql.opensearch.response.agg.CompositeAggregationParser; import org.opensearch.sql.opensearch.response.agg.OpenSearchAggregationResponseParser; import org.opensearch.sql.opensearch.response.agg.SingleValueParser; import org.opensearch.sql.opensearch.storage.script.aggregation.AggregationQueryBuilder; +import org.opensearch.sql.planner.logical.LogicalFilter; +import org.opensearch.sql.planner.logical.LogicalNested; import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.optimizer.LogicalPlanOptimizer; import org.opensearch.sql.planner.optimizer.rule.read.CreateTableScanBuilder; @@ -133,6 +147,119 @@ void test_filter_push_down() { ); } + /** + * SELECT intV as i FROM schema WHERE query_string(["intV^1.5", "QUERY", boost=12.5). + */ + @Test + void test_filter_on_opensearchfunction_with_trackedscores_push_down() { + LogicalPlan expectedPlan = + project( + indexScanBuilder( + withFilterPushedDown( + QueryBuilders.queryStringQuery("QUERY") + .field("intV", 1.5F) + .boost(12.5F) + ), + withTrackedScoresPushedDown(true) + ), + DSL.named("i", DSL.ref("intV", INTEGER)) + ); + FunctionExpression queryString = DSL.query_string( + DSL.namedArgument("fields", DSL.literal( + new ExprTupleValue(new LinkedHashMap<>(ImmutableMap.of( + "intV", ExprValueUtils.floatValue(1.5F)))))), + DSL.namedArgument("query", "QUERY"), + DSL.namedArgument("boost", "12.5")); + + ((OpenSearchFunctions.OpenSearchFunction) queryString).setScoreTracked(true); + + LogicalPlan logicalPlan = project( + filter( + relation("schema", table), + queryString + ), + DSL.named("i", DSL.ref("intV", INTEGER)) + ); + assertEqualsAfterOptimization(expectedPlan, logicalPlan); + } + + @Test + void test_filter_on_multiple_opensearchfunctions_with_trackedscores_push_down() { + 
LogicalPlan expectedPlan = + project( + indexScanBuilder( + withFilterPushedDown( + QueryBuilders.boolQuery() + .should( + QueryBuilders.queryStringQuery("QUERY") + .field("intV", 1.5F) + .boost(12.5F)) + .should( + QueryBuilders.queryStringQuery("QUERY") + .field("intV", 1.5F) + .boost(12.5F) + ) + ), + withTrackedScoresPushedDown(true) + ), + DSL.named("i", DSL.ref("intV", INTEGER)) + ); + FunctionExpression firstQueryString = DSL.query_string( + DSL.namedArgument("fields", DSL.literal( + new ExprTupleValue(new LinkedHashMap<>(ImmutableMap.of( + "intV", ExprValueUtils.floatValue(1.5F)))))), + DSL.namedArgument("query", "QUERY"), + DSL.namedArgument("boost", "12.5")); + ((OpenSearchFunctions.OpenSearchFunction) firstQueryString).setScoreTracked(false); + FunctionExpression secondQueryString = DSL.query_string( + DSL.namedArgument("fields", DSL.literal( + new ExprTupleValue(new LinkedHashMap<>(ImmutableMap.of( + "intV", ExprValueUtils.floatValue(1.5F)))))), + DSL.namedArgument("query", "QUERY"), + DSL.namedArgument("boost", "12.5")); + ((OpenSearchFunctions.OpenSearchFunction) secondQueryString).setScoreTracked(true); + + LogicalPlan logicalPlan = project( + filter( + relation("schema", table), + DSL.or(firstQueryString, secondQueryString) + ), + DSL.named("i", DSL.ref("intV", INTEGER)) + ); + assertEqualsAfterOptimization(expectedPlan, logicalPlan); + } + + @Test + void test_filter_on_opensearchfunction_without_trackedscores_push_down() { + LogicalPlan expectedPlan = + project( + indexScanBuilder( + withFilterPushedDown( + QueryBuilders.queryStringQuery("QUERY") + .field("intV", 1.5F) + .boost(12.5F) + ), + withTrackedScoresPushedDown(false) + ), + DSL.named("i", DSL.ref("intV", INTEGER)) + ); + FunctionExpression queryString = DSL.query_string( + DSL.namedArgument("fields", DSL.literal( + new ExprTupleValue(new LinkedHashMap<>(ImmutableMap.of( + "intV", ExprValueUtils.floatValue(1.5F)))))), + DSL.namedArgument("query", "QUERY"), + DSL.namedArgument("boost", 
"12.5")); + + LogicalPlan logicalPlan = project( + filter( + relation("schema", table), + queryString + ), + DSL.named("i", DSL.ref("intV", INTEGER)) + ); + assertEqualsAfterOptimization(expectedPlan, logicalPlan); + } + /** * SELECT avg(intV) FROM schema GROUP BY string_value. */ @@ -209,6 +336,21 @@ void test_sort_push_down() { ); } + @Test + void test_score_sort_push_down() { + assertEqualsAfterOptimization( + indexScanBuilder( + withSortPushedDown( + SortBuilders.scoreSort().order(SortOrder.ASC) + ) + ), + sort( + relation("schema", table), + Pair.of(SortOption.DEFAULT_ASC, DSL.ref("_score", INTEGER)) + ) + ); + } + @Test void test_limit_push_down() { assertEqualsAfterOptimization( @@ -245,6 +387,39 @@ void test_highlight_push_down() { ); } + @Test + void test_nested_push_down() { + List> args = List.of( + Map.of( + "field", new ReferenceExpression("message.info", STRING), + "path", new ReferenceExpression("message", STRING) + ) + ); + + List projectList = + List.of( + new NamedExpression("message.info", DSL.nested(DSL.ref("message.info", STRING)), null) + ); + + LogicalNested nested = new LogicalNested(null, args, projectList); + + assertEqualsAfterOptimization( + project( + nested( + indexScanBuilder( + withNestedPushedDown(nested.getFields())), args, projectList), + DSL.named("message.info", + DSL.nested(DSL.ref("message.info", STRING))) + ), + project( + nested( + relation("schema", table), args, projectList), + DSL.named("message.info", + DSL.nested(DSL.ref("message.info", STRING))) + ) + ); + } + /** * SELECT avg(intV) FROM schema WHERE intV = 1 GROUP BY string_value. 
*/ @@ -576,6 +751,14 @@ private Runnable withHighlightPushedDown(String field, Map argu return () -> verify(requestBuilder, times(1)).pushDownHighlight(field, arguments); } + private Runnable withNestedPushedDown(List> fields) { + return () -> verify(requestBuilder, times(1)).pushDownNested(fields); + } + + private Runnable withTrackedScoresPushedDown(boolean trackScores) { + return () -> verify(requestBuilder, times(1)).pushDownTrackedScore(trackScores); + } + private static AggregationAssertHelper.AggregationAssertHelperBuilder aggregate(String aggName) { var aggBuilder = new AggregationAssertHelper.AggregationAssertHelperBuilder(); aggBuilder.aggregateName = aggName; @@ -606,6 +789,7 @@ private LogicalPlan optimize(LogicalPlan plan) { PUSH_DOWN_SORT, PUSH_DOWN_LIMIT, PUSH_DOWN_HIGHLIGHT, + PUSH_DOWN_NESTED, PUSH_DOWN_PROJECT)); return optimizer.optimize(plan); } diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanTest.java index c133897ca25..c788e78f1a4 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/storage/scan/OpenSearchIndexScanTest.java @@ -75,8 +75,8 @@ void setup() { @Test void query_empty_result() { mockResponse(client); - try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, - new OpenSearchRequestBuilder("test", 3, settings, exprValueFactory))) { + try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, settings, + "test", 3, exprValueFactory)) { indexScan.open(); assertAll( () -> assertFalse(indexScan.hasNext()), @@ -93,8 +93,8 @@ void query_all_results_with_query() { employee(2, "Smith", "HR"), employee(3, "Allen", "IT")}); - try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, - new OpenSearchRequestBuilder("employees", 10, settings, 
exprValueFactory))) { + try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, settings, + "employees", 10, exprValueFactory)) { indexScan.open(); assertAll( @@ -119,10 +119,9 @@ void query_all_results_with_scroll() { mockResponse(client, new ExprValue[]{employee(1, "John", "IT"), employee(2, "Smith", "HR")}, new ExprValue[]{employee(3, "Allen", "IT")}); - //when(settings.getSettingValue(Settings.Key.QUERY_SIZE_LIMIT)).thenReturn(2); - try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, - new OpenSearchRequestBuilder("employees", 10, settings, exprValueFactory))) { + try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, settings, + "employees", 10, exprValueFactory)) { indexScan.open(); assertAll( @@ -150,8 +149,8 @@ void query_some_results_with_query() { employee(3, "Allen", "IT"), employee(4, "Bob", "HR")}); - try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, - new OpenSearchRequestBuilder("employees", 10, settings, exprValueFactory))) { + try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, settings, + "employees", 10, exprValueFactory)) { indexScan.getRequestBuilder().pushDownLimit(3, 0); indexScan.open(); @@ -178,9 +177,8 @@ void query_some_results_with_scroll() { new ExprValue[]{employee(1, "John", "IT"), employee(2, "Smith", "HR")}, new ExprValue[]{employee(3, "Allen", "IT"), employee(4, "Bob", "HR")}); - try (OpenSearchIndexScan indexScan = - new OpenSearchIndexScan(client, new OpenSearchRequestBuilder("employees", 2, settings, - exprValueFactory))) { + try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, settings, + "employees", 2, exprValueFactory)) { indexScan.getRequestBuilder().pushDownLimit(3, 0); indexScan.open(); @@ -210,8 +208,8 @@ void query_results_limited_by_query_size() { employee(4, "Bob", "HR")}); when(settings.getSettingValue(Settings.Key.QUERY_SIZE_LIMIT)).thenReturn(2); - try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, - 
new OpenSearchRequestBuilder("employees", 10, settings, exprValueFactory))) { + try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, settings, + "employees", 10, exprValueFactory)) { indexScan.open(); assertAll( @@ -278,9 +276,8 @@ void push_down_highlight_with_repeating_fields() { new ExprValue[]{employee(1, "John", "IT"), employee(2, "Smith", "HR")}, new ExprValue[]{employee(3, "Allen", "IT"), employee(4, "Bob", "HR")}); - try (OpenSearchIndexScan indexScan = - new OpenSearchIndexScan(client, new OpenSearchRequestBuilder("test", 2, settings, - exprValueFactory))) { + try (OpenSearchIndexScan indexScan = new OpenSearchIndexScan(client, settings, + "test", 2, exprValueFactory)) { indexScan.getRequestBuilder().pushDownLimit(3, 0); indexScan.open(); Map args = new HashMap<>(); @@ -306,9 +303,8 @@ public PushDownAssertion(OpenSearchClient client, OpenSearchExprValueFactory valueFactory, Settings settings) { this.client = client; - this.indexScan = new OpenSearchIndexScan(client, - new OpenSearchRequestBuilder("test", 10000, - settings, valueFactory)); + this.indexScan = new OpenSearchIndexScan(client, settings, + "test", 10000, valueFactory); this.response = mock(OpenSearchResponse.class); this.factory = valueFactory; when(response.isEmpty()).thenReturn(true); diff --git a/plugin/build.gradle b/plugin/build.gradle index cb9ab64d7be..4a6c175d614 100644 --- a/plugin/build.gradle +++ b/plugin/build.gradle @@ -122,14 +122,12 @@ dependencies { api "com.fasterxml.jackson.core:jackson-core:${versions.jackson}" api "com.fasterxml.jackson.core:jackson-databind:${versions.jackson_databind}" api "com.fasterxml.jackson.core:jackson-annotations:${versions.jackson}" - api group: 'commons-io', name: 'commons-io', version: '2.8.0' - implementation group: 'org.opensearch', name: 'opensearch-x-content', version: "${opensearch_version}" - implementation group: 'org.opensearch', name: 'common-utils', version: "${opensearch_build}" api project(":ppl") api 
project(':legacy') api project(':opensearch') api project(':prometheus') + api project(':datasources') testImplementation group: 'net.bytebuddy', name: 'byte-buddy-agent', version: '1.12.13' testImplementation group: 'org.hamcrest', name: 'hamcrest-library', version: '2.1' diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/SQLPlugin.java b/plugin/src/main/java/org/opensearch/sql/plugin/SQLPlugin.java index 3d733233be5..fcb66e2e434 100644 --- a/plugin/src/main/java/org/opensearch/sql/plugin/SQLPlugin.java +++ b/plugin/src/main/java/org/opensearch/sql/plugin/SQLPlugin.java @@ -46,11 +46,23 @@ import org.opensearch.script.ScriptContext; import org.opensearch.script.ScriptEngine; import org.opensearch.script.ScriptService; -import org.opensearch.sql.common.encryptor.EncryptorImpl; -import org.opensearch.sql.datasource.DataSourceMetadataStorage; import org.opensearch.sql.datasource.DataSourceService; -import org.opensearch.sql.datasource.DataSourceServiceImpl; -import org.opensearch.sql.datasource.DataSourceUserAuthorizationHelper; +import org.opensearch.sql.datasources.auth.DataSourceUserAuthorizationHelper; +import org.opensearch.sql.datasources.auth.DataSourceUserAuthorizationHelperImpl; +import org.opensearch.sql.datasources.encryptor.EncryptorImpl; +import org.opensearch.sql.datasources.model.transport.CreateDataSourceActionResponse; +import org.opensearch.sql.datasources.model.transport.DeleteDataSourceActionResponse; +import org.opensearch.sql.datasources.model.transport.GetDataSourceActionResponse; +import org.opensearch.sql.datasources.model.transport.UpdateDataSourceActionResponse; +import org.opensearch.sql.datasources.rest.RestDataSourceQueryAction; +import org.opensearch.sql.datasources.service.DataSourceMetadataStorage; +import org.opensearch.sql.datasources.service.DataSourceServiceImpl; +import org.opensearch.sql.datasources.settings.DataSourceSettings; +import org.opensearch.sql.datasources.storage.OpenSearchDataSourceMetadataStorage; +import 
org.opensearch.sql.datasources.transport.TransportCreateDataSourceAction; +import org.opensearch.sql.datasources.transport.TransportDeleteDataSourceAction; +import org.opensearch.sql.datasources.transport.TransportGetDataSourceAction; +import org.opensearch.sql.datasources.transport.TransportUpdateDataSourceAction; import org.opensearch.sql.legacy.esdomain.LocalClusterState; import org.opensearch.sql.legacy.executor.AsyncRestExecutor; import org.opensearch.sql.legacy.metrics.Metrics; @@ -63,18 +75,12 @@ import org.opensearch.sql.opensearch.storage.script.ExpressionScriptEngine; import org.opensearch.sql.opensearch.storage.serialization.DefaultExpressionSerializer; import org.opensearch.sql.plugin.config.OpenSearchPluginModule; -import org.opensearch.sql.plugin.datasource.DataSourceSettings; -import org.opensearch.sql.plugin.datasource.DataSourceUserAuthorizationHelperImpl; -import org.opensearch.sql.plugin.datasource.OpenSearchDataSourceMetadataStorage; -import org.opensearch.sql.plugin.model.CreateDataSourceActionResponse; -import org.opensearch.sql.plugin.rest.RestDataSourceQueryAction; import org.opensearch.sql.plugin.rest.RestPPLQueryAction; import org.opensearch.sql.plugin.rest.RestPPLStatsAction; import org.opensearch.sql.plugin.rest.RestQuerySettingsAction; import org.opensearch.sql.plugin.transport.PPLQueryAction; import org.opensearch.sql.plugin.transport.TransportPPLQueryAction; import org.opensearch.sql.plugin.transport.TransportPPLQueryResponse; -import org.opensearch.sql.plugin.transport.datasource.TransportCreateDataSourceAction; import org.opensearch.sql.prometheus.storage.PrometheusStorageFactory; import org.opensearch.sql.storage.DataSourceFactory; import org.opensearch.threadpool.ExecutorBuilder; @@ -91,7 +97,7 @@ public class SQLPlugin extends Plugin implements ActionPlugin, ScriptPlugin { */ private org.opensearch.sql.common.setting.Settings pluginSettings; private NodeClient client; - private DataSourceService dataSourceService; + private 
DataSourceServiceImpl dataSourceService; private Injector injector; public String name() { @@ -136,7 +142,13 @@ public List getRestHandlers( new ActionType<>(PPLQueryAction.NAME, TransportPPLQueryResponse::new), TransportPPLQueryAction.class), new ActionHandler<>(new ActionType<>(TransportCreateDataSourceAction.NAME, - CreateDataSourceActionResponse::new), TransportCreateDataSourceAction.class)); + CreateDataSourceActionResponse::new), TransportCreateDataSourceAction.class), + new ActionHandler<>(new ActionType<>(TransportGetDataSourceAction.NAME, + GetDataSourceActionResponse::new), TransportGetDataSourceAction.class), + new ActionHandler<>(new ActionType<>(TransportUpdateDataSourceAction.NAME, + UpdateDataSourceActionResponse::new), TransportUpdateDataSourceAction.class), + new ActionHandler<>(new ActionType<>(TransportDeleteDataSourceAction.NAME, + DeleteDataSourceActionResponse::new), TransportDeleteDataSourceAction.class)); } @Override @@ -155,22 +167,7 @@ public Collection createComponents( this.clusterService = clusterService; this.pluginSettings = new OpenSearchSettings(clusterService.getClusterSettings()); this.client = (NodeClient) client; - String masterKey = DataSourceSettings - .DATASOURCE_MASTER_SECRET_KEY.get(clusterService.getSettings()); - DataSourceMetadataStorage dataSourceMetadataStorage - = new OpenSearchDataSourceMetadataStorage(client, clusterService, - new EncryptorImpl(masterKey)); - DataSourceUserAuthorizationHelper dataSourceUserAuthorizationHelper - = new DataSourceUserAuthorizationHelperImpl(client); - this.dataSourceService = - new DataSourceServiceImpl( - new ImmutableSet.Builder() - .add(new OpenSearchDataSourceFactory( - new OpenSearchNodeClient(this.client), pluginSettings)) - .add(new PrometheusStorageFactory()) - .build(), - dataSourceMetadataStorage, - dataSourceUserAuthorizationHelper); + this.dataSourceService = createDataSourceService(); dataSourceService.createDataSource(defaultOpenSearchDataSourceMetadata()); 
LocalClusterState.state().setClusterService(clusterService); LocalClusterState.state().setPluginSettings((OpenSearchSettings) pluginSettings); @@ -213,4 +210,22 @@ public ScriptEngine getScriptEngine(Settings settings, Collection() + .add(new OpenSearchDataSourceFactory( + new OpenSearchNodeClient(this.client), pluginSettings)) + .add(new PrometheusStorageFactory()) + .build(), + dataSourceMetadataStorage, + dataSourceUserAuthorizationHelper); + } + } diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/rest/RestDataSourceQueryAction.java b/plugin/src/main/java/org/opensearch/sql/plugin/rest/RestDataSourceQueryAction.java deleted file mode 100644 index 7314362e396..00000000000 --- a/plugin/src/main/java/org/opensearch/sql/plugin/rest/RestDataSourceQueryAction.java +++ /dev/null @@ -1,131 +0,0 @@ -/* - * - * * Copyright OpenSearch Contributors - * * SPDX-License-Identifier: Apache-2.0 - * - */ - -package org.opensearch.sql.plugin.rest; - -import static org.opensearch.rest.RestRequest.Method.POST; -import static org.opensearch.rest.RestStatus.BAD_REQUEST; -import static org.opensearch.rest.RestStatus.SERVICE_UNAVAILABLE; -import static org.opensearch.sql.plugin.utils.Scheduler.schedule; - -import com.google.common.collect.ImmutableList; -import java.io.IOException; -import java.util.List; -import org.apache.logging.log4j.LogManager; -import org.apache.logging.log4j.Logger; -import org.opensearch.action.ActionListener; -import org.opensearch.client.node.NodeClient; -import org.opensearch.index.IndexNotFoundException; -import org.opensearch.rest.BaseRestHandler; -import org.opensearch.rest.BytesRestResponse; -import org.opensearch.rest.RestChannel; -import org.opensearch.rest.RestRequest; -import org.opensearch.rest.RestStatus; -import org.opensearch.sql.datasource.model.DataSourceMetadata; -import org.opensearch.sql.legacy.metrics.MetricName; -import org.opensearch.sql.legacy.metrics.Metrics; -import 
org.opensearch.sql.opensearch.response.error.ErrorMessageFactory; -import org.opensearch.sql.plugin.model.CreateDataSourceActionRequest; -import org.opensearch.sql.plugin.model.CreateDataSourceActionResponse; -import org.opensearch.sql.plugin.transport.datasource.TransportCreateDataSourceAction; -import org.opensearch.sql.plugin.utils.XContentParserUtils; - -public class RestDataSourceQueryAction extends BaseRestHandler { - - public static final String DATASOURCE_ACTIONS = "datasource_actions"; - public static final String BASE_DATASOURCE_ACTION_URL = "/_plugins/_query/_datasources"; - - private static final Logger LOG = LogManager.getLogger(RestDataSourceQueryAction.class); - - @Override - public String getName() { - return DATASOURCE_ACTIONS; - } - - @Override - public List routes() { - return ImmutableList.of( - - /* - * - * Create a new datasource. - * Request URL: POST - * Request body: - * Ref [org.opensearch.sql.plugin.transport.datasource.model.CreateDataSourceActionRequest] - * Response body: - * Ref [org.opensearch.sql.plugin.transport.datasource.model.CreateDataSourceActionResponse] - */ - new Route(POST, BASE_DATASOURCE_ACTION_URL) - ); - } - - @Override - protected RestChannelConsumer prepareRequest(RestRequest restRequest, NodeClient nodeClient) - throws IOException { - switch (restRequest.method()) { - case POST: - return executePostRequest(restRequest, nodeClient); - default: - return restChannel - -> restChannel.sendResponse(new BytesRestResponse(RestStatus.METHOD_NOT_ALLOWED, - String.valueOf(restRequest.method()))); - } - } - - private RestChannelConsumer executePostRequest(RestRequest restRequest, - NodeClient nodeClient) throws IOException { - - DataSourceMetadata dataSourceMetadata - = XContentParserUtils.toDataSourceMetadata(restRequest.contentParser()); - return restChannel -> schedule(nodeClient, - () -> nodeClient.execute(TransportCreateDataSourceAction.ACTION_TYPE, - new CreateDataSourceActionRequest(dataSourceMetadata), - new 
ActionListener<>() { - @Override - public void onResponse( - CreateDataSourceActionResponse createDataSourceActionResponse) { - restChannel.sendResponse( - new BytesRestResponse(RestStatus.OK, "application/json; charset=UTF-8", - createDataSourceActionResponse.getResult())); - } - - @Override - public void onFailure(Exception e) { - if (e instanceof IllegalAccessException) { - reportError(restChannel, e, BAD_REQUEST); - } else { - LOG.error("Error happened during query handling", e); - if (isClientError(e)) { - Metrics.getInstance() - .getNumericalMetric(MetricName.DATASOURCE_FAILED_REQ_COUNT_CUS) - .increment(); - reportError(restChannel, e, BAD_REQUEST); - } else { - Metrics.getInstance() - .getNumericalMetric(MetricName.DATASOURCE_FAILED_REQ_COUNT_SYS) - .increment(); - reportError(restChannel, e, SERVICE_UNAVAILABLE); - } - } - } - })); - } - - private void reportError(final RestChannel channel, final Exception e, final RestStatus status) { - channel.sendResponse( - new BytesRestResponse( - status, ErrorMessageFactory.createErrorMessage(e, status.getStatus()).toString())); - } - - private static boolean isClientError(Exception e) { - return e instanceof NullPointerException - // NPE is hard to differentiate but more likely caused by bad query - || e instanceof IllegalArgumentException - || e instanceof IndexNotFoundException; - } - -} \ No newline at end of file diff --git a/plugin/src/main/java/org/opensearch/sql/plugin/transport/TransportPPLQueryAction.java b/plugin/src/main/java/org/opensearch/sql/plugin/transport/TransportPPLQueryAction.java index a67e077ecc5..acac65bd54f 100644 --- a/plugin/src/main/java/org/opensearch/sql/plugin/transport/TransportPPLQueryAction.java +++ b/plugin/src/main/java/org/opensearch/sql/plugin/transport/TransportPPLQueryAction.java @@ -21,7 +21,7 @@ import org.opensearch.sql.common.response.ResponseListener; import org.opensearch.sql.common.utils.QueryContext; import org.opensearch.sql.datasource.DataSourceService; -import 
org.opensearch.sql.datasource.DataSourceServiceImpl; +import org.opensearch.sql.datasources.service.DataSourceServiceImpl; import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.legacy.metrics.MetricName; import org.opensearch.sql.legacy.metrics.Metrics; diff --git a/plugin/src/test/java/org/opensearch/sql/plugin/datasource/OpenSearchDataSourceMetadataStorageTest.java b/plugin/src/test/java/org/opensearch/sql/plugin/datasource/OpenSearchDataSourceMetadataStorageTest.java deleted file mode 100644 index 140d4e0edd4..00000000000 --- a/plugin/src/test/java/org/opensearch/sql/plugin/datasource/OpenSearchDataSourceMetadataStorageTest.java +++ /dev/null @@ -1,220 +0,0 @@ -/* - * Copyright OpenSearch Contributors - * SPDX-License-Identifier: Apache-2.0 - */ - -package org.opensearch.sql.plugin.datasource; - -import static org.junit.Assert.assertEquals; -import static org.junit.Assert.assertFalse; -import static org.junit.Assert.assertThrows; -import static org.mockito.ArgumentMatchers.any; -import static org.mockito.Mockito.times; -import static org.mockito.Mockito.verify; -import static org.mockito.Mockito.when; -import static org.opensearch.sql.plugin.datasource.OpenSearchDataSourceMetadataStorage.DATASOURCE_INDEX_NAME; - -import com.fasterxml.jackson.core.JsonProcessingException; -import com.fasterxml.jackson.databind.ObjectMapper; -import java.util.Collections; -import java.util.HashMap; -import java.util.Map; -import java.util.Optional; -import lombok.SneakyThrows; -import org.apache.lucene.search.TotalHits; -import org.junit.Test; -import org.junit.runner.RunWith; -import org.mockito.Answers; -import org.mockito.InjectMocks; -import org.mockito.Mock; -import org.mockito.junit.MockitoJUnitRunner; -import org.opensearch.action.ActionFuture; -import org.opensearch.action.admin.indices.create.CreateIndexResponse; -import org.opensearch.action.search.SearchResponse; -import org.opensearch.client.Client; -import 
org.opensearch.cluster.service.ClusterService; -import org.opensearch.rest.RestStatus; -import org.opensearch.search.SearchHit; -import org.opensearch.search.SearchHits; -import org.opensearch.sql.common.encryptor.Encryptor; -import org.opensearch.sql.datasource.model.DataSourceMetadata; -import org.opensearch.sql.datasource.model.DataSourceType; - -@RunWith(MockitoJUnitRunner.class) -public class OpenSearchDataSourceMetadataStorageTest { - - private static final String TEST_DATASOURCE_INDEX_NAME = "testDS"; - - @Mock(answer = Answers.RETURNS_DEEP_STUBS) - private Client client; - @Mock(answer = Answers.RETURNS_DEEP_STUBS) - private ClusterService clusterService; - @Mock - private Encryptor encryptor; - @Mock(answer = Answers.RETURNS_DEEP_STUBS) - private SearchResponse searchResponse; - @Mock - private ActionFuture searchResponseActionFuture; - @Mock - private ActionFuture createIndexResponseActionFuture; - @Mock - private SearchHit searchHit; - @InjectMocks - private OpenSearchDataSourceMetadataStorage openSearchDataSourceMetadataStorage; - - - @SneakyThrows - @Test - public void testGetDataSourceMetadata() { - when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) - .thenReturn(true); - when(client.search(any())).thenReturn(searchResponseActionFuture); - when(searchResponseActionFuture.actionGet()).thenReturn(searchResponse); - when(searchResponse.status()).thenReturn(RestStatus.OK); - when(searchResponse.getHits()) - .thenReturn( - new SearchHits( - new SearchHit[] {searchHit}, - new TotalHits(21, TotalHits.Relation.EQUAL_TO), - 1.0F)); - when(searchHit.getSourceAsString()) - .thenReturn(getBasicDataSourceMetadataString()); - when(encryptor.decrypt("password")).thenReturn("password"); - when(encryptor.decrypt("username")).thenReturn("username"); - - Optional dataSourceMetadataOptional - = openSearchDataSourceMetadataStorage.getDataSourceMetadata(TEST_DATASOURCE_INDEX_NAME); - - - assertFalse(dataSourceMetadataOptional.isEmpty()); - 
DataSourceMetadata dataSourceMetadata = dataSourceMetadataOptional.get(); - assertEquals(TEST_DATASOURCE_INDEX_NAME, dataSourceMetadata.getName()); - assertEquals(DataSourceType.PROMETHEUS, dataSourceMetadata.getConnector()); - assertEquals("password", - dataSourceMetadata.getProperties().get("prometheus.auth.password")); - assertEquals("username", - dataSourceMetadata.getProperties().get("prometheus.auth.username")); - assertEquals("basicauth", - dataSourceMetadata.getProperties().get("prometheus.auth.type")); - } - - @SneakyThrows - @Test - public void testGetDataSourceMetadataWithAWSSigV4() { - when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) - .thenReturn(true); - when(client.search(any())).thenReturn(searchResponseActionFuture); - when(searchResponseActionFuture.actionGet()).thenReturn(searchResponse); - when(searchResponse.status()).thenReturn(RestStatus.OK); - when(searchResponse.getHits()) - .thenReturn( - new SearchHits( - new SearchHit[] {searchHit}, - new TotalHits(21, TotalHits.Relation.EQUAL_TO), - 1.0F)); - when(searchHit.getSourceAsString()) - .thenReturn(getAWSSigv4DataSourceMetadataString()); - when(encryptor.decrypt("secret_key")).thenReturn("secret_key"); - when(encryptor.decrypt("access_key")).thenReturn("access_key"); - - Optional dataSourceMetadataOptional - = openSearchDataSourceMetadataStorage.getDataSourceMetadata(TEST_DATASOURCE_INDEX_NAME); - - - assertFalse(dataSourceMetadataOptional.isEmpty()); - DataSourceMetadata dataSourceMetadata = dataSourceMetadataOptional.get(); - assertEquals(TEST_DATASOURCE_INDEX_NAME, dataSourceMetadata.getName()); - assertEquals(DataSourceType.PROMETHEUS, dataSourceMetadata.getConnector()); - assertEquals("secret_key", - dataSourceMetadata.getProperties().get("prometheus.auth.secret_key")); - assertEquals("access_key", - dataSourceMetadata.getProperties().get("prometheus.auth.access_key")); - assertEquals("awssigv4", - dataSourceMetadata.getProperties().get("prometheus.auth.type")); 
- } - - @Test - public void testCreateDataSourceMetadata() { - - when(clusterService.state().routingTable().hasIndex(DATASOURCE_INDEX_NAME)) - .thenReturn(Boolean.FALSE); - when(encryptor.encrypt("secret_key")).thenReturn("secret_key"); - when(encryptor.encrypt("access_key")).thenReturn("access_key"); - when(client.admin().indices().create(any())) - .thenReturn(createIndexResponseActionFuture); - when(createIndexResponseActionFuture.actionGet()) - .thenReturn(new CreateIndexResponse(true, true, DATASOURCE_INDEX_NAME)); - DataSourceMetadata dataSourceMetadata = getDataSourceMetadata(); - - this.openSearchDataSourceMetadataStorage.createDataSourceMetadata(dataSourceMetadata); - - verify(encryptor, times(1)).encrypt("secret_key"); - verify(encryptor, times(1)).encrypt("access_key"); - verify(client.admin().indices(), times(1)).create(any()); - verify(client, times(1)).index(any()); - verify(client.threadPool().getThreadContext(), times(2)).stashContext(); - - - } - - @Test - public void testUpdateDataSourceMetadata() { - assertThrows( - UnsupportedOperationException.class, - () -> openSearchDataSourceMetadataStorage - .updateDataSourceMetadata(new DataSourceMetadata())); - } - - @Test - public void testDeleteDataSourceMetadata() { - assertThrows( - UnsupportedOperationException.class, - () -> openSearchDataSourceMetadataStorage - .deleteDataSourceMetadata(TEST_DATASOURCE_INDEX_NAME)); - } - - private String getBasicDataSourceMetadataString() throws JsonProcessingException { - DataSourceMetadata dataSourceMetadata = new DataSourceMetadata(); - dataSourceMetadata.setName("testDS"); - dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS); - dataSourceMetadata.setAllowedRoles(Collections.singletonList("prometheus_access")); - Map properties = new HashMap<>(); - properties.put("prometheus.auth.type", "basicauth"); - properties.put("prometheus.auth.username", "username"); - properties.put("prometheus.auth.uri", "https://localhost:9090"); - 
properties.put("prometheus.auth.password", "password"); - dataSourceMetadata.setProperties(properties); - ObjectMapper objectMapper = new ObjectMapper(); - return objectMapper.writeValueAsString(dataSourceMetadata); - } - - private String getAWSSigv4DataSourceMetadataString() throws JsonProcessingException { - DataSourceMetadata dataSourceMetadata = new DataSourceMetadata(); - dataSourceMetadata.setName("testDS"); - dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS); - dataSourceMetadata.setAllowedRoles(Collections.singletonList("prometheus_access")); - Map properties = new HashMap<>(); - properties.put("prometheus.auth.type", "awssigv4"); - properties.put("prometheus.auth.secret_key", "secret_key"); - properties.put("prometheus.auth.uri", "https://localhost:9090"); - properties.put("prometheus.auth.access_key", "access_key"); - dataSourceMetadata.setProperties(properties); - ObjectMapper objectMapper = new ObjectMapper(); - return objectMapper.writeValueAsString(dataSourceMetadata); - } - - private DataSourceMetadata getDataSourceMetadata() { - DataSourceMetadata dataSourceMetadata = new DataSourceMetadata(); - dataSourceMetadata.setName("testDS"); - dataSourceMetadata.setConnector(DataSourceType.PROMETHEUS); - dataSourceMetadata.setAllowedRoles(Collections.singletonList("prometheus_access")); - Map properties = new HashMap<>(); - properties.put("prometheus.auth.type", "awssigv4"); - properties.put("prometheus.auth.secret_key", "secret_key"); - properties.put("prometheus.auth.uri", "https://localhost:9090"); - properties.put("prometheus.auth.access_key", "access_key"); - dataSourceMetadata.setProperties(properties); - return dataSourceMetadata; - } - -} diff --git a/prometheus/build.gradle b/prometheus/build.gradle index ca70813e58b..b0c05f1bc83 100644 --- a/prometheus/build.gradle +++ b/prometheus/build.gradle @@ -16,6 +16,7 @@ repositories { dependencies { api project(':core') + implementation project(':datasources') implementation 
"io.github.resilience4j:resilience4j-retry:1.5.0" implementation group: 'com.fasterxml.jackson.core', name: 'jackson-core', version: "${versions.jackson}" implementation group: 'com.fasterxml.jackson.core', name: 'jackson-databind', version: "${versions.jackson_databind}" diff --git a/prometheus/src/main/java/org/opensearch/sql/prometheus/storage/PrometheusStorageFactory.java b/prometheus/src/main/java/org/opensearch/sql/prometheus/storage/PrometheusStorageFactory.java index dbc753f1f54..4a0f52f4a58 100644 --- a/prometheus/src/main/java/org/opensearch/sql/prometheus/storage/PrometheusStorageFactory.java +++ b/prometheus/src/main/java/org/opensearch/sql/prometheus/storage/PrometheusStorageFactory.java @@ -21,7 +21,7 @@ import org.opensearch.sql.datasource.model.DataSource; import org.opensearch.sql.datasource.model.DataSourceMetadata; import org.opensearch.sql.datasource.model.DataSourceType; -import org.opensearch.sql.datasource.model.auth.AuthenticationType; +import org.opensearch.sql.datasources.auth.AuthenticationType; import org.opensearch.sql.prometheus.authinterceptors.AwsSigningInterceptor; import org.opensearch.sql.prometheus.authinterceptors.BasicAuthenticationInterceptor; import org.opensearch.sql.prometheus.client.PrometheusClient; @@ -39,6 +39,8 @@ public class PrometheusStorageFactory implements DataSourceFactory { public static final String ACCESS_KEY = "prometheus.auth.access_key"; public static final String SECRET_KEY = "prometheus.auth.secret_key"; + private static final Integer MAX_LENGTH_FOR_CONFIG_PROPERTY = 1000; + @Override public DataSourceType getDataSourceType() { return DataSourceType.PROMETHEUS; @@ -52,8 +54,24 @@ public DataSource createDataSource(DataSourceMetadata metadata) { getStorageEngine(metadata.getName(), metadata.getProperties())); } + + private void validateDataSourceConfigProperties(Map dataSourceMetadataConfig) { + if (dataSourceMetadataConfig.get(AUTH_TYPE) != null) { + AuthenticationType authenticationType + = 
AuthenticationType.get(dataSourceMetadataConfig.get(AUTH_TYPE)); + if (AuthenticationType.BASICAUTH.equals(authenticationType)) { + validateFields(dataSourceMetadataConfig, Set.of(URI, USERNAME, PASSWORD)); + } else if (AuthenticationType.AWSSIGV4AUTH.equals(authenticationType)) { + validateFields(dataSourceMetadataConfig, Set.of(URI, ACCESS_KEY, SECRET_KEY, + REGION)); + } + } else { + validateFields(dataSourceMetadataConfig, Set.of(URI)); + } + } + StorageEngine getStorageEngine(String catalogName, Map requiredConfig) { - validateFieldsInConfig(requiredConfig, Set.of(URI)); + validateDataSourceConfigProperties(requiredConfig); PrometheusClient prometheusClient; prometheusClient = AccessController.doPrivileged((PrivilegedAction) () -> { @@ -76,11 +94,9 @@ private OkHttpClient getHttpClient(Map config) { if (config.get(AUTH_TYPE) != null) { AuthenticationType authenticationType = AuthenticationType.get(config.get(AUTH_TYPE)); if (AuthenticationType.BASICAUTH.equals(authenticationType)) { - validateFieldsInConfig(config, Set.of(USERNAME, PASSWORD)); okHttpClient.addInterceptor(new BasicAuthenticationInterceptor(config.get(USERNAME), config.get(PASSWORD))); } else if (AuthenticationType.AWSSIGV4AUTH.equals(authenticationType)) { - validateFieldsInConfig(config, Set.of(REGION, ACCESS_KEY, SECRET_KEY)); okHttpClient.addInterceptor(new AwsSigningInterceptor( new AWSStaticCredentialsProvider( new BasicAWSCredentials(config.get(ACCESS_KEY), config.get(SECRET_KEY))), @@ -94,17 +110,29 @@ private OkHttpClient getHttpClient(Map config) { return okHttpClient.build(); } - private void validateFieldsInConfig(Map config, Set fields) { + private void validateFields(Map config, Set fields) { Set missingFields = new HashSet<>(); + Set invalidLengthFields = new HashSet<>(); for (String field : fields) { if (!config.containsKey(field)) { missingFields.add(field); + } else if (config.get(field).length() > MAX_LENGTH_FOR_CONFIG_PROPERTY) { + invalidLengthFields.add(field); } } + 
StringBuilder errorStringBuilder = new StringBuilder(); if (missingFields.size() > 0) { - throw new IllegalArgumentException(String.format( + errorStringBuilder.append(String.format( "Missing %s fields in the Prometheus connector properties.", missingFields)); } + + if (invalidLengthFields.size() > 0) { + errorStringBuilder.append(String.format( + "Fields %s exceed 1000 characters.", invalidLengthFields)); + } + if (errorStringBuilder.length() > 0) { + throw new IllegalArgumentException(errorStringBuilder.toString()); + } } diff --git a/prometheus/src/test/java/org/opensearch/sql/prometheus/storage/PrometheusStorageFactoryTest.java b/prometheus/src/test/java/org/opensearch/sql/prometheus/storage/PrometheusStorageFactoryTest.java index 91cb8df1ea7..36f7e5b5f13 100644 --- a/prometheus/src/test/java/org/opensearch/sql/prometheus/storage/PrometheusStorageFactoryTest.java +++ b/prometheus/src/test/java/org/opensearch/sql/prometheus/storage/PrometheusStorageFactoryTest.java @@ -9,6 +9,7 @@ import java.util.HashMap; import lombok.SneakyThrows; +import org.apache.commons.lang3.RandomStringUtils; import org.junit.jupiter.api.Assertions; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; @@ -90,6 +91,24 @@ void testGetStorageEngineWithMissingRegionInAWS() { exception.getMessage()); } + + @Test + @SneakyThrows + void testGetStorageEngineWithLongConfigProperties() { + PrometheusStorageFactory prometheusStorageFactory = new PrometheusStorageFactory(); + HashMap<String, String> properties = new HashMap<>(); + properties.put("prometheus.uri", RandomStringUtils.random(1001)); + properties.put("prometheus.auth.type", "awssigv4"); + properties.put("prometheus.auth.secret_key", "accessKey"); + properties.put("prometheus.auth.access_key", "secretKey"); + IllegalArgumentException exception = Assertions.assertThrows(IllegalArgumentException.class, + () -> prometheusStorageFactory.getStorageEngine("my_prometheus", properties)); + 
Assertions.assertEquals("Missing [prometheus.auth.region] fields in the " + + "Prometheus connector properties." + + "Fields [prometheus.uri] exceed 1000 characters.", + exception.getMessage()); + } + @Test @SneakyThrows void testGetStorageEngineWithWrongAuthType() { diff --git a/settings.gradle b/settings.gradle index 7def8a746c3..6f7214cb3ad 100644 --- a/settings.gradle +++ b/settings.gradle @@ -19,3 +19,4 @@ include 'legacy' include 'sql' include 'prometheus' include 'benchmarks' +include 'datasources' \ No newline at end of file diff --git a/sql/src/main/antlr/OpenSearchSQLLexer.g4 b/sql/src/main/antlr/OpenSearchSQLLexer.g4 index 616bfa8a79c..b65f60e2899 100644 --- a/sql/src/main/antlr/OpenSearchSQLLexer.g4 +++ b/sql/src/main/antlr/OpenSearchSQLLexer.g4 @@ -134,7 +134,6 @@ STDDEV_SAMP: 'STDDEV_SAMP'; SUBSTRING: 'SUBSTRING'; TRIM: 'TRIM'; - // Keywords, but can be ID // Common Keywords, but can be ID @@ -328,6 +327,8 @@ REVERSE_NESTED: 'REVERSE_NESTED'; QUERY: 'QUERY'; RANGE: 'RANGE'; SCORE: 'SCORE'; +SCOREQUERY: 'SCOREQUERY'; +SCORE_QUERY: 'SCORE_QUERY'; SECOND_OF_MINUTE: 'SECOND_OF_MINUTE'; STATS: 'STATS'; TERM: 'TERM'; @@ -465,7 +466,6 @@ BACKTICK_QUOTE_ID: BQUOTA_STRING; // Fragments for Literal primitives fragment EXPONENT_NUM_PART: 'E' [-+]? DEC_DIGIT+; -fragment ID_LITERAL: [@*A-Z]+?[*A-Z_\-0-9]*; fragment DQUOTA_STRING: '"' ( '\\'. | '""' | ~('"'| '\\') )* '"'; fragment SQUOTA_STRING: '\'' ('\\'. | '\'\'' | ~('\'' | '\\'))* '\''; fragment BQUOTA_STRING: '`' ( '\\'. | '``' | ~('`'|'\\'))* '`'; @@ -473,6 +473,10 @@ fragment HEX_DIGIT: [0-9A-F]; fragment DEC_DIGIT: [0-9]; fragment BIT_STRING_L: 'B' '\'' [01]+ '\''; +// Identifiers cannot start with a single '_' since this is an OpenSearch reserved +// metadata field. Two underscores (or more) are acceptable, such as '__field'. +fragment ID_LITERAL: ([@*A-Z_])+?[*A-Z_\-0-9]*; + // Last tokens must generate Errors ERROR_RECOGNITION: . 
-> channel(ERRORCHANNEL); diff --git a/sql/src/main/antlr/OpenSearchSQLParser.g4 b/sql/src/main/antlr/OpenSearchSQLParser.g4 index cf2aa392de6..ebc2b8747ed 100644 --- a/sql/src/main/antlr/OpenSearchSQLParser.g4 +++ b/sql/src/main/antlr/OpenSearchSQLParser.g4 @@ -306,6 +306,7 @@ functionCall | windowFunctionClause #windowFunctionCall | aggregateFunction #aggregateFunctionCall | aggregateFunction (orderByClause)? filterClause #filteredAggregationFunctionCall + | scoreRelevanceFunction #scoreRelevanceFunctionCall | relevanceFunction #relevanceFunctionCall | highlightFunction #highlightFunctionCall | positionFunction #positionFunctionCall @@ -387,6 +388,7 @@ scalarFunctionName | textFunctionName | flowControlFunctionName | systemFunctionName + | nestedFunctionName ; specificFunction @@ -399,7 +401,10 @@ specificFunction relevanceFunction : noFieldRelevanceFunction | singleFieldRelevanceFunction | multiFieldRelevanceFunction | altSingleFieldRelevanceFunction | altMultiFieldRelevanceFunction + ; +scoreRelevanceFunction + : scoreRelevanceFunctionName LR_BRACKET relevanceFunction (COMMA weight=relevanceFieldWeight)? 
RR_BRACKET ; noFieldRelevanceFunction @@ -557,6 +562,14 @@ systemFunctionName : TYPEOF ; +nestedFunctionName + : NESTED + ; + +scoreRelevanceFunctionName + : SCORE | SCOREQUERY | SCORE_QUERY + ; + singleFieldRelevanceFunctionName : MATCH | MATCHQUERY | MATCH_QUERY | MATCH_PHRASE | MATCHPHRASE | MATCHPHRASEQUERY diff --git a/sql/src/main/java/org/opensearch/sql/sql/parser/AstExpressionBuilder.java b/sql/src/main/java/org/opensearch/sql/sql/parser/AstExpressionBuilder.java index 71a5c6c4874..bad0543e02c 100644 --- a/sql/src/main/java/org/opensearch/sql/sql/parser/AstExpressionBuilder.java +++ b/sql/src/main/java/org/opensearch/sql/sql/parser/AstExpressionBuilder.java @@ -53,6 +53,7 @@ import static org.opensearch.sql.sql.antlr.parser.OpenSearchSQLParser.RelevanceFieldAndWeightContext; import static org.opensearch.sql.sql.antlr.parser.OpenSearchSQLParser.ScalarFunctionCallContext; import static org.opensearch.sql.sql.antlr.parser.OpenSearchSQLParser.ScalarWindowFunctionContext; +import static org.opensearch.sql.sql.antlr.parser.OpenSearchSQLParser.ScoreRelevanceFunctionContext; import static org.opensearch.sql.sql.antlr.parser.OpenSearchSQLParser.ShowDescribePatternContext; import static org.opensearch.sql.sql.antlr.parser.OpenSearchSQLParser.SignedDecimalContext; import static org.opensearch.sql.sql.antlr.parser.OpenSearchSQLParser.SignedRealContext; @@ -93,6 +94,7 @@ import org.opensearch.sql.ast.expression.Or; import org.opensearch.sql.ast.expression.QualifiedName; import org.opensearch.sql.ast.expression.RelevanceFieldList; +import org.opensearch.sql.ast.expression.ScoreFunction; import org.opensearch.sql.ast.expression.UnresolvedArgument; import org.opensearch.sql.ast.expression.UnresolvedExpression; import org.opensearch.sql.ast.expression.When; @@ -188,7 +190,7 @@ public UnresolvedExpression visitPositionFunction( return new Function( POSITION.getName().getFunctionName(), Arrays.asList(visitFunctionArg(ctx.functionArg(0)), - 
visitFunctionArg(ctx.functionArg(1)))); + visitFunctionArg(ctx.functionArg(1)))); } @Override @@ -466,7 +468,7 @@ public UnresolvedExpression visitMultiFieldRelevanceFunction( if ((funcName.equalsIgnoreCase(BuiltinFunctionName.MULTI_MATCH.toString()) || funcName.equalsIgnoreCase(BuiltinFunctionName.MULTIMATCH.toString()) || funcName.equalsIgnoreCase(BuiltinFunctionName.MULTIMATCHQUERY.toString())) - && ! ctx.getRuleContexts(AlternateMultiMatchQueryContext.class) + && !ctx.getRuleContexts(AlternateMultiMatchQueryContext.class) .isEmpty()) { return new Function( ctx.multiFieldRelevanceFunctionName().getText().toLowerCase(), @@ -486,6 +488,20 @@ public UnresolvedExpression visitAltMultiFieldRelevanceFunction( altMultiFieldRelevanceFunctionArguments(ctx)); } + /** + * Visit score-relevance function and collect children. + * + * @param ctx the parse tree + * @return children + */ + public UnresolvedExpression visitScoreRelevanceFunction(ScoreRelevanceFunctionContext ctx) { + Literal weight = + ctx.weight == null + ? new Literal(Double.valueOf(1.0), DataType.DOUBLE) + : new Literal(Double.parseDouble(ctx.weight.getText()), DataType.DOUBLE); + return new ScoreFunction(visit(ctx.relevanceFunction()), weight); + } + private Function buildFunction(String functionName, List arg) { return new Function( @@ -510,8 +526,7 @@ private QualifiedName visitIdentifiers(List identifiers) { identifiers.stream() .map(RuleContext::getText) .map(StringUtils::unquoteIdentifier) - .collect(Collectors.toList()) - ); + .collect(Collectors.toList())); } private void fillRelevanceArgs(List args, @@ -605,6 +620,7 @@ private List timestampFunctionArguments( /** * Adds support for multi_match alternate syntax like * MULTI_MATCH('query'='Dale', 'fields'='*name'). + * * @param ctx : Context for multi field relevance function. * @return : Returns list of all arguments for relevance function. 
*/ @@ -617,7 +633,7 @@ private List alternateMultiMatchArguments( String[] fieldAndWeights = StringUtils.unquoteText( ctx.getRuleContexts(AlternateMultiMatchFieldContext.class) - .stream().findFirst().get().argVal.getText()).split(","); + .stream().findFirst().get().argVal.getText()).split(","); for (var fieldAndWeight : fieldAndWeights) { String[] splitFieldAndWeights = fieldAndWeight.split("\\^"); @@ -629,9 +645,10 @@ private List alternateMultiMatchArguments( ctx.getRuleContexts(AlternateMultiMatchQueryContext.class) .stream().findFirst().ifPresent( - arg -> - builder.add(new UnresolvedArgument("query", - new Literal(StringUtils.unquoteText(arg.argVal.getText()), DataType.STRING))) + arg -> + builder.add(new UnresolvedArgument("query", + new Literal( + StringUtils.unquoteText(arg.argVal.getText()), DataType.STRING))) ); fillRelevanceArgs(ctx.relevanceArg(), builder); diff --git a/sql/src/test/java/org/opensearch/sql/sql/antlr/SQLSyntaxParserTest.java b/sql/src/test/java/org/opensearch/sql/sql/antlr/SQLSyntaxParserTest.java index 5912a76f28d..39fe8811b5c 100644 --- a/sql/src/test/java/org/opensearch/sql/sql/antlr/SQLSyntaxParserTest.java +++ b/sql/src/test/java/org/opensearch/sql/sql/antlr/SQLSyntaxParserTest.java @@ -635,6 +635,18 @@ public void can_parse_wildcard_query_relevance_function() { + "boost=1.5, case_insensitive=true, rewrite=\"scoring_boolean\")")); } + @Test + public void can_parse_nested_function() { + assertNotNull( + parser.parse("SELECT NESTED(FIELD.DAYOFWEEK) FROM TEST")); + assertNotNull( + parser.parse("SELECT NESTED('FIELD.DAYOFWEEK') FROM TEST")); + assertNotNull( + parser.parse("SELECT SUM(NESTED(FIELD.SUBFIELD)) FROM TEST")); + assertNotNull( + parser.parse("SELECT NESTED(FIELD.DAYOFWEEK, PATH) FROM TEST")); + } + @Test public void can_parse_yearweek_function() { assertNotNull(parser.parse("SELECT yearweek('1987-01-01')")); diff --git a/sql/src/test/java/org/opensearch/sql/sql/parser/AstBuilderTest.java 
b/sql/src/test/java/org/opensearch/sql/sql/parser/AstBuilderTest.java index 64a7445dc85..2f19fc1f3fa 100644 --- a/sql/src/test/java/org/opensearch/sql/sql/parser/AstBuilderTest.java +++ b/sql/src/test/java/org/opensearch/sql/sql/parser/AstBuilderTest.java @@ -33,21 +33,13 @@ import com.google.common.collect.ImmutableList; import java.util.HashMap; -import java.util.List; import java.util.Map; -import java.util.stream.Stream; -import org.antlr.v4.runtime.tree.ParseTree; import org.junit.jupiter.api.Test; -import org.junit.jupiter.params.ParameterizedTest; -import org.junit.jupiter.params.provider.Arguments; -import org.junit.jupiter.params.provider.MethodSource; import org.opensearch.sql.ast.dsl.AstDSL; import org.opensearch.sql.ast.expression.AllFields; import org.opensearch.sql.ast.expression.DataType; import org.opensearch.sql.ast.expression.Literal; -import org.opensearch.sql.ast.tree.UnresolvedPlan; import org.opensearch.sql.common.antlr.SyntaxCheckException; -import org.opensearch.sql.sql.antlr.SQLSyntaxParser; class AstBuilderTest extends AstBuilderTestBase { @@ -696,5 +688,4 @@ public void can_build_string_literal_highlight() { buildAST("SELECT highlight(\"fieldA\") FROM test") ); } - } diff --git a/sql/src/test/java/org/opensearch/sql/sql/parser/AstExpressionBuilderTest.java b/sql/src/test/java/org/opensearch/sql/sql/parser/AstExpressionBuilderTest.java index 52dd5e35721..20655bc0202 100644 --- a/sql/src/test/java/org/opensearch/sql/sql/parser/AstExpressionBuilderTest.java +++ b/sql/src/test/java/org/opensearch/sql/sql/parser/AstExpressionBuilderTest.java @@ -36,6 +36,7 @@ import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; import java.util.HashMap; +import java.util.stream.Stream; import org.antlr.v4.runtime.CommonTokenStream; import org.apache.commons.lang3.tuple.ImmutablePair; import org.junit.jupiter.api.Test; @@ -463,6 +464,26 @@ public void canBuildKeywordsAsIdentInQualifiedName() { ); } + @Test + public void 
canBuildMetaDataFieldAsQualifiedName() { + Stream.of("_id", "_index", "_sort", "_score", "_maxscore").forEach( + field -> assertEquals( + qualifiedName(field), + buildExprAst(field) + ) + ); + } + + @Test + public void canBuildNonMetaDataFieldAsQualifiedName() { + Stream.of("id", "__id", "_routing", "___field").forEach( + field -> assertEquals( + qualifiedName(field), + buildExprAst(field) + ) + ); + } + @Test public void canCastFieldAsString() { assertEquals( @@ -798,6 +819,36 @@ public void relevanceWildcard_query() { ); } + @Test + public void relevanceScore_query() { + assertEquals( + AstDSL.score( + AstDSL.function("query_string", + unresolvedArg("fields", new RelevanceFieldList(ImmutableMap.of( + "field1", 1.F, "field2", 3.2F))), + unresolvedArg("query", stringLiteral("search query")) + ), + AstDSL.doubleLiteral(1.0) + ), + buildExprAst("score(query_string(['field1', 'field2' ^ 3.2], 'search query'))") + ); + } + + @Test + public void relevanceScore_withBoost_query() { + assertEquals( + AstDSL.score( + AstDSL.function("query_string", + unresolvedArg("fields", new RelevanceFieldList(ImmutableMap.of( + "field1", 1.F, "field2", 3.2F))), + unresolvedArg("query", stringLiteral("search query")) + ), + doubleLiteral(1.0) + ), + buildExprAst("score(query_string(['field1', 'field2' ^ 3.2], 'search query'), 1.0)") + ); + } + @Test public void relevanceQuery() { assertEquals(AstDSL.function("query", From df54e79217af238763ec73bba50a6d9cc950f157 Mon Sep 17 00:00:00 2001 From: Yury-Fridlyand Date: Fri, 14 Apr 2023 20:09:39 -0700 Subject: [PATCH 10/17] Minor cleanup. 
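The `score(...)` tests above expect a boost weight of 1.0 even when no second argument is written, mirroring `visitScoreRelevanceFunction`, which substitutes a default literal when the optional `weight` token is absent. A minimal standalone sketch of that defaulting rule (class and method names are illustrative, not the plugin's API):

```java
// Sketch only: models "score(expr [, weight])" where a missing weight
// token falls back to the default boost of 1.0.
public class ScoreWeightDefault {

    // A null token stands in for the parser's "weight clause not present".
    public static double effectiveWeight(String weightToken) {
        return weightToken == null ? 1.0 : Double.parseDouble(weightToken);
    }

    public static void main(String[] args) {
        System.out.println(effectiveWeight(null));  // default boost: 1.0
        System.out.println(effectiveWeight("3.2")); // explicit boost: 3.2
    }
}
```

This is why `relevanceScore_query` and `relevanceScore_withBoost_query` assert the same AST apart from the weight literal's origin.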
Signed-off-by: Yury-Fridlyand --- .../sql/planner/optimizer/pattern/Patterns.java | 10 ---------- .../sql/planner/optimizer/pattern/PatternsTest.java | 10 ---------- 2 files changed, 20 deletions(-) diff --git a/core/src/main/java/org/opensearch/sql/planner/optimizer/pattern/Patterns.java b/core/src/main/java/org/opensearch/sql/planner/optimizer/pattern/Patterns.java index 0cb540743eb..c8f3e403fde 100644 --- a/core/src/main/java/org/opensearch/sql/planner/optimizer/pattern/Patterns.java +++ b/core/src/main/java/org/opensearch/sql/planner/optimizer/pattern/Patterns.java @@ -121,16 +121,6 @@ public static Property table() { : Optional.empty()); } - /** - * Logical pagination with page size. - */ - public static Property pagination() { - return Property.optionalProperty("pagination", - plan -> plan instanceof LogicalPaginate - ? Optional.of(((LogicalPaginate) plan).getPageSize()) - : Optional.empty()); - } - /** * Logical write with table field. */ diff --git a/core/src/test/java/org/opensearch/sql/planner/optimizer/pattern/PatternsTest.java b/core/src/test/java/org/opensearch/sql/planner/optimizer/pattern/PatternsTest.java index 1fd572e7daf..ef310e3b0e0 100644 --- a/core/src/test/java/org/opensearch/sql/planner/optimizer/pattern/PatternsTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/optimizer/pattern/PatternsTest.java @@ -41,14 +41,4 @@ void table_is_empty() { () -> assertFalse(Patterns.writeTable().getFunction().apply(plan).isPresent()) ); } - - @Test - void pagination() { - assertAll( - () -> assertTrue(Patterns.pagination().getFunction() - .apply(mock(LogicalPaginate.class)).isPresent()), - () -> assertFalse(Patterns.pagination().getFunction() - .apply(mock(LogicalFilter.class)).isPresent()) - ); - } } From ce509b0808fdd906c0c29f008566fddeb722a3f7 Mon Sep 17 00:00:00 2001 From: Yury-Fridlyand Date: Mon, 17 Apr 2023 10:25:36 -0700 Subject: [PATCH 11/17] Minor cleanup - missing changes for the previous commit. 
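The removed `Patterns.pagination()` above follows the optimizer's "optional property" idiom: a function from a plan node to `Optional.of(value)` when the node has the matching type, and `Optional.empty()` otherwise. A simplified, self-contained sketch of that idiom (the plan classes here are stand-ins, not the real `core` types):

```java
import java.util.Optional;
import java.util.function.Function;

// Sketch only: the pattern-property maps a LogicalPaginate to its page
// size and anything else to Optional.empty(), which is how the matcher
// decides whether a rule applies.
public class OptionalPropertySketch {
    interface LogicalPlan {}

    static class LogicalPaginate implements LogicalPlan {
        final int pageSize;
        LogicalPaginate(int pageSize) { this.pageSize = pageSize; }
    }

    static class LogicalFilter implements LogicalPlan {}

    static Function<LogicalPlan, Optional<Integer>> pagination() {
        return plan -> plan instanceof LogicalPaginate
            ? Optional.of(((LogicalPaginate) plan).pageSize)
            : Optional.empty();
    }

    public static void main(String[] args) {
        System.out.println(pagination().apply(new LogicalPaginate(10))); // Optional[10]
        System.out.println(pagination().apply(new LogicalFilter()));     // Optional.empty
    }
}
```

The corresponding `PatternsTest.pagination()` test deleted above exercised exactly these two branches.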
Signed-off-by: Yury-Fridlyand --- .../org/opensearch/sql/planner/optimizer/pattern/Patterns.java | 1 - 1 file changed, 1 deletion(-) diff --git a/core/src/main/java/org/opensearch/sql/planner/optimizer/pattern/Patterns.java b/core/src/main/java/org/opensearch/sql/planner/optimizer/pattern/Patterns.java index c8f3e403fde..8f5ac865807 100644 --- a/core/src/main/java/org/opensearch/sql/planner/optimizer/pattern/Patterns.java +++ b/core/src/main/java/org/opensearch/sql/planner/optimizer/pattern/Patterns.java @@ -17,7 +17,6 @@ import org.opensearch.sql.planner.logical.LogicalHighlight; import org.opensearch.sql.planner.logical.LogicalLimit; import org.opensearch.sql.planner.logical.LogicalNested; -import org.opensearch.sql.planner.logical.LogicalPaginate; import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.logical.LogicalProject; import org.opensearch.sql.planner.logical.LogicalRelation; From dd3df8c62af06d0db662454a0eeb54b23042ab93 Mon Sep 17 00:00:00 2001 From: Max Ksyunz Date: Mon, 17 Apr 2023 11:58:44 -0700 Subject: [PATCH 12/17] Remove paginate operator (#1528) * Remove PaginateOperator class since it is no longer used. 
--------- Signed-off-by: MaxKsyunz --- .../sql/planner/DefaultImplementor.java | 7 -- .../rule/CreatePagingTableScanBuilder.java | 2 +- .../planner/physical/PaginateOperator.java | 74 -------------- .../physical/PhysicalPlanNodeVisitor.java | 4 - .../sql/planner/DefaultImplementorTest.java | 8 -- .../optimizer/LogicalPlanOptimizerTest.java | 4 +- .../physical/PaginateOperatorTest.java | 99 ------------------- .../physical/PhysicalPlanNodeVisitorTest.java | 8 -- .../OpenSearchExecutionProtector.java | 7 -- .../OpenSearchExecutionEngineTest.java | 47 +-------- .../OpenSearchExecutionProtectorTest.java | 8 -- 11 files changed, 4 insertions(+), 264 deletions(-) delete mode 100644 core/src/main/java/org/opensearch/sql/planner/physical/PaginateOperator.java delete mode 100644 core/src/test/java/org/opensearch/sql/planner/physical/PaginateOperatorTest.java diff --git a/core/src/main/java/org/opensearch/sql/planner/DefaultImplementor.java b/core/src/main/java/org/opensearch/sql/planner/DefaultImplementor.java index 607a5af983b..9bde4ab6474 100644 --- a/core/src/main/java/org/opensearch/sql/planner/DefaultImplementor.java +++ b/core/src/main/java/org/opensearch/sql/planner/DefaultImplementor.java @@ -12,7 +12,6 @@ import org.opensearch.sql.planner.logical.LogicalFilter; import org.opensearch.sql.planner.logical.LogicalLimit; import org.opensearch.sql.planner.logical.LogicalNested; -import org.opensearch.sql.planner.logical.LogicalPaginate; import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.logical.LogicalPlanNodeVisitor; import org.opensearch.sql.planner.logical.LogicalProject; @@ -29,7 +28,6 @@ import org.opensearch.sql.planner.physical.FilterOperator; import org.opensearch.sql.planner.physical.LimitOperator; import org.opensearch.sql.planner.physical.NestedOperator; -import org.opensearch.sql.planner.physical.PaginateOperator; import org.opensearch.sql.planner.physical.PhysicalPlan; import 
org.opensearch.sql.planner.physical.ProjectOperator; import org.opensearch.sql.planner.physical.RareTopNOperator; @@ -134,11 +132,6 @@ public PhysicalPlan visitLimit(LogicalLimit node, C context) { return new LimitOperator(visitChild(node, context), node.getLimit(), node.getOffset()); } - @Override - public PhysicalPlan visitPaginate(LogicalPaginate plan, C context) { - return new PaginateOperator(visitChild(plan, context), plan.getPageSize()); - } - @Override public PhysicalPlan visitTableScanBuilder(TableScanBuilder plan, C context) { return plan.build(); diff --git a/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/CreatePagingTableScanBuilder.java b/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/CreatePagingTableScanBuilder.java index 3785945374d..c635400c333 100644 --- a/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/CreatePagingTableScanBuilder.java +++ b/core/src/main/java/org/opensearch/sql/planner/optimizer/rule/CreatePagingTableScanBuilder.java @@ -67,6 +67,6 @@ public LogicalPlan apply(LogicalPaginate plan, Captures captures) { var scan = logicalRelation.getTable().createPagedScanBuilder(plan.getPageSize()); relationParent.replaceChildPlans(List.of(scan)); - return plan; + return plan.getChild().get(0); } } diff --git a/core/src/main/java/org/opensearch/sql/planner/physical/PaginateOperator.java b/core/src/main/java/org/opensearch/sql/planner/physical/PaginateOperator.java deleted file mode 100644 index 7601f7006aa..00000000000 --- a/core/src/main/java/org/opensearch/sql/planner/physical/PaginateOperator.java +++ /dev/null @@ -1,74 +0,0 @@ -/* - * Copyright OpenSearch Contributors - * SPDX-License-Identifier: Apache-2.0 - */ - -package org.opensearch.sql.planner.physical; - -import java.util.List; -import lombok.EqualsAndHashCode; -import lombok.Getter; -import lombok.RequiredArgsConstructor; -import org.opensearch.sql.data.model.ExprValue; -import org.opensearch.sql.executor.ExecutionEngine; -import 
org.opensearch.sql.planner.SerializablePlan; - -@EqualsAndHashCode(callSuper = false) -@RequiredArgsConstructor -public class PaginateOperator extends PhysicalPlan implements SerializablePlan { - @Getter - private final PhysicalPlan input; - - @Getter - private final int pageSize; - - /** - * Which page is this? - * May not be necessary in the end. Currently used to increment the "cursor counter" -- - * See usage. - */ - @Getter - private int pageIndex = 0; - - private int numReturned = 0; - - /** - * Page given physical plan, with pageSize elements per page, starting with the given page. - */ - public PaginateOperator(PhysicalPlan input, int pageSize, int pageIndex) { - this.pageSize = pageSize; - this.input = input; - this.pageIndex = pageIndex; - } - - @Override - public R accept(PhysicalPlanNodeVisitor visitor, C context) { - return visitor.visitPaginate(this, context); - } - - @Override - public boolean hasNext() { - return numReturned < pageSize && input.hasNext(); - } - - @Override - public ExprValue next() { - numReturned += 1; - return input.next(); - } - - public List getChild() { - return List.of(input); - } - - @Override - public ExecutionEngine.Schema schema() { - return input.schema(); - } - - /** No need to serialize a PaginateOperator, it actually does nothing - it is a wrapper. 
*/ - @Override - public SerializablePlan getPlanForSerialization() { - return (SerializablePlan) input; - } -} diff --git a/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitor.java b/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitor.java index bc4c0404c46..cb488700a03 100644 --- a/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitor.java +++ b/core/src/main/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitor.java @@ -92,8 +92,4 @@ public R visitAD(PhysicalPlan node, C context) { public R visitML(PhysicalPlan node, C context) { return visitNode(node, context); } - - public R visitPaginate(PaginateOperator node, C context) { - return visitNode(node, context); - } } diff --git a/core/src/test/java/org/opensearch/sql/planner/DefaultImplementorTest.java b/core/src/test/java/org/opensearch/sql/planner/DefaultImplementorTest.java index 768ab279311..bf1464f5f67 100644 --- a/core/src/test/java/org/opensearch/sql/planner/DefaultImplementorTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/DefaultImplementorTest.java @@ -58,7 +58,6 @@ import org.opensearch.sql.planner.logical.LogicalPlan; import org.opensearch.sql.planner.logical.LogicalPlanDSL; import org.opensearch.sql.planner.logical.LogicalRelation; -import org.opensearch.sql.planner.physical.PaginateOperator; import org.opensearch.sql.planner.physical.PhysicalPlan; import org.opensearch.sql.planner.physical.PhysicalPlanDSL; import org.opensearch.sql.storage.Table; @@ -247,11 +246,4 @@ public TableWriteOperator build(PhysicalPlan child) { }; assertEquals(tableWriteOperator, logicalPlan.accept(implementor, null)); } - - @Test - public void visitPaginate_should_build_PaginateOperator_and_keep_page_size() { - var paginate = new LogicalPaginate(42, List.of(values())); - var plan = paginate.accept(implementor, null); - assertEquals(paginate.getPageSize(), ((PaginateOperator) plan).getPageSize()); - } } diff --git 
a/core/src/test/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizerTest.java b/core/src/test/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizerTest.java index 2083fdef9cb..543b261d9ef 100644 --- a/core/src/test/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizerTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/optimizer/LogicalPlanOptimizerTest.java @@ -348,7 +348,7 @@ void paged_table_scan_builder_support_project_push_down_can_apply_its_rule() { var relation = relation("schema", table); assertEquals( - paginate(project(pagedTableScanBuilder), 4), + project(pagedTableScanBuilder), LogicalPlanOptimizer.create().optimize(paginate(project(relation), 4))); } @@ -401,7 +401,7 @@ void table_scan_builder_support_offset_push_down_can_apply_its_rule() { .optimize(new LogicalPaginate(42, List.of(project(relation)))); // `optimized` structure: LogicalPaginate -> LogicalProject -> TableScanBuilder // LogicalRelation replaced by a TableScanBuilder instance - assertEquals(paginate(project(pagedTableScanBuilder), 42), optimized); + assertEquals(project(pagedTableScanBuilder), optimized); } private LogicalPlan optimize(LogicalPlan plan) { diff --git a/core/src/test/java/org/opensearch/sql/planner/physical/PaginateOperatorTest.java b/core/src/test/java/org/opensearch/sql/planner/physical/PaginateOperatorTest.java deleted file mode 100644 index 2405700f10b..00000000000 --- a/core/src/test/java/org/opensearch/sql/planner/physical/PaginateOperatorTest.java +++ /dev/null @@ -1,99 +0,0 @@ -/* - * Copyright OpenSearch Contributors - * SPDX-License-Identifier: Apache-2.0 - */ - - -package org.opensearch.sql.planner.physical; - -import static org.junit.jupiter.api.Assertions.assertEquals; -import static org.junit.jupiter.api.Assertions.assertFalse; -import static org.junit.jupiter.api.Assertions.assertNull; -import static org.junit.jupiter.api.Assertions.assertSame; -import static org.junit.jupiter.api.Assertions.assertThrows; -import static 
org.junit.jupiter.api.Assertions.assertTrue; -import static org.mockito.Mockito.CALLS_REAL_METHODS; -import static org.mockito.Mockito.doNothing; -import static org.mockito.Mockito.mock; -import static org.mockito.Mockito.times; -import static org.mockito.Mockito.verify; -import static org.mockito.Mockito.when; -import static org.mockito.Mockito.withSettings; -import static org.opensearch.sql.data.type.ExprCoreType.INTEGER; -import static org.opensearch.sql.data.type.ExprCoreType.STRING; -import static org.opensearch.sql.planner.physical.PhysicalPlanDSL.project; - -import org.junit.jupiter.api.DisplayNameGeneration; -import org.junit.jupiter.api.DisplayNameGenerator; -import org.junit.jupiter.api.Test; -import org.opensearch.sql.data.model.ExprIntegerValue; -import org.opensearch.sql.expression.DSL; -import org.opensearch.sql.planner.SerializablePlan; - -@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) -public class PaginateOperatorTest extends PhysicalPlanTestBase { - - @Test - public void accept() { - var visitor = new PhysicalPlanNodeVisitor() {}; - assertNull(new PaginateOperator(null, 42).accept(visitor, null)); - } - - @Test - public void hasNext_a_page() { - var plan = mock(PhysicalPlan.class); - when(plan.hasNext()).thenReturn(true); - when(plan.next()).thenReturn(new ExprIntegerValue(42)).thenReturn(null); - var paginate = new PaginateOperator(plan, 1, 1); - assertTrue(paginate.hasNext()); - assertEquals(42, paginate.next().integerValue()); - paginate.next(); - assertFalse(paginate.hasNext()); - assertNull(paginate.next()); - } - - @Test - public void hasNext_no_more_entries() { - var plan = mock(PhysicalPlan.class); - when(plan.hasNext()).thenReturn(false); - var paginate = new PaginateOperator(plan, 1, 1); - assertFalse(paginate.hasNext()); - } - - @Test - public void getChild() { - var plan = mock(PhysicalPlan.class); - var paginate = new PaginateOperator(plan, 1); - assertSame(plan, paginate.getChild().get(0)); - } - - @Test - 
public void open() { - var plan = mock(PhysicalPlan.class); - doNothing().when(plan).open(); - new PaginateOperator(plan, 1).open(); - verify(plan, times(1)).open(); - } - - @Test - public void schema() { - PhysicalPlan project = project(null, - DSL.named("response", DSL.ref("response", INTEGER)), - DSL.named("action", DSL.ref("action", STRING), "act")); - assertEquals(project.schema(), new PaginateOperator(project, 42).schema()); - } - - @Test - public void schema_assert() { - var plan = mock(PhysicalPlan.class, withSettings().defaultAnswer(CALLS_REAL_METHODS)); - assertThrows(Throwable.class, () -> new PaginateOperator(plan, 42).schema()); - } - - @Test - // PaginateOperator implements SerializablePlan, but not being serialized - public void serializable_but_not_serialized() { - var plan = mock(PhysicalPlan.class, withSettings().extraInterfaces(SerializablePlan.class)); - var paginate = new PaginateOperator(plan, 1, 1); - assertSame(plan, paginate.getPlanForSerialization()); - } -} diff --git a/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitorTest.java b/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitorTest.java index 2e6ce64ac62..fb687277ce2 100644 --- a/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitorTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/physical/PhysicalPlanNodeVisitorTest.java @@ -168,14 +168,6 @@ public void test_visitML() { assertNull(physicalPlanNodeVisitor.visitML(plan, null)); } - @Test - public void test_visitPaginate() { - PhysicalPlanNodeVisitor physicalPlanNodeVisitor = - new PhysicalPlanNodeVisitor() {}; - - assertNull(physicalPlanNodeVisitor.visitPaginate(new PaginateOperator(plan, 42), null)); - } - public static class PhysicalPlanPrinter extends PhysicalPlanNodeVisitor { public String print(PhysicalPlan node) { diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtector.java 
b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtector.java index c46b0231a21..9d71cee8c94 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtector.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtector.java @@ -17,7 +17,6 @@ import org.opensearch.sql.planner.physical.FilterOperator; import org.opensearch.sql.planner.physical.LimitOperator; import org.opensearch.sql.planner.physical.NestedOperator; -import org.opensearch.sql.planner.physical.PaginateOperator; import org.opensearch.sql.planner.physical.PhysicalPlan; import org.opensearch.sql.planner.physical.ProjectOperator; import org.opensearch.sql.planner.physical.RareTopNOperator; @@ -65,12 +64,6 @@ public PhysicalPlan visitRename(RenameOperator node, Object context) { return new RenameOperator(visitInput(node.getInput(), context), node.getMapping()); } - @Override - public PhysicalPlan visitPaginate(PaginateOperator node, Object context) { - return new PaginateOperator(visitInput(node.getInput(), context), node.getPageSize(), - node.getPageIndex()); - } - /** * Decorate with {@link ResourceMonitorPlan}. 
*/ diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java index ae7319a223b..c96782abea4 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/OpenSearchExecutionEngineTest.java @@ -50,10 +50,8 @@ import org.opensearch.sql.opensearch.client.OpenSearchClient; import org.opensearch.sql.opensearch.data.value.OpenSearchExprValueFactory; import org.opensearch.sql.opensearch.executor.protector.OpenSearchExecutionProtector; -import org.opensearch.sql.opensearch.request.OpenSearchRequestBuilder; import org.opensearch.sql.opensearch.storage.scan.OpenSearchIndexScan; import org.opensearch.sql.planner.SerializablePlan; -import org.opensearch.sql.planner.physical.PaginateOperator; import org.opensearch.sql.planner.physical.PhysicalPlan; import org.opensearch.sql.storage.TableScanOperator; import org.opensearch.sql.storage.split.Split; @@ -120,7 +118,7 @@ void execute_with_cursor() { List expected = Arrays.asList( tupleValue(of("name", "John", "age", 20)), tupleValue(of("name", "Allen", "age", 30))); - FakePaginatePlan plan = new FakePaginatePlan(new FakePhysicalPlan(expected.iterator()), 10, 0); + var plan = new FakePhysicalPlan(expected.iterator()); when(protector.protect(plan)).thenReturn(plan); OpenSearchExecutionEngine executor = new OpenSearchExecutionEngine(client, protector, @@ -255,49 +253,6 @@ public void onFailure(Exception e) { assertTrue(plan.hasClosed); } - private static class FakePaginatePlan extends PaginateOperator { - private final PhysicalPlan input; - private final int pageSize; - private final int pageIndex; - - public FakePaginatePlan(PhysicalPlan input, int pageSize, int pageIndex) { - super(input, pageSize, pageIndex); - this.input = input; - this.pageSize = pageSize; - 
this.pageIndex = pageIndex; - } - - @Override - public void open() { - input.open(); - } - - @Override - public void close() { - input.close(); - } - - @Override - public void add(Split split) { - input.add(split); - } - - @Override - public boolean hasNext() { - return input.hasNext(); - } - - @Override - public ExprValue next() { - return input.next(); - } - - @Override - public ExecutionEngine.Schema schema() { - return input.schema(); - } - } - @RequiredArgsConstructor private static class FakePhysicalPlan extends TableScanOperator implements SerializablePlan { private final Iterator it; diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtectorTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtectorTest.java index fd52d083815..fe0077914e2 100644 --- a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtectorTest.java +++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/protector/OpenSearchExecutionProtectorTest.java @@ -63,7 +63,6 @@ import org.opensearch.sql.opensearch.setting.OpenSearchSettings; import org.opensearch.sql.opensearch.storage.scan.OpenSearchIndexScan; import org.opensearch.sql.planner.physical.NestedOperator; -import org.opensearch.sql.planner.physical.PaginateOperator; import org.opensearch.sql.planner.physical.PhysicalPlan; import org.opensearch.sql.planner.physical.PhysicalPlanDSL; @@ -335,13 +334,6 @@ public void testVisitNested() { executionProtector.visitNested(nestedOperator, values(emptyList()))); } - @Test - public void visitPaginate() { - var paginate = new PaginateOperator(values(List.of()), 42); - assertEquals(executionProtector.protect(paginate), - executionProtector.visitPaginate(paginate, null)); - } - PhysicalPlan resourceMonitor(PhysicalPlan input) { return new ResourceMonitorPlan(input, resourceMonitor); } From 1f6cf70f1f4386deacf383c5b56e80b15aca88c9 
Mon Sep 17 00:00:00 2001 From: Yury-Fridlyand Date: Mon, 17 Apr 2023 13:45:38 -0700 Subject: [PATCH 13/17] Remove `PaginatedPlan` - move logic to `QueryPlan`. Signed-off-by: Yury-Fridlyand --- .../execution/ContinuePaginatedPlan.java | 1 - .../sql/executor/execution/PaginatedPlan.java | 53 ---------- .../sql/executor/execution/QueryPlan.java | 35 ++++++- .../executor/execution/QueryPlanFactory.java | 2 +- .../executor/execution/PaginatedPlanTest.java | 99 ------------------- .../execution/QueryPlanFactoryTest.java | 2 +- .../sql/executor/execution/QueryPlanTest.java | 65 +++++++++++- 7 files changed, 97 insertions(+), 160 deletions(-) delete mode 100644 core/src/main/java/org/opensearch/sql/executor/execution/PaginatedPlan.java delete mode 100644 core/src/test/java/org/opensearch/sql/executor/execution/PaginatedPlanTest.java diff --git a/core/src/main/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlan.java b/core/src/main/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlan.java index ffbf2976878..eda65aba2da 100644 --- a/core/src/main/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlan.java +++ b/core/src/main/java/org/opensearch/sql/executor/execution/ContinuePaginatedPlan.java @@ -15,7 +15,6 @@ /** * ContinuePaginatedPlan represents a cursor request. * It returns subsequent pages to the user (2nd page and all next).
- * {@link PaginatedPlan} */ public class ContinuePaginatedPlan extends AbstractPlan { diff --git a/core/src/main/java/org/opensearch/sql/executor/execution/PaginatedPlan.java b/core/src/main/java/org/opensearch/sql/executor/execution/PaginatedPlan.java deleted file mode 100644 index 5e217f13200..00000000000 --- a/core/src/main/java/org/opensearch/sql/executor/execution/PaginatedPlan.java +++ /dev/null @@ -1,53 +0,0 @@ -/* - * Copyright OpenSearch Contributors - * SPDX-License-Identifier: Apache-2.0 - */ - -package org.opensearch.sql.executor.execution; - -import org.apache.commons.lang3.NotImplementedException; -import org.opensearch.sql.ast.tree.Paginate; -import org.opensearch.sql.ast.tree.UnresolvedPlan; -import org.opensearch.sql.common.response.ResponseListener; -import org.opensearch.sql.executor.ExecutionEngine; -import org.opensearch.sql.executor.QueryId; -import org.opensearch.sql.executor.QueryService; - -/** - * PaginatedPlan represents a page request. Dislike a regular QueryPlan, - * it returns paged response to the user and cursor, which allows to query - * next page. - * {@link ContinuePaginatedPlan} - */ -public class PaginatedPlan extends AbstractPlan { - final UnresolvedPlan plan; - final int fetchSize; - final QueryService queryService; - final ResponseListener - queryResponseResponseListener; - - /** - * Create an abstract plan that can start paging a query. 
- */ - public PaginatedPlan(QueryId queryId, UnresolvedPlan plan, int fetchSize, - QueryService queryService, - ResponseListener - queryResponseResponseListener) { - super(queryId); - this.plan = plan; - this.fetchSize = fetchSize; - this.queryService = queryService; - this.queryResponseResponseListener = queryResponseResponseListener; - } - - @Override - public void execute() { - queryService.execute(new Paginate(fetchSize, plan), queryResponseResponseListener); - } - - @Override - public void explain(ResponseListener listener) { - listener.onFailure(new NotImplementedException( - "`explain` feature for paginated requests is not implemented yet.")); - } -} diff --git a/core/src/main/java/org/opensearch/sql/executor/execution/QueryPlan.java b/core/src/main/java/org/opensearch/sql/executor/execution/QueryPlan.java index af5c032d493..df9bc0c7346 100644 --- a/core/src/main/java/org/opensearch/sql/executor/execution/QueryPlan.java +++ b/core/src/main/java/org/opensearch/sql/executor/execution/QueryPlan.java @@ -8,6 +8,9 @@ package org.opensearch.sql.executor.execution; +import java.util.Optional; +import org.apache.commons.lang3.NotImplementedException; +import org.opensearch.sql.ast.tree.Paginate; import org.opensearch.sql.ast.tree.UnresolvedPlan; import org.opensearch.sql.common.response.ResponseListener; import org.opensearch.sql.executor.ExecutionEngine; @@ -33,25 +36,51 @@ public class QueryPlan extends AbstractPlan { protected final ResponseListener listener; - /** constructor. */ + protected final Optional pageSize; + + /** Constructor. */ + public QueryPlan( + QueryId queryId, + UnresolvedPlan plan, + QueryService queryService, + ResponseListener listener) { + super(queryId); + this.plan = plan; + this.queryService = queryService; + this.listener = listener; + this.pageSize = Optional.empty(); + } + + /** Constructor with page size. 
*/ public QueryPlan( QueryId queryId, UnresolvedPlan plan, + int pageSize, QueryService queryService, ResponseListener listener) { super(queryId); this.plan = plan; this.queryService = queryService; this.listener = listener; + this.pageSize = Optional.of(pageSize); } @Override public void execute() { - queryService.execute(plan, listener); + if (pageSize.isPresent()) { + queryService.execute(new Paginate(pageSize.get(), plan), listener); + } else { + queryService.execute(plan, listener); + } } @Override public void explain(ResponseListener listener) { - queryService.explain(plan, listener); + if (pageSize.isPresent()) { + listener.onFailure(new NotImplementedException( + "`explain` feature for paginated requests is not implemented yet.")); + } else { + queryService.explain(plan, listener); + } } } diff --git a/core/src/main/java/org/opensearch/sql/executor/execution/QueryPlanFactory.java b/core/src/main/java/org/opensearch/sql/executor/execution/QueryPlanFactory.java index bdd978cccec..18455c2a021 100644 --- a/core/src/main/java/org/opensearch/sql/executor/execution/QueryPlanFactory.java +++ b/core/src/main/java/org/opensearch/sql/executor/execution/QueryPlanFactory.java @@ -95,7 +95,7 @@ public AbstractPlan visitQuery( if (node.getFetchSize() > 0) { if (planSerializer.canConvertToCursor(node.getPlan())) { - return new PaginatedPlan(QueryId.queryId(), node.getPlan(), node.getFetchSize(), + return new QueryPlan(QueryId.queryId(), node.getPlan(), node.getFetchSize(), queryService, context.getLeft().get()); } else { diff --git a/core/src/test/java/org/opensearch/sql/executor/execution/PaginatedPlanTest.java b/core/src/test/java/org/opensearch/sql/executor/execution/PaginatedPlanTest.java deleted file mode 100644 index 495dcbb050e..00000000000 --- a/core/src/test/java/org/opensearch/sql/executor/execution/PaginatedPlanTest.java +++ /dev/null @@ -1,99 +0,0 @@ -/* - * Copyright OpenSearch Contributors - * SPDX-License-Identifier: Apache-2.0 - */ - -package 
org.opensearch.sql.executor.execution; - -import static org.junit.jupiter.api.Assertions.assertNotNull; -import static org.junit.jupiter.api.Assertions.assertTrue; -import static org.junit.jupiter.api.Assertions.fail; -import static org.mockito.ArgumentMatchers.any; -import static org.mockito.Mockito.mock; -import static org.mockito.Mockito.when; - -import org.apache.commons.lang3.NotImplementedException; -import org.junit.jupiter.api.BeforeAll; -import org.junit.jupiter.api.DisplayNameGeneration; -import org.junit.jupiter.api.DisplayNameGenerator; -import org.junit.jupiter.api.Test; -import org.opensearch.sql.analysis.Analyzer; -import org.opensearch.sql.ast.tree.UnresolvedPlan; -import org.opensearch.sql.common.response.ResponseListener; -import org.opensearch.sql.executor.DefaultExecutionEngine; -import org.opensearch.sql.executor.ExecutionEngine; -import org.opensearch.sql.executor.QueryId; -import org.opensearch.sql.executor.QueryService; -import org.opensearch.sql.planner.Planner; -import org.opensearch.sql.planner.logical.LogicalPaginate; -import org.opensearch.sql.planner.physical.PhysicalPlan; - -@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) -public class PaginatedPlanTest { - - private static QueryService queryService; - - /** - * Initialize the mocks. 
- */ - @BeforeAll - public static void setUp() { - var analyzer = mock(Analyzer.class); - when(analyzer.analyze(any(), any())).thenReturn(mock(LogicalPaginate.class)); - var planner = mock(Planner.class); - when(planner.plan(any())).thenReturn(mock(PhysicalPlan.class)); - queryService = new QueryService(analyzer, new DefaultExecutionEngine(), planner); - } - - @Test - public void can_execute_plan() { - var listener = new ResponseListener() { - @Override - public void onResponse(ExecutionEngine.QueryResponse response) { - assertNotNull(response); - } - - @Override - public void onFailure(Exception e) { - fail(); - } - }; - var plan = new PaginatedPlan(QueryId.queryId(), mock(UnresolvedPlan.class), 10, - queryService, listener); - plan.execute(); - } - - @Test - // Same as previous test, but with incomplete PaginatedQueryService - public void can_handle_error_while_executing_plan() { - var listener = new ResponseListener() { - @Override - public void onResponse(ExecutionEngine.QueryResponse response) { - fail(); - } - - @Override - public void onFailure(Exception e) { - assertNotNull(e); - } - }; - var plan = new PaginatedPlan(QueryId.queryId(), mock(UnresolvedPlan.class), 10, - new QueryService(null, new DefaultExecutionEngine(), null), listener); - plan.execute(); - } - - @Test - public void explain_is_not_supported() { - new PaginatedPlan(null, null, 0, null, null).explain(new ResponseListener<>() { - @Override - public void onResponse(ExecutionEngine.ExplainResponse response) { - fail(); - } - - @Override - public void onFailure(Exception e) { - assertTrue(e instanceof NotImplementedException); - } - }); - } -} diff --git a/core/src/test/java/org/opensearch/sql/executor/execution/QueryPlanFactoryTest.java b/core/src/test/java/org/opensearch/sql/executor/execution/QueryPlanFactoryTest.java index 5a4c7e9814b..6bdbf1c4c9d 100644 --- a/core/src/test/java/org/opensearch/sql/executor/execution/QueryPlanFactoryTest.java +++ 
b/core/src/test/java/org/opensearch/sql/executor/execution/QueryPlanFactoryTest.java @@ -130,7 +130,7 @@ public void createQueryWithFetchSizeWhichCanBePaged() { Statement query = new Query(plan, 10); AbstractPlan queryExecution = factory.createContinuePaginatedPlan(query, Optional.of(queryListener), Optional.empty()); - assertTrue(queryExecution instanceof PaginatedPlan); + assertTrue(queryExecution instanceof QueryPlan); } @Test diff --git a/core/src/test/java/org/opensearch/sql/executor/execution/QueryPlanTest.java b/core/src/test/java/org/opensearch/sql/executor/execution/QueryPlanTest.java index 834db76996e..a0a98e2be72 100644 --- a/core/src/test/java/org/opensearch/sql/executor/execution/QueryPlanTest.java +++ b/core/src/test/java/org/opensearch/sql/executor/execution/QueryPlanTest.java @@ -8,21 +8,30 @@ package org.opensearch.sql.executor.execution; +import static org.junit.jupiter.api.Assertions.assertNotNull; +import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.junit.jupiter.api.Assertions.fail; import static org.mockito.ArgumentMatchers.any; +import static org.mockito.Mockito.mock; import static org.mockito.Mockito.times; import static org.mockito.Mockito.verify; +import org.apache.commons.lang3.NotImplementedException; +import org.junit.jupiter.api.DisplayNameGeneration; +import org.junit.jupiter.api.DisplayNameGenerator; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; import org.opensearch.sql.ast.tree.UnresolvedPlan; import org.opensearch.sql.common.response.ResponseListener; +import org.opensearch.sql.executor.DefaultExecutionEngine; import org.opensearch.sql.executor.ExecutionEngine; import org.opensearch.sql.executor.QueryId; import org.opensearch.sql.executor.QueryService; @ExtendWith(MockitoExtension.class) +@DisplayNameGeneration(DisplayNameGenerator.ReplaceUnderscores.class) class QueryPlanTest { @Mock @@ 
-41,7 +50,7 @@ class QueryPlanTest { private ResponseListener queryListener; @Test - public void execute() { + public void execute_no_page_size() { QueryPlan query = new QueryPlan(queryId, plan, queryService, queryListener); query.execute(); @@ -49,10 +58,62 @@ public void execute() { } @Test - public void explain() { + public void explain_no_page_size() { QueryPlan query = new QueryPlan(queryId, plan, queryService, queryListener); query.explain(explainListener); verify(queryService, times(1)).explain(plan, explainListener); } + + @Test + public void can_execute_paginated_plan() { + var listener = new ResponseListener() { + @Override + public void onResponse(ExecutionEngine.QueryResponse response) { + assertNotNull(response); + } + + @Override + public void onFailure(Exception e) { + fail(); + } + }; + var plan = new QueryPlan(QueryId.queryId(), mock(UnresolvedPlan.class), 10, + queryService, listener); + plan.execute(); + } + + @Test + // Same as previous test, but with incomplete QueryService + public void can_handle_error_while_executing_plan() { + var listener = new ResponseListener() { + @Override + public void onResponse(ExecutionEngine.QueryResponse response) { + fail(); + } + + @Override + public void onFailure(Exception e) { + assertNotNull(e); + } + }; + var plan = new QueryPlan(QueryId.queryId(), mock(UnresolvedPlan.class), 10, + new QueryService(null, new DefaultExecutionEngine(), null), listener); + plan.execute(); + } + + @Test + public void explain_is_not_supported_for_pagination() { + new QueryPlan(null, null, 0, null, null).explain(new ResponseListener<>() { + @Override + public void onResponse(ExecutionEngine.ExplainResponse response) { + fail(); + } + + @Override + public void onFailure(Exception e) { + assertTrue(e instanceof NotImplementedException); + } + }); + } } From e745d2db6a9af37209cb76dd68b16135651a538e Mon Sep 17 00:00:00 2001 From: Yury-Fridlyand Date: Mon, 17 Apr 2023 14:11:09 -0700 Subject: [PATCH 14/17] Remove default 
implementations from `SerializablePlan`. Signed-off-by: Yury-Fridlyand --- .../sql/planner/SerializablePlan.java | 11 ++--------- .../sql/planner/SerializablePlanTest.java | 10 ---------- .../protector/ResourceMonitorPlan.java | 18 +++++++++++++++++- .../executor/ResourceMonitorPlanTest.java | 7 +++++++ 4 files changed, 26 insertions(+), 20 deletions(-) diff --git a/core/src/main/java/org/opensearch/sql/planner/SerializablePlan.java b/core/src/main/java/org/opensearch/sql/planner/SerializablePlan.java index 220408b67d5..487b1da6bde 100644 --- a/core/src/main/java/org/opensearch/sql/planner/SerializablePlan.java +++ b/core/src/main/java/org/opensearch/sql/planner/SerializablePlan.java @@ -9,7 +9,6 @@ import java.io.IOException; import java.io.ObjectInput; import java.io.ObjectOutput; -import org.apache.commons.lang3.NotImplementedException; import org.opensearch.sql.executor.pagination.PlanSerializer; /** @@ -34,10 +33,7 @@ public interface SerializablePlan extends Externalizable { * Argument is an instance of {@link PlanSerializer.CursorDeserializationStream}. */ @Override - default void readExternal(ObjectInput in) throws IOException, ClassNotFoundException { - throw new NotImplementedException(String.format("`readExternal` is not implemented in %s", - getClass().getSimpleName())); - } + void readExternal(ObjectInput in) throws IOException, ClassNotFoundException; /** * Each plan which has as a child plan should do. 
@@ -46,10 +42,7 @@ default void readExternal(ObjectInput in) throws IOException, ClassNotFoundExcep * } */ @Override - default void writeExternal(ObjectOutput out) throws IOException { - throw new NotImplementedException(String.format("`readExternal` is not implemented in %s", - getClass().getSimpleName())); - } + void writeExternal(ObjectOutput out) throws IOException; /** * Override to return child or delegated plan, so parent plan should skip this one diff --git a/core/src/test/java/org/opensearch/sql/planner/SerializablePlanTest.java b/core/src/test/java/org/opensearch/sql/planner/SerializablePlanTest.java index e40ce5031b8..8073445dc08 100644 --- a/core/src/test/java/org/opensearch/sql/planner/SerializablePlanTest.java +++ b/core/src/test/java/org/opensearch/sql/planner/SerializablePlanTest.java @@ -22,16 +22,6 @@ public class SerializablePlanTest { @Mock(answer = CALLS_REAL_METHODS) SerializablePlan plan; - @Test - void writeExternal_throws() { - assertThrows(Throwable.class, () -> plan.writeExternal(null)); - } - - @Test - void readExternal_throws() { - assertThrows(Throwable.class, () -> plan.readExternal(null)); - } - @Test void getPlanForSerialization_defaults_to_self() { assertSame(plan, plan.getPlanForSerialization()); diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/ResourceMonitorPlan.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/ResourceMonitorPlan.java index 78283307510..3cc10e21198 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/ResourceMonitorPlan.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/ResourceMonitorPlan.java @@ -6,6 +6,9 @@ package org.opensearch.sql.opensearch.executor.protector; +import java.io.IOException; +import java.io.ObjectInput; +import java.io.ObjectOutput; import java.util.List; import lombok.EqualsAndHashCode; import lombok.RequiredArgsConstructor; @@ -89,9 +92,22 @@ public long 
getTotalHits() {
     return delegate.getTotalHits();
   }
-
   @Override
   public SerializablePlan getPlanForSerialization() {
     return (SerializablePlan) delegate;
   }
+
+  /**
+   * These two methods should never be called. They are called only if a plan higher up in the
+   * tree failed to call {@link #getPlanForSerialization}.
+   */
+  @Override
+  public void readExternal(ObjectInput in) throws IOException, ClassNotFoundException {
+    throw new UnsupportedOperationException();
+  }
+
+  @Override
+  public void writeExternal(ObjectOutput out) throws IOException {
+    throw new UnsupportedOperationException();
+  }
 }
diff --git a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/ResourceMonitorPlanTest.java b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/ResourceMonitorPlanTest.java
index 9ff7c093201..0b9f302ceba 100644
--- a/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/ResourceMonitorPlanTest.java
+++ b/opensearch/src/test/java/org/opensearch/sql/opensearch/executor/ResourceMonitorPlanTest.java
@@ -123,4 +123,11 @@ void getPlanForSerialization() {
     monitorPlan = new ResourceMonitorPlan(plan, resourceMonitor);
     assertEquals(plan, monitorPlan.getPlanForSerialization());
   }
+
+  @Test
+  void notSerializable() {
+    // ResourceMonitorPlan shouldn't be serialized; an attempt should throw an exception
+    assertThrows(UnsupportedOperationException.class, () -> monitorPlan.writeExternal(null));
+    assertThrows(UnsupportedOperationException.class, () -> monitorPlan.readExternal(null));
+  }
 }

From 0f66341e636839d31eaa4e4adde96edf5cf2ccd2 Mon Sep 17 00:00:00 2001
From: Yury-Fridlyand
Date: Wed, 19 Apr 2023 13:48:37 -0700
Subject: [PATCH 15/17] Add a doc.
Signed-off-by: Yury-Fridlyand
---
 docs/dev/Pagination-v2.md | 281 ++++++++++++++++++++++++++++++++++++++
 1 file changed, 281 insertions(+)
 create mode 100644 docs/dev/Pagination-v2.md

diff --git a/docs/dev/Pagination-v2.md b/docs/dev/Pagination-v2.md
new file mode 100644
index 00000000000..0541caf485b
--- /dev/null
+++ b/docs/dev/Pagination-v2.md
@@ -0,0 +1,281 @@
+# Pagination in v2 Engine
+
+Pagination allows a SQL plugin client to retrieve arbitrarily large result sets one subset at a time.
+
+A cursor is a SQL abstraction for pagination. A client can open a cursor, retrieve a subset of data given a cursor, and close a cursor.
+
+Currently, the SQL plugin does not provide SQL cursor syntax. However, the SQL REST endpoint can return results a page at a time. This feature is used by the JDBC and ODBC drivers.
+
+
+# Scope
+Currently, the V2 engine supports pagination only for simple `SELECT * FROM <table>` queries without any other clauses like `WHERE` or `ORDER BY`.
+
+# Demo
+https://user-images.githubusercontent.com/88679692/224208630-8d38d833-abf8-4035-8d15-d5fb4382deca.mp4
+
+# REST API
+## Initial Query Request
+```
+POST /_plugins/_sql
+{
+    "query" : "...",
+    "fetch_size": N
+}
+```
+
+Response:
+```
+{
+    "cursor": /* cursor_id */,
+    "datarows": [
+        // ...
+    ],
+    "schema" : [
+        // ...
+    ]
+}
+```
+`query` is a DQL statement. `fetch_size` is a positive integer indicating the number of rows to return in each page.
+
+If `query` is a DML statement, pagination does not apply; the `fetch_size` parameter is ignored and a cursor is not created. This is the existing behaviour of the v1 engine.
+
+The client receives an [error response](#error-response) if:
+- `fetch_size` is not a positive integer, or
+- evaluating `query` results in a server-side error.
+
+## Next Page Request
+```
+POST /_plugins/_sql
+{
+    "cursor": "<cursor_id>"
+}
+```
+Similarly to the v1 engine, the response object is the same as the initial response if this is not the last page.
+
+`cursor_id` will be different with each request.
+
+If this is the last page, the `cursor` property is omitted. The cursor is closed automatically.
+
+The client will receive an [error response](#error-response) if executing this request results in an OpenSearch or SQL plug-in error.
+
+## Cursor Keep Alive Timeout
+Each cursor has a keep alive timer associated with it. When the timer runs out, the cursor is closed by OpenSearch.
+
+This timer is reset every time a page is retrieved.
+
+The client will receive an [error response](#error-response) if it sends a cursor request for an expired cursor.
+
+## Error Response
+The client will receive an error response if any of the above REST calls result in a server-side error.
+
+The response object has the following format:
+```json5
+{
+    "error": {
+        "details": <string>,
+        "reason": <string>,
+        "type": <string>
+    },
+    "status": <integer>
+}
+```
+
+`details`, `reason`, and `type` properties are string values.
+The exact values will depend on the error state encountered.
+`status` is an HTTP status code.
+
+## OpenSearch Data Retrieval Strategy
+
+OpenSearch provides several data retrieval APIs that are optimized for different use cases.
+
+At this time, the SQL plugin uses the simple search API and the scroll API.
+
+The simple retrieval API returns at most `max_result_window` documents. `max_result_window` is an index setting.
+
+Scroll API requests return all documents but can incur high memory costs on the OpenSearch coordinating node.
+
+An efficient implementation of pagination needs to be aware of the retrieval API used. Each retrieval strategy will be considered separately.
+
+The discussion below uses *under max_result_window* to refer to scenarios that can be implemented with the simple retrieval API and *over max_result_window* for scenarios that require the scroll API.
+
+## SQL Node Load Balancing
+The V2 SQL engine supports *sql node load balancing* -- a cursor request can be routed to any SQL node in a cluster. This is achieved by encoding all data necessary to retrieve the next page in the `cursor_id`.
+
+## Design Diagrams
+New code workflows are highlighted.
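The cursor round-trip described above under *SQL Node Load Balancing* can be sketched as follows. This is a minimal illustrative sketch only, not the plugin's actual `PlanSerializer` API: it assumes Java serialization of some resumable plan state, gzip compression (the "Zip to reduce size" step), and Base64 encoding to produce a self-contained `cursor_id` that any SQL node can decode. The class and method names (`CursorCodecSketch`, `encode`, `decode`) are hypothetical.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.Base64;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

// Hypothetical sketch of a self-contained cursor: serialize resumable state,
// compress it, and Base64-encode it so any SQL node in the cluster can resume
// pagination from the cursor_id alone.
class CursorCodecSketch {

  static String encode(Serializable planState) throws IOException {
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    try (ObjectOutputStream out = new ObjectOutputStream(new GZIPOutputStream(bytes))) {
      out.writeObject(planState); // dump the state needed to build the next page
    }
    return Base64.getEncoder().encodeToString(bytes.toByteArray());
  }

  static Object decode(String cursorId) throws IOException, ClassNotFoundException {
    byte[] raw = Base64.getDecoder().decode(cursorId);
    try (ObjectInputStream in =
        new ObjectInputStream(new GZIPInputStream(new ByteArrayInputStream(raw)))) {
      return in.readObject(); // restore the state on whichever node got the request
    }
  }

  public static void main(String[] args) throws Exception {
    String cursor = encode("scroll-id-and-page-state");
    System.out.println(decode(cursor)); // prints: scroll-id-and-page-state
  }
}
```

Because the encoded string carries everything needed to resume, no per-cursor server-side session state has to be pinned to a particular node.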
+ +### First page +```mermaid +sequenceDiagram + participant SQLService + participant QueryPlanFactory + participant ResponseListener + participant ResponseFormatter + participant CanPaginateVisitor + participant QueryService + participant Planner + participant CreatePagingTableScanBuilder + participant OpenSearchExecutionEngine + participant PlanSerializer + participant Physical Plan Tree + +SQLService->>QueryPlanFactory:execute + critical + QueryPlanFactory->>CanPaginateVisitor:canConvertToCursor + CanPaginateVisitor-->>QueryPlanFactory:true + end + QueryPlanFactory->>QueryService:execute + QueryService->>Planner:optimize + critical + Planner->>CreatePagingTableScanBuilder:apply + CreatePagingTableScanBuilder-->>QueryService:paged index scan + end + QueryService->>OpenSearchExecutionEngine:execute + Note over OpenSearchExecutionEngine: iterate result set + critical Serialization + OpenSearchExecutionEngine->>PlanSerializer:convertToCursor + PlanSerializer-->>OpenSearchExecutionEngine:cursor + end + critical + OpenSearchExecutionEngine->>Physical Plan Tree:getTotalHits + Physical Plan Tree-->>OpenSearchExecutionEngine:total hits + end + OpenSearchExecutionEngine-->>ResponseListener:QueryResponse + ResponseListener->>ResponseFormatter:format with cursor +``` + +### Second page +```mermaid +sequenceDiagram + participant SQLService + participant QueryPlanFactory + participant ResponseListener + participant ResponseFormatter + participant QueryService + participant OpenSearchExecutionEngine + participant PlanSerializer + participant Physical Plan Tree + +SQLService->>QueryPlanFactory:execute + QueryPlanFactory->>QueryService:execute + critical Deserialization + QueryService->>PlanSerializer:convertToPlan + PlanSerializer-->>QueryService:Physical plan tree + end + QueryService->>OpenSearchExecutionEngine:execute + Note over OpenSearchExecutionEngine: iterate result set + critical Serialization + OpenSearchExecutionEngine->>PlanSerializer:convertToCursor + 
PlanSerializer-->>OpenSearchExecutionEngine:cursor + end + critical + OpenSearchExecutionEngine->>Physical Plan Tree:getTotalHits + Physical Plan Tree-->>OpenSearchExecutionEngine:total hits + end + OpenSearchExecutionEngine-->>ResponseListener:QueryResponse + ResponseListener->>ResponseFormatter:format with cursor +``` +### Legacy Engine Fallback +```mermaid +sequenceDiagram + participant RestSQLQueryAction + participant Legacy Engine + participant SQLService + participant QueryPlanFactory + participant CanPaginateVisitor + +RestSQLQueryAction->>SQLService:prepareRequest + SQLService->>QueryPlanFactory:execute + critical V2 support check + QueryPlanFactory->>CanPaginateVisitor:createContinuePaginatedPlan + CanPaginateVisitor-->>QueryPlanFactory:false + QueryPlanFactory-->>RestSQLQueryAction:UnsupportedCursorRequestException + end + RestSQLQueryAction->>Legacy Engine:accept +``` + +### Serialization +```mermaid +sequenceDiagram + participant PlanSerializer + participant ProjectOperator + participant ResourceMonitorPlan + participant OpenSearchPagedIndexScan + participant OpenSearchScrollRequest + participant ContinuePageRequest + +PlanSerializer->>ProjectOperator:getPlanForSerialization + ProjectOperator-->>PlanSerializer:this +PlanSerializer->>ProjectOperator:serialize + Note over ProjectOperator: dump private fields + ProjectOperator->>ResourceMonitorPlan:getPlanForSerialization + ResourceMonitorPlan-->>ProjectOperator:delegate + ProjectOperator->>OpenSearchPagedIndexScan:serialize + Note over OpenSearchPagedIndexScan: dump private fields + alt First page + OpenSearchPagedIndexScan->>OpenSearchScrollRequest:toCursor + OpenSearchScrollRequest-->>OpenSearchPagedIndexScan:scroll ID + else Subsequent page + OpenSearchPagedIndexScan->>ContinuePageRequest:toCursor + ContinuePageRequest-->>OpenSearchPagedIndexScan:scroll ID + end + Note over ResourceMonitorPlan: ResourceMonitorPlan
is not serialized
+    OpenSearchPagedIndexScan-->>ProjectOperator:serialized
+    ProjectOperator-->>PlanSerializer:serialized
+Note over PlanSerializer: Zip to reduce size
+```
+
+### Deserialization
+```mermaid
+sequenceDiagram
+    participant PlanSerializer
+    participant Deserialization Stream
+    participant ProjectOperator
+    participant OpenSearchPagedIndexScan
+    participant ContinuePageRequest
+
+Note over PlanSerializer: Unzip
+PlanSerializer->>Deserialization Stream:deserialize
+    Deserialization Stream->>ProjectOperator:create new
+    Note over ProjectOperator: load private fields
+    ProjectOperator-->>Deserialization Stream:deserialize input
+    Deserialization Stream->>OpenSearchPagedIndexScan:create new
+    OpenSearchPagedIndexScan-->>Deserialization Stream:resolve engine
+    Deserialization Stream->>OpenSearchPagedIndexScan:OpenSearchStorageEngine
+    Note over OpenSearchPagedIndexScan: load private fields
+    OpenSearchPagedIndexScan->>ContinuePageRequest:create new
+    ContinuePageRequest-->>OpenSearchPagedIndexScan:created
+    OpenSearchPagedIndexScan-->>ProjectOperator:deserialized
+    ProjectOperator-->>PlanSerializer:deserialized
+```
+
+### Total Hits
+
+Total Hits is the number of rows matching the search criteria; with `select *` queries it is equal to the number of rows (docs) in the table (index).
+Example:
+Paging through `SELECT * FROM calcs` (17 rows) with `fetch_size = 5` returns:
+
+* Page 1: total hits = 17, result size = 5, cursor
+* Page 2: total hits = 17, result size = 5, cursor
+* Page 3: total hits = 17, result size = 5, cursor
+* Page 4: total hits = 17, result size = 2, cursor
+* Page 5: total hits = 0, result size = 0
+
+The default implementation of `getTotalHits` in a physical plan iterates over child plans down the tree and returns the maximum value, or 0 if there are no children.
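The default recursion just described — every operator asks its children and keeps the maximum, while the leaf scan reports the value stored from the search response — can be sketched as follows. The class names (`PlanNode`, `IndexScanSketch`, `ProjectSketch`, `TotalHitsDemo`) are hypothetical stand-ins for the actual `PhysicalPlan` hierarchy, not the plugin's real classes.

```java
import java.util.List;

// Hypothetical stand-in for PhysicalPlan: the default getTotalHits() walks the
// children and returns the maximum, or 0 when there are no children.
abstract class PlanNode {
  abstract List<PlanNode> getChild();

  long getTotalHits() {
    return getChild().stream().mapToLong(PlanNode::getTotalHits).max().orElse(0);
  }
}

// Leaf scan: overrides the default and reports the value stored from the
// search response, as OpenSearchPagedIndexScan does in the diagram below.
class IndexScanSketch extends PlanNode {
  private final long totalHits;

  IndexScanSketch(long totalHits) {
    this.totalHits = totalHits;
  }

  @Override
  List<PlanNode> getChild() {
    return List.of();
  }

  @Override
  long getTotalHits() {
    return totalHits;
  }
}

// Intermediate operator: relies entirely on the default recursion.
class ProjectSketch extends PlanNode {
  private final PlanNode input;

  ProjectSketch(PlanNode input) {
    this.input = input;
  }

  @Override
  List<PlanNode> getChild() {
    return List.of(input);
  }
}

class TotalHitsDemo {
  public static void main(String[] args) {
    PlanNode plan = new ProjectSketch(new IndexScanSketch(17));
    System.out.println(plan.getTotalHits()); // prints: 17
  }
}
```

With this shape, intermediate operators need no total-hits bookkeeping of their own; the value stored at the scan propagates up the tree unchanged.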
+ +```mermaid +sequenceDiagram + participant OpenSearchExecutionEngine + participant ProjectOperator + participant ResourceMonitorPlan + participant OpenSearchPagedIndexScan + +OpenSearchExecutionEngine->>ProjectOperator:getTotalHits + Note over ProjectOperator: default implementation + ProjectOperator->>ResourceMonitorPlan:getTotalHits + Note over ResourceMonitorPlan: call to delegate + ResourceMonitorPlan->>OpenSearchPagedIndexScan:getTotalHits + Note over OpenSearchPagedIndexScan: use stored value from the search response + OpenSearchPagedIndexScan-->>ResourceMonitorPlan:value + ResourceMonitorPlan-->>ProjectOperator:value + ProjectOperator-->>OpenSearchExecutionEngine:value +``` From 7dd445cde04d7a20ab1b8ea6c950492e17ddc171 Mon Sep 17 00:00:00 2001 From: Yury-Fridlyand Date: Wed, 19 Apr 2023 15:47:04 -0700 Subject: [PATCH 16/17] Update design graphs. Signed-off-by: Yury-Fridlyand --- docs/dev/Pagination-v2.md | 146 ++++++++++++++++++++------------------ 1 file changed, 76 insertions(+), 70 deletions(-) diff --git a/docs/dev/Pagination-v2.md b/docs/dev/Pagination-v2.md index 0541caf485b..6e2f3f36d88 100644 --- a/docs/dev/Pagination-v2.md +++ b/docs/dev/Pagination-v2.md @@ -15,7 +15,7 @@ https://user-images.githubusercontent.com/88679692/224208630-8d38d833-abf8-4035- # REST API ## Initial Query Request -``` +```json POST /_plugins/_sql { "query" : "...", @@ -24,12 +24,12 @@ POST /_plugins/_sql ``` Response: -``` +```json { "cursor": /* cursor_id */, "datarows": [ // ... - ], + ], "schema" : [ // ... ] @@ -44,7 +44,7 @@ The client receives an (error response](#error-response) if: - evaluating `query` results in a server-side error. ## Next Page Request -``` +```json POST /_plugins/_sql { "cursor": "" @@ -69,7 +69,7 @@ The client will receive an [error response](#error-response) if it sends a curso The client will receive an error response if any of the above REST calls result in an server-side error. 
The response object has the following format: -```json5 +```json { "error": { "details": , @@ -108,8 +108,6 @@ New code workflows are highlighted. sequenceDiagram participant SQLService participant QueryPlanFactory - participant ResponseListener - participant ResponseFormatter participant CanPaginateVisitor participant QueryService participant Planner @@ -118,29 +116,31 @@ sequenceDiagram participant PlanSerializer participant Physical Plan Tree -SQLService->>QueryPlanFactory:execute +SQLService->>+QueryPlanFactory: execute critical - QueryPlanFactory->>CanPaginateVisitor:canConvertToCursor - CanPaginateVisitor-->>QueryPlanFactory:true + QueryPlanFactory->>+CanPaginateVisitor: canConvertToCursor + CanPaginateVisitor-->>-QueryPlanFactory: true end - QueryPlanFactory->>QueryService:execute - QueryService->>Planner:optimize + QueryPlanFactory->>+QueryService: execute + QueryService->>+Planner: optimize critical - Planner->>CreatePagingTableScanBuilder:apply - CreatePagingTableScanBuilder-->>QueryService:paged index scan + Planner->>+CreatePagingTableScanBuilder: apply + CreatePagingTableScanBuilder-->>-Planner: paged index scan end - QueryService->>OpenSearchExecutionEngine:execute + Planner-->>-QueryService: Logical Plan Tree + QueryService->>+OpenSearchExecutionEngine: execute Note over OpenSearchExecutionEngine: iterate result set critical Serialization - OpenSearchExecutionEngine->>PlanSerializer:convertToCursor - PlanSerializer-->>OpenSearchExecutionEngine:cursor + OpenSearchExecutionEngine->>+PlanSerializer: convertToCursor + PlanSerializer-->>-OpenSearchExecutionEngine: cursor end critical - OpenSearchExecutionEngine->>Physical Plan Tree:getTotalHits - Physical Plan Tree-->>OpenSearchExecutionEngine:total hits + OpenSearchExecutionEngine->>+Physical Plan Tree: getTotalHits + Physical Plan Tree-->>-OpenSearchExecutionEngine: total hits end - OpenSearchExecutionEngine-->>ResponseListener:QueryResponse - ResponseListener->>ResponseFormatter:format with cursor + 
OpenSearchExecutionEngine-->>-QueryService: execution completed + QueryService-->>-QueryPlanFactory: execution completed + QueryPlanFactory-->>-SQLService: execution completed ``` ### Second page @@ -148,31 +148,31 @@ SQLService->>QueryPlanFactory:execute sequenceDiagram participant SQLService participant QueryPlanFactory - participant ResponseListener - participant ResponseFormatter participant QueryService participant OpenSearchExecutionEngine participant PlanSerializer participant Physical Plan Tree -SQLService->>QueryPlanFactory:execute - QueryPlanFactory->>QueryService:execute +SQLService->>+QueryPlanFactory: execute + QueryPlanFactory->>+QueryService: execute critical Deserialization - QueryService->>PlanSerializer:convertToPlan - PlanSerializer-->>QueryService:Physical plan tree + QueryService->>+PlanSerializer: convertToPlan + PlanSerializer-->>-QueryService: Physical plan tree end - QueryService->>OpenSearchExecutionEngine:execute + Note over QueryService: Planner, Optimizer and Implementor
are skipped + QueryService->>+OpenSearchExecutionEngine: execute Note over OpenSearchExecutionEngine: iterate result set critical Serialization - OpenSearchExecutionEngine->>PlanSerializer:convertToCursor - PlanSerializer-->>OpenSearchExecutionEngine:cursor + OpenSearchExecutionEngine->>+PlanSerializer: convertToCursor + PlanSerializer-->>-OpenSearchExecutionEngine: cursor end critical - OpenSearchExecutionEngine->>Physical Plan Tree:getTotalHits - Physical Plan Tree-->>OpenSearchExecutionEngine:total hits + OpenSearchExecutionEngine->>+Physical Plan Tree: getTotalHits + Physical Plan Tree-->>-OpenSearchExecutionEngine: total hits end - OpenSearchExecutionEngine-->>ResponseListener:QueryResponse - ResponseListener->>ResponseFormatter:format with cursor + OpenSearchExecutionEngine-->>-QueryService: execution completed + QueryService-->>-QueryPlanFactory: execution completed + QueryPlanFactory-->>-SQLService: execution completed ``` ### Legacy Engine Fallback ```mermaid @@ -183,14 +183,17 @@ sequenceDiagram participant QueryPlanFactory participant CanPaginateVisitor -RestSQLQueryAction->>SQLService:prepareRequest - SQLService->>QueryPlanFactory:execute +RestSQLQueryAction->>+SQLService: prepareRequest + SQLService->>+QueryPlanFactory: execute critical V2 support check - QueryPlanFactory->>CanPaginateVisitor:createContinuePaginatedPlan - CanPaginateVisitor-->>QueryPlanFactory:false - QueryPlanFactory-->>RestSQLQueryAction:UnsupportedCursorRequestException + QueryPlanFactory->>+CanPaginateVisitor: canConvertToCursor + CanPaginateVisitor-->>-QueryPlanFactory: false + QueryPlanFactory-->>-RestSQLQueryAction: UnsupportedCursorRequestException + deactivate SQLService end - RestSQLQueryAction->>Legacy Engine:accept + RestSQLQueryAction->>Legacy Engine: accept + Note over Legacy Engine: Processing in Legacy engine + Legacy Engine-->>RestSQLQueryAction:complete ``` ### Serialization @@ -203,24 +206,24 @@ sequenceDiagram participant OpenSearchScrollRequest participant 
ContinuePageRequest -PlanSerializer->>ProjectOperator:getPlanForSerialization - ProjectOperator-->>PlanSerializer:this -PlanSerializer->>ProjectOperator:serialize +PlanSerializer->>+ProjectOperator: getPlanForSerialization + ProjectOperator-->>-PlanSerializer: this +PlanSerializer->>+ProjectOperator: serialize Note over ProjectOperator: dump private fields - ProjectOperator->>ResourceMonitorPlan:getPlanForSerialization - ResourceMonitorPlan-->>ProjectOperator:delegate - ProjectOperator->>OpenSearchPagedIndexScan:serialize - Note over OpenSearchPagedIndexScan: dump private fields + ProjectOperator->>+ResourceMonitorPlan: getPlanForSerialization + ResourceMonitorPlan-->>-ProjectOperator: delegate + Note over ResourceMonitorPlan: ResourceMonitorPlan
is not serialized + ProjectOperator->>+OpenSearchPagedIndexScan: serialize alt First page - OpenSearchPagedIndexScan->>OpenSearchScrollRequest:toCursor - OpenSearchScrollRequest-->>OpenSearchPagedIndexScan:scroll ID + OpenSearchPagedIndexScan->>+OpenSearchScrollRequest: toCursor + OpenSearchScrollRequest-->>-OpenSearchPagedIndexScan: scroll ID else Subsequent page - OpenSearchPagedIndexScan->>ContinuePageRequest:toCursor - ContinuePageRequest-->>OpenSearchPagedIndexScan:scroll ID + OpenSearchPagedIndexScan->>+ContinuePageRequest: toCursor + ContinuePageRequest-->>-OpenSearchPagedIndexScan: scroll ID end - Note over ResourceMonitorPlan: ResourceMonitorPlan
is not serialized - OpenSearchPagedIndexScan-->>ProjectOperator:serialized - ProjectOperator-->>PlanSerializer:serialized + Note over OpenSearchPagedIndexScan: dump private fields + OpenSearchPagedIndexScan-->>-ProjectOperator: serialized + ProjectOperator-->>-PlanSerializer: serialized Note over PlanSerializer: Zip to reduce size ``` @@ -234,18 +237,21 @@ sequenceDiagram participant ContinuePageRequest Note over PlanSerializer: Unzip -PlanSerializer->>Deserialization Stream:deserialize - Deserialization Stream->>ProjectOperator:create new +PlanSerializer->>+Deserialization Stream: deserialize + Deserialization Stream->>+ProjectOperator: create new Note over ProjectOperator: load private fields - ProjectOperator-->>Deserialization Stream:deserialize input - Deserialization Stream->>OpenSearchPagedIndexScan:create new - OpenSearchPagedIndexScan-->>Deserialization Stream:resolve engine - Deserialization Stream->>OpenSearchPagedIndexScan:OpenSearchStorageEngine + ProjectOperator-->>Deserialization Stream: deserialize input + activate Deserialization Stream + Deserialization Stream->>+OpenSearchPagedIndexScan: create new + deactivate Deserialization Stream + OpenSearchPagedIndexScan-->>+Deserialization Stream: resolve engine + Deserialization Stream->>-OpenSearchPagedIndexScan: OpenSearchStorageEngine Note over OpenSearchPagedIndexScan: load private fields - OpenSearchPagedIndexScan->>ContinuePageRequest:create new - ContinuePageRequest-->>OpenSearchPagedIndexScan:created - OpenSearchPagedIndexScan-->>ProjectOperator:deserialized - ProjectOperator-->>PlanSerializer:deserialized + OpenSearchPagedIndexScan->>+ContinuePageRequest: create new + ContinuePageRequest-->>-OpenSearchPagedIndexScan: created + OpenSearchPagedIndexScan-->>-ProjectOperator: deserialized + ProjectOperator-->>-PlanSerializer: deserialized + deactivate Deserialization Stream ``` ### Total Hits @@ -269,13 +275,13 @@ sequenceDiagram participant ResourceMonitorPlan participant OpenSearchPagedIndexScan 
-OpenSearchExecutionEngine->>ProjectOperator:getTotalHits +OpenSearchExecutionEngine->>+ProjectOperator: getTotalHits Note over ProjectOperator: default implementation - ProjectOperator->>ResourceMonitorPlan:getTotalHits + ProjectOperator->>+ResourceMonitorPlan: getTotalHits Note over ResourceMonitorPlan: call to delegate - ResourceMonitorPlan->>OpenSearchPagedIndexScan:getTotalHits + ResourceMonitorPlan->>+OpenSearchPagedIndexScan: getTotalHits Note over OpenSearchPagedIndexScan: use stored value from the search response - OpenSearchPagedIndexScan-->>ResourceMonitorPlan:value - ResourceMonitorPlan-->>ProjectOperator:value - ProjectOperator-->>OpenSearchExecutionEngine:value + OpenSearchPagedIndexScan-->>-ResourceMonitorPlan: value + ResourceMonitorPlan-->>-ProjectOperator: value + ProjectOperator-->>-OpenSearchExecutionEngine: value ``` From f9ae48410b67087b9eadee2c1899ac1c17922f26 Mon Sep 17 00:00:00 2001 From: MaxKsyunz Date: Mon, 24 Apr 2023 18:10:49 -0700 Subject: [PATCH 17/17] More fixes for merge from upstream/main. Signed-off-by: MaxKsyunz --- .../sql/opensearch/executor/protector/ResourceMonitorPlan.java | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/ResourceMonitorPlan.java b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/ResourceMonitorPlan.java index 5f356cf2741..0ec4d743b31 100644 --- a/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/ResourceMonitorPlan.java +++ b/opensearch/src/main/java/org/opensearch/sql/opensearch/executor/protector/ResourceMonitorPlan.java @@ -23,8 +23,9 @@ * A PhysicalPlan which will run the delegate plan in resource protection manner. 
*/ @ToString +@RequiredArgsConstructor @EqualsAndHashCode(callSuper = false) -public class ResourceMonitorPlan extends PhysicalPlan { +public class ResourceMonitorPlan extends PhysicalPlan implements SerializablePlan { /** * How many method calls to delegate's next() to perform resource check once.