
Conversation

@asifabashar
Contributor

@asifabashar asifabashar commented Jan 27, 2026

Description

Problem Statement
Currently there is no fieldformat command in the OpenSearch SQL plugin, although the command exists in Splunk SPL.

Current State
The Roadmap [#4287 ] provides a list of commands that are not yet implemented.

PPL command fieldformat:

The fieldformat command uses an eval expression to set the format of a field value. Because commands that come later in the search pipeline are affected by the format change, the fieldformat command should be used late in the PPL pipeline.

Syntax

fieldformat <field>=<eval-expression>[,<field>=<eval-expression>]
Required arguments: <field>
Description: The name of a new or existing field; wildcards are not allowed. The output will contain the field with the value of the eval expression.

<eval-expression>: Syntax: <string>
Description: A combination of values, variables, operators, and functions that represents the value of your output field. Only one eval expression can be specified with the fieldformat command. To change multiple formats, use multiple fieldformat commands.

Example:
Original data:

os>source=accounts
| fields account_number, firstname, balance

fetched rows / total rows = 4/4

| account_number | firstname | balance |
|----------------|-----------|---------|
| 1              | Amber     | 39225   |
| 6              | Hattie    | 5686    |
| 13             | Nanette   | 32838   |

Formatting changes with fieldformat command:

os>source=accounts
| fieldformat balance = tostring(balance, "commas")
| fields account_number, firstname, balance

fetched rows / total rows = 4/4

| account_number | firstname | balance |
|----------------|-----------|---------|
| 1              | Amber     | 39,225  |
| 6              | Hattie    | 5,686   |
| 13             | Nanette   | 32,838  |

Concatenation Operation with dot:
To concatenate a string, specify the prefix in quotes, followed by a period character as the concatenation operator, followed by a function that returns a string. The following example displays the currency values with commas and a currency symbol.
...| fieldformat balance="$".tostring(balance,"commas")
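For readers unfamiliar with the Splunk-style tostring(<num>, "commas") formatter, here is a rough Java illustration of the value the example above is expected to produce; the locale-based grouping is an assumption about the intended behavior, not code from this PR.

```java
import java.text.NumberFormat;
import java.util.Locale;

// Hypothetical illustration: what "$".tostring(balance, "commas") should emit
// for balance = 39225; grouping/locale details are assumed, not plugin code.
public class FieldFormatConcatExample {
  public static void main(String[] args) {
    long balance = 39225;
    String formatted = "$" + NumberFormat.getIntegerInstance(Locale.US).format(balance);
    System.out.println(formatted); // prints: $39,225
  }
}
```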

Approach
Implement the plan in the existing visitLet in CalciteRelNodeVisitor and store the parameters in the Eval class. The command will simply be an alias of the eval command.
The only exception to eval is the additional behavior where a string in quotes followed by a period character acts as a concatenation operator, with the part after the dot expected to be of string type. To accommodate that, we need to enhance the Let class to carry a prefix and suffix.
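A minimal sketch of this approach, assuming the concatPrefix/concatSuffix names from the walkthrough below and standard Calcite builder calls; it is illustrative rather than the actual plugin code.

```java
import org.apache.calcite.rex.RexBuilder;
import org.apache.calcite.rex.RexNode;
import org.apache.calcite.sql.fun.SqlStdOperatorTable;

// Sketch only: a stand-in for the enhanced Let node plus the CONCAT wrapping that
// visitLet would apply; field and method names are assumptions, not the final code.
final class FieldFormatLetSketch {

  static final class LetSketch {
    final RexNode expression;   // resolved eval expression, e.g. tostring(balance, "commas")
    final String concatPrefix;  // e.g. "$" from fieldformat balance="$".tostring(...)
    final String concatSuffix;  // e.g. " USD" from ...tostring(...)." USD"

    LetSketch(RexNode expression, String concatPrefix, String concatSuffix) {
      this.expression = expression;
      this.concatPrefix = concatPrefix;
      this.concatSuffix = concatSuffix;
    }
  }

  static RexNode wrapWithConcat(RexBuilder rexBuilder, LetSketch let) {
    RexNode expr = let.expression;
    if (let.concatPrefix != null) {
      // "$" . tostring(...)  ->  CONCAT('$', expr)
      expr = rexBuilder.makeCall(
          SqlStdOperatorTable.CONCAT, rexBuilder.makeLiteral(let.concatPrefix), expr);
    }
    if (let.concatSuffix != null) {
      // tostring(...) . " USD"  ->  CONCAT(expr, ' USD')
      expr = rexBuilder.makeCall(
          SqlStdOperatorTable.CONCAT, expr, rexBuilder.makeLiteral(let.concatSuffix));
    }
    return expr;
  }
}
```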

Functions that can be used are those that work with the eval command:

Date time functions
Condition functions
Mathematical functions
JSON functions
IP address functions
String functions
Statistical functions
System functions

https://github.com/opensearch-project/sql/tree/main/docs/user/ppl/functions.

Related Issues

Resolves #5021

Check List

  • New functionality includes testing.
  • New functionality has been documented.
  • New functionality has javadoc added.
  • New functionality has a user manual doc added.
  • New PPL command checklist all confirmed.
  • API changes companion pull request created.
  • Commits are signed per the DCO using --signoff or -s.
  • Public documentation issue/PR created.

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
For more information on following Developer Certificate of Origin and signing off your commits, please check here.

@coderabbitai
Contributor

coderabbitai bot commented Jan 27, 2026

📝 Walkthrough

Summary by CodeRabbit

  • New Features

    • Added a new fieldformat command to format field values with optional prefix/suffix string concatenation.
  • Documentation

    • Added comprehensive fieldformat docs with syntax, parameters, and examples.
  • Tests

    • Added extensive unit and integration tests and expected outputs to validate fieldformat parsing, execution, explain plans, and cross-cluster behavior.

✏️ Tip: You can customize this high-level summary in your review settings.

Walkthrough

Adds a new PPL fieldformat command (alias to eval) with optional prefix/suffix string concatenation, updates parser/AST visitors, extends Let to hold concat literals, applies CONCAT in Calcite visitor, adds docs and extensive unit/integration tests. FieldResolutionVisitor contains duplicate overrides for the new hook.

Changes

  • Parser & Grammar (ppl/src/main/antlr/OpenSearchPPLLexer.g4, ppl/src/main/antlr/OpenSearchPPLParser.g4): Add FIELDFORMAT token, fieldformatCommand, fieldFormatEvalClause, and ffLogicalExpression rules to support fieldformat syntax and dot-operator concatenation.
  • PPL AST Builder (ppl/src/main/java/.../AstBuilder.java, ppl/src/main/java/.../AstExpressionBuilder.java): Add visitFieldformatCommand and visitFieldFormatEvalClause to construct Eval/Let nodes with optional concatPrefix/concatSuffix literals.
  • Core AST Visitors (core/src/main/java/org/opensearch/sql/ast/AbstractNodeVisitor.java, core/src/main/java/org/opensearch/sql/ast/analysis/FieldResolutionVisitor.java, core/src/main/java/org/opensearch/sql/analysis/Analyzer.java): Introduce visitFieldFormat hooks across AST/analysis layers. Note: FieldResolutionVisitor now contains duplicate visitFieldFormat overrides, and Analyzer.visitFieldFormat currently throws a Calcite-only exception.
  • AST Expression Model (core/src/main/java/org/opensearch/sql/ast/expression/Let.java): Extend Let with concatPrefix and concatSuffix fields and a new constructor to carry optional concatenation literals.
  • Calcite Expression Handling (core/src/main/java/org/opensearch/sql/calcite/CalciteRexNodeVisitor.java): In visitLet, wrap the Let expression with CONCAT operations when concatPrefix/concatSuffix are present.
  • Documentation (docs/category.json, docs/user/ppl/cmd/fieldformat.md): Add fieldformat docs with syntax, examples, and concat behavior, and register the doc in category.json.
  • PPL Unit Tests (ppl/src/test/java/.../CalcitePPLFieldFormatTest.java): New Calcite PPL unit test class with many tests covering fieldformat plans, results, concat behavior, and error cases.
  • Integration Tests & Resources (integ-test/src/test/java/.../CalciteFieldFormatCommandIT.java, .../CalciteExplainIT.java, .../CalciteNoPushdownIT.java, .../NewAddedCommandsIT.java, .../CrossClusterSearchIT.java, integ-test/src/test/resources/.../explain_field_format.yaml): Add a new integration test class, update test suites, add an explain test with expected YAML, add cross-cluster and PPL-level tests, and include a new resource for the expected explain output.

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant Parser as "PPL Parser"
    participant AstBuilder as "AST Builder"
    participant ExprBuilder as "Expr Builder"
    participant Analyzer
    participant CalciteVisitor as "Calcite Rex Visitor"
    participant Executor

    User->>Parser: submit fieldformat query
    Parser->>Parser: recognize FIELDFORMAT token
    Parser->>AstBuilder: visitFieldformatCommand(ctx)
    AstBuilder->>ExprBuilder: visitFieldFormatEvalClause(ctx)
    ExprBuilder->>AstBuilder: return Let nodes (with concatPrefix/concatSuffix)
    AstBuilder->>Analyzer: return Eval UnresolvedPlan
    Analyzer->>CalciteVisitor: analyze Eval/Let nodes
    CalciteVisitor->>CalciteVisitor: visitLet -> apply CONCAT(prefix, expr, suffix)
    CalciteVisitor->>Executor: produce executable plan
    Executor->>User: return formatted results

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Suggested labels

PPL, feature, calcite, enhancement

Suggested reviewers

  • penghuo
  • LantaoJin
  • ykmr1224
  • yuancu
  • anirudha
  • kavithacm
  • ps48
  • joshuali925
  • derek-ho
  • GumpacG
🚥 Pre-merge checks | ✅ 4 | ❌ 1
❌ Failed checks (1 warning)
  • Docstring Coverage (⚠️ Warning): Docstring coverage is 19.48%, which is insufficient; the required threshold is 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.

✅ Passed checks (4 passed)

  • Title check (✅ Passed): The title '[Feature] fieldformat command implementation' clearly and concisely describes the main change—adding a new fieldformat PPL command—which aligns with the substantial feature implementation visible in the code changes.
  • Description check (✅ Passed): The description is detailed and directly related to the changeset, explaining the problem statement, syntax, approach, examples, and related issue #5021, all of which correspond to the implementation changes in the PR.
  • Linked Issues check (✅ Passed): The code changes fully implement the requirements from issue #5021: the fieldformat command is added as an eval alias, the dot concatenation operator is implemented via Let prefix/suffix fields, parser grammar rules are extended, visitor methods are added, integration tests validate the functionality, and documentation is provided.
  • Out of Scope Changes check (✅ Passed): All changes directly support fieldformat implementation: lexer/parser enhancements, AST and visitor updates, Let class modifications to store prefix/suffix, Calcite integration, comprehensive test coverage, and documentation. No extraneous modifications are present.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.

✨ Finishing touches
  • 📝 Generate docstrings

Comment @coderabbitai help to get the list of available commands and usage tips.

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 7

🤖 Fix all issues with AI agents
In `@core/src/main/java/org/opensearch/sql/analysis/Analyzer.java`:
- Around line 528-532: The JavaDoc for visitFieldFormat(Eval node,
AnalysisContext context) is misleading because it says "Build {`@link`
LogicalEval}" while the method always throws
getOnlyForCalciteException("fieldformat"); update the JavaDoc to reflect that
this is a Calcite-only command and that the method intentionally throws an
exception (e.g., "Throws UnsupportedOperationException via
getOnlyForCalciteException for Calcite-only fieldformat"), referencing the
method name visitFieldFormat and the thrown helper getOnlyForCalciteException to
make the behavior clear.

In `@docs/user/ppl/cmd/fieldformat.md`:
- Around line 12-15: The docs for the fieldformat command advertise
comma-separated multiple assignments but this contradicts the "one expression
per command" rule; update the fieldformat documentation to show the
single-expression syntax (e.g. fieldformat
<field>=[(prefix).]<expression>[.(suffix)]) and remove the comma-separated
example, or alternatively update the parser implementation that handles the
fieldformat command to accept and parse comma-separated multiple assignments;
locate references to the fieldformat syntax in docs/user/ppl/cmd/fieldformat.md
and the parser/handler that recognizes the fieldformat command and make the
change consistently (either remove comma support in docs and tests, or add
comma-splitting logic to the fieldformat parsing routine).

In
`@integ-test/src/test/java/org/opensearch/sql/calcite/remote/CalciteExplainIT.java`:
- Around line 2489-2497: The test testFieldFormatExplain currently calls
explainQueryYaml with an incomplete "fieldformat" clause; update the
explainQueryYaml invocation to use a valid fieldformat expression (e.g., include
a target field and format expression) so the explain plan actually covers the
feature, and update the expected YAML loaded by
loadExpectedPlan("explain_field_format.yaml") and any comment text to match the
new valid query; verify assertYamlEqualsIgnoreId still compares the updated
expected plan against explainQueryYaml.

In `@integ-test/src/test/java/org/opensearch/sql/ppl/NewAddedCommandsIT.java`:
- Around line 205-216: Extend the testFieldFormatCommand (in class
NewAddedCommandsIT) to include assertions that validate output and failure
modes: after executing the query via executeQuery/parse JSONObject (same pattern
used now), assert that for multi-document results each document contains the new
field "double_balance" with the expected values (balance*2) for a variety of
records; add a case where the source field is missing/null and assert the
"double_balance" is absent or null; add a boundary-value case (zero, large long,
negative) by either querying specific seeded docs or inserting temporary docs
and asserting computed results; and add an invalid-expression case (e.g.
"fieldformat x = not_a_field ++") that expects a ResponseException and assert
the error message indicates a parse/evaluation error. Use existing helpers
(executeQuery, TestUtils.getResponseBody, verifyQuery) and create small
additional assertions (JSONObject accesses and asserts) rather than only
checking that the command runs.

In
`@integ-test/src/test/java/org/opensearch/sql/security/CrossClusterSearchIT.java`:
- Around line 275-288: Update the misleading JavaDoc comment above the
testCrossClusterFieldFormat() method in class CrossClusterSearchIT to accurately
describe the test's purpose (it verifies the fieldformat command across a remote
cluster), e.g., change "Test query_string without fields parameter on remote
cluster" to something like "Test fieldformat command for cross-cluster search
(formats balance on remote index)"; ensure the comment sits immediately above
the `@Test` method testCrossClusterFieldFormat and keep the existing method
signature and assertions unchanged.

In
`@ppl/src/test/java/org/opensearch/sql/ppl/calcite/CalcitePPLFieldFormatTest.java`:
- Around line 303-305: Remove the leftover debug marker comment "// #" found
near the RelNode construction in CalcitePPLFieldFormatTest; specifically delete
the stray comment between the string concatenation and the call to
getRelNode(ppl) so the test code reads cleanly (keep the surrounding string and
the RelNode root = getRelNode(ppl) statement unchanged).
- Around line 268-270: In CalcitePPLFieldFormatTest remove the stray debug
marker by deleting the lone comment "// #" that appears just above the call to
RelNode root = getRelNode(ppl); (in the test method in class
CalcitePPLFieldFormatTest) so the test source is clean and contains only the
intended comment/strings and the RelNode root = getRelNode(ppl); line.
🧹 Nitpick comments (3)
core/src/main/java/org/opensearch/sql/ast/expression/Let.java (1)

26-49: Consider extracting duplicated validation logic.

The metadata field validation is duplicated across both constructors. Consider using a private helper method or constructor chaining to reduce duplication.

Suggested refactor using constructor chaining
   public Let(Field var, UnresolvedExpression expression) {
-    String varName = var.getField().toString();
-    if (OpenSearchConstants.METADATAFIELD_TYPE_MAP.containsKey(varName)) {
-      throw new IllegalArgumentException(
-          String.format("Cannot use metadata field [%s] as the eval field.", varName));
-    }
-    this.var = var;
-    this.expression = expression;
-    this.concatPrefix = null;
-    this.concatSuffix = null;
+    this(var, expression, null, null);
   }

   public Let(
       Field var, UnresolvedExpression expression, Literal concatPrefix, Literal concatSuffix) {
+    validateVar(var);
+    this.var = var;
+    this.expression = expression;
+    this.concatPrefix = concatPrefix;
+    this.concatSuffix = concatSuffix;
+  }
+
+  private static void validateVar(Field var) {
     String varName = var.getField().toString();
     if (OpenSearchConstants.METADATAFIELD_TYPE_MAP.containsKey(varName)) {
       throw new IllegalArgumentException(
           String.format("Cannot use metadata field [%s] as the eval field.", varName));
     }
-    this.var = var;
-    this.expression = expression;
-    this.concatPrefix = concatPrefix;
-    this.concatSuffix = concatSuffix;
   }
integ-test/src/test/java/org/opensearch/sql/calcite/CalciteNoPushdownIT.java (1)

110-112: Consider maintaining alphabetical ordering in the test suite.

The test class list appears to be alphabetically ordered, but CalciteFieldFormatCommandIT.class is added at the end rather than in its proper alphabetical position (after CalciteFieldsCommandIT.class around line 37-38). This doesn't affect functionality but impacts maintainability.

♻️ Suggested reordering

Move CalciteFieldFormatCommandIT.class to its alphabetical position:

   CalciteFieldsCommandIT.class,
+  CalciteFieldFormatCommandIT.class,
   CalciteFillNullCommandIT.class,

And keep CalcitePPLTpchIT.class at the end if it's intentionally last:

-  CalcitePPLTpchIT.class,
-  CalciteFieldFormatCommandIT.class
+  CalcitePPLTpchIT.class
integ-test/src/test/java/org/opensearch/sql/calcite/remote/CalciteFieldFormatCommandIT.java (1)

16-37: Consider adding test resource cleanup.

The test_eval index is created in init() but there's no cleanup after tests. This could leave test data that affects subsequent test runs or pollutes the test environment.

🧹 Suggested cleanup

Add an @After or @AfterClass method to clean up the test index:

`@After`
public void cleanup() throws IOException {
  try {
    Request deleteRequest = new Request("DELETE", "/test_eval");
    client().performRequest(deleteRequest);
  } catch (Exception e) {
    // Index may not exist, ignore
  }
}

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 17

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (2)
docs/user/ppl/index.md (1)

38-84: Add a fieldformat entry to the command index.

The PR introduces the fieldformat command, but it’s missing from the command list table. Please add it with the correct version/status so users can discover it from the index.

core/src/main/java/org/opensearch/sql/ast/analysis/FieldResolutionResult.java (1)

61-86: Short‑circuit wildcard lists containing *.

If patterns contains "*", createOrWildcard currently builds an OrWildcard, which makes hasPartialWildcards() return true and can misclassify a full wildcard as partial. Consider returning ANY_WILDCARD when "*" is present.

🛠️ Suggested fix
   private static Wildcard createOrWildcard(Collection<String> patterns) {
     if (patterns == null || patterns.isEmpty()) {
       return NULL_WILDCARD;
     }
+    if (patterns.stream().anyMatch("*"::equals)) {
+      return ANY_WILDCARD;
+    }
     if (patterns.size() == 1) {
       return getWildcard(patterns.iterator().next());
     }
🤖 Fix all issues with AI agents
In
`@core/src/main/java/org/opensearch/sql/ast/analysis/FieldResolutionVisitor.java`:
- Around line 628-632: visitFieldFormat in FieldResolutionVisitor currently only
calls visitChildren, causing computed output fields to be treated as required
inputs and missing the expression's actual input fields; update
visitFieldFormat(Eval node, FieldResolutionContext context) to delegate to the
same logic as visitEval (i.e., call or reuse visitEval(node, context)) so that
field-requirement and computed-output handling mirror Eval's behavior and
downstream field resolution works correctly.
- Around line 580-584: Add unit tests to FieldResolutionVisitorTest that
exercise FieldResolutionVisitor.visitTranspose(Transpose,
FieldResolutionContext) and visitFieldFormat(FieldFormat,
FieldResolutionContext) to assert correct field resolution behavior: create
Transpose and FieldFormat AST nodes, run them through FieldResolutionVisitor
with a constructed FieldResolutionContext, and verify which fields are accessed
and which are propagated in the context (similar to the existing
testMultisearch()); target the FieldResolutionVisitor and FieldResolutionContext
classes and assert expected context state before and after calling
visitTranspose and visitFieldFormat.

In `@core/src/main/java/org/opensearch/sql/ast/tree/FillNull.java`:
- Around line 66-68: Add JavaDoc for the new public method isAgainstAllFields in
class FillNull explaining what it checks and include an `@return` describing the
boolean semantics; then add unit tests covering both true and false cases (e.g.,
when replacementForAll is non-empty and getReplacementPairs() is empty, and vice
versa) to exercise the predicate. Locate the method isAgainstAllFields and
update its JavaDoc block, and add tests in the corresponding FillNullTest (or
create one) that manipulate replacementForAll and getReplacementPairs() to
assert expected outcomes.

In `@core/src/main/java/org/opensearch/sql/ast/tree/Transpose.java`:
- Around line 34-39: In Transpose, the NumberFormatException in the tempMaxRows
parsing silently swallows errors; update the catch in the block that parses
arguments.get("number").getValue() so it logs a warning with the invalid input
and exception details (include the raw value and e) and optionally falls back to
the default, or alternatively rethrow a meaningful
IllegalArgumentException—ensure the log or exception references
tempMaxRows/arguments.get("number") so it’s clear which value failed.

In `@core/src/main/java/org/opensearch/sql/calcite/CalciteRelNodeVisitor.java`:
- Around line 1509-1513: The join-with-criteria path always calls
JoinAndLookupUtils.mergeDynamicFieldsAsNeeded(context, OverwriteMode.LEFT_WINS)
which ignores the requested overwrite behavior; update this call to honor
node.isOverwrite() by passing the appropriate OverwriteMode (e.g.,
OverwriteMode.RIGHT_WINS when node.isOverwrite() is true, otherwise
OverwriteMode.LEFT_WINS) so dynamic fields are merged according to the node's
overwrite flag; locate the call in CalciteRelNodeVisitor (the branch handling
join-with-criteria) and replace the hardcoded OverwriteMode.LEFT_WINS with a
conditional that selects the correct OverwriteMode based on node.isOverwrite().
- Around line 715-719: fieldNames currently only filters metadata fields, so the
internal dynamic-fields map (DYNAMIC_FIELDS_MAP) can slip through and be
transposed as a user column; update the stream filter where fieldNames is built
(the expression using
context.relBuilder.peek().getRowType().getFieldNames().stream()) to also exclude
the internal DYNAMIC_FIELDS_MAP identifier (e.g., add &&
!DYNAMIC_FIELDS_MAP.equals(fieldName) or a small helper like
isInternalMap(fieldName)), keeping the existing isMetadataField(fieldName) check
intact so the map column is never included in transpose inputs.

In `@core/src/main/java/org/opensearch/sql/calcite/DynamicFieldsHelper.java`:
- Around line 287-298: getFieldsAsMap currently only filters metadata via
excludeMetaFields, so the dynamic `_MAP` field can still be included; change the
filtration to use excludeSpecialFields (same filter used by getRemainingFields)
so dynamic map fields are excluded defensively. Specifically, in getFieldsAsMap
replace the call to excludeMetaFields(existingFields) (and subsequent
keys.removeAll(excluded)) with a filtered keys list produced by
excludeSpecialFields(existingFields) and then removeAll(excluded) as before,
then proceed to sort and build keysArray/valuesArray as existing code does.

In `@core/src/main/java/org/opensearch/sql/calcite/ExtendedRexBuilder.java`:
- Around line 51-60: Add full Javadoc for the new public helpers in class
ExtendedRexBuilder: update the itemCall(RexNode node, String key) and
makeCall(BuiltinFunctionName functionName, RexNode... args) method comments to
include `@param` for each parameter, a clear `@return` describing the produced
RexNode, and an `@throws` tag documenting any exceptions that can be propagated
from PPLFuncImpTable.INSTANCE.resolve (e.g., RuntimeException or a more specific
exception if known). Ensure the descriptions reference the semantics (map/array
access for itemCall and creating a function call for makeCall) and keep tags
consistent with project Javadoc style.
- Around line 51-60: Add missing JavaDoc tags to the public methods
itemCall(RexNode node, String key) and makeCall(BuiltinFunctionName
functionName, RexNode... args): include `@param` for each parameter, `@return`
describing the returned RexNode, and `@throws` for relevant runtime exceptions
(e.g., NPE for null inputs, IllegalArgumentException for invalid functionName)
and reference the PPLFuncImpTable resolve behavior and makeLiteral usage in the
description. Then add unit tests for ExtendedRexBuilder that (1) verify normal
operation of itemCall and makeCall (valid node/key and functionName), (2) assert
expected exceptions when node, key, or functionName is null or invalid, and (3)
cover edge cases such as empty string keys and varargs length zero for makeCall;
use the same helper/mock RexNode and BuiltinFunctionName fixtures already used
elsewhere and assert calls to PPLFuncImpTable.INSTANCE.resolve produce the
expected results or exceptions.

In
`@core/src/main/java/org/opensearch/sql/calcite/utils/DynamicFieldsResultProcessor.java`:
- Around line 33-38: Update the public Javadoc for the method in
DynamicFieldsResultProcessor that "Expand dynamic fields into individual columns
in QueryResponse" to include an `@throws` tag: document that the method throws
NullPointerException (or the specific runtime exception thrown) when the input
response is null, and briefly describe the condition under which that exception
is raised so it complies with core Javadoc guidelines.

In `@docs/user/ppl/cmd/transpose.md`:
- Around line 17-67: Fix typos and normalize example headings/descriptions in
the transpose docs: correct "wihtout" to "without" and "Tranpose" to
"Transpose", remove the duplicated "Example 2" numbering (make them Example 1,
Example 2, Example 3), and update the descriptions to accurately state
parameters used (Example 1: transpose with no args defaults to 5; Example 2:
transpose 4 to limit to 4 rows; Example 3: transpose 4 with a specified
first-column name). Also ensure the prose consistently references the transpose
command (| transpose vs | transpose 4) and that example captions match the code
blocks' behavior.

In
`@integ-test/src/test/java/org/opensearch/sql/calcite/remote/CalciteTransposeCommandIT.java`:
- Around line 31-171: Tests like testTranspose, testTransposeLimit,
testTransposeLowerLimit, and testTransposeColumnName call executeQuery with
queries that use head without a deterministic order, causing flaky assertions;
update each query string passed to executeQuery to insert a stable sort (e.g., "
| sort account_number | head ...") before the head call so rows are ordered
deterministically (locate the executeQuery invocations in those test methods and
modify the String.format query to include "sort account_number" before "head").

In `@integ-test/src/test/java/org/opensearch/sql/ppl/NewAddedCommandsIT.java`:
- Around line 205-214: The testTransposeCommand currently only asserts the query
executes; update it to validate transpose semantics by calling
executeQuery(String.format("search source=%s | transpose", TEST_INDEX_BANK)) and
parsing the JSONObject result (used by verifyQuery) to assert expected
transposed rows/columns for known input documents; add additional test cases
within NewAddedCommandsIT: (1) a null/empty input case that runs transpose on an
empty result set and asserts empty output, (2) boundary cases (single-row input
and wide-row input) asserting correct shape and values, (3) error conditions
(malformed transpose arguments or non-existent fields) asserting
ResponseException and expected error messages, and (4) a multi-document case
asserting concatenated/transposed output across documents. Use existing helpers
verifyQuery and executeQuery to parse responses and factor common assertions
into a small helper method to compare expected matrix shape and values.

In
`@ppl/src/test/java/org/opensearch/sql/ppl/calcite/CalcitePPLTransposeTest.java`:
- Around line 12-16: Remove the large commented-out Spark SQL block from
CalcitePPLTransposeTest (it’s dead code) and add focused unit tests in this
class extending CalcitePPLAbstractTest to cover edge cases for transpose: 1)
test null and empty string for the column_name parameter (expect either default
behavior or specific error), 2) test limit boundary values: zero, a negative
value (expect error), and an extremely large limit (e.g., Integer.MAX_VALUE or a
value that triggers truncation/overflow behavior), and 3) test invalid parameter
combinations that should raise errors (e.g., incompatible aggregate +
column_name or negative row index); implement these as separate `@Test` methods
named clearly (e.g., testTransposeNullColumnName, testTransposeEmptyColumnName,
testTransposeLimitZero, testTransposeLimitNegative, testTransposeLimitMax,
testTransposeInvalidParams) using the existing assertion helpers in
CalcitePPLAbstractTest to assert expected results or exceptions.

In
`@ppl/src/test/java/org/opensearch/sql/ppl/parser/FieldResolutionVisitorTest.java`:
- Around line 311-353: Add unit tests that cover null/empty/bad-syntax/error
cases for fillnull, replace, and spath similar to existing
testFillnullWithoutFields: for fillnull add tests calling
visitor.analyze(parse(...)) with empty/null field lists and malformed variants
to assert IllegalArgumentException (or appropriate exception) and error
messages; for replace add a test method (e.g., testReplaceErrors) that calls
visitor.analyze(parse(...)) with missing IN clause, empty target list, and
invalid syntax and asserts thrown exceptions; for spath add a test method (e.g.,
testSpathErrors) that calls visitor.analyze(parse(...)) with missing input
parameter, empty input, and malformed spath syntax and asserts exceptions; keep
naming consistent with testFillnull/testReplaceCommand/testSpathCommand and
reuse assertThrows pattern used in testFillnullWithoutFields to validate error
text and exception types.
- Around line 362-430: The tests testAppendCol and testAppendpipe in
FieldResolutionVisitorTest are missing multi-document scenarios and there are no
boundary-condition tests for append operations; add new test cases (or extend
these methods) to cover multi-relation behavior similar to
testAppend/testMultisearch by invoking assertMultiRelationFields with an
additional relation (e.g., a "sub" source) for appendcol and appendpipe, and add
separate small tests that assert behavior for boundary conditions (null fields
and empty field sets) using assertMultiRelationFields and/or
assertSingleSpathFields to verify field resolution results and spath handling;
reference the existing test methods testAppend, testMultisearch,
testAppendWithSpathInMain, assertMultiRelationFields, and
assertSingleSpathFields to mirror expected assertions and structure.

In
`@ppl/src/test/java/org/opensearch/sql/ppl/utils/PPLQueryDataAnonymizerTest.java`:
- Around line 250-255: Extend PPLQueryDataAnonymizerTest (keeping
testTransposeCommand or adding new tests) to cover null, boundary and error
cases for the anonymize(String) behavior: add an assertion that anonymize(null)
returns null (assertNull), anonymize("") returns "" (assertEquals), tests for
boundary numeric counts like "transpose 0 ..." and a large count to ensure count
handling remains consistent with existing masking behavior, a case where
column_name is omitted (e.g., "source=t | transpose 5") to verify no NPE and
expected output, and a malformed transpose string (e.g., "source=t | transpose
abc column_name='c'") asserted with assertThrows or the expected anonymized
output per the anonymize contract; reference PPLQueryDataAnonymizerTest and the
anonymize(...) call when adding these assertions.
🧹 Nitpick comments (6)
.rules/REVIEW_GUIDELINES.md (1)

122-123: Capitalize "Markdown" as a proper noun.

Per static analysis, "markdown" should be capitalized when referring to the Markdown language.

📝 Suggested fix
-- Missing language identifiers in markdown code blocks (unless in documentation files)
+- Missing language identifiers in Markdown code blocks (unless in documentation files)
ppl/src/test/java/org/opensearch/sql/ppl/parser/FieldResolutionVisitorTest.java (1)

432-505: Strengthen helpers to fail on unexpected or multiple relation/SPATH results.
getSingleRelationResult/getSingleSpathResult return the first match and ignore extras; assertMultiRelationFields ignores unexpected relations. This can let regressions slip through. Consider asserting uniqueness and unexpected relations explicitly.

♻️ Suggested tightening
   private FieldResolutionResult getRelationResult(
       Map<UnresolvedPlan, FieldResolutionResult> results) {
-    for (UnresolvedPlan key : results.keySet()) {
-      if (key instanceof Relation) {
-        return results.get(key);
-      }
-    }
-    fail("Relation result not found");
-    return null;
+    FieldResolutionResult found = null;
+    for (Map.Entry<UnresolvedPlan, FieldResolutionResult> entry : results.entrySet()) {
+      if (entry.getKey() instanceof Relation) {
+        if (found != null) {
+          fail("Expected a single relation result but found multiple");
+        }
+        found = entry.getValue();
+      }
+    }
+    if (found == null) {
+      fail("Relation result not found");
+    }
+    return found;
   }
 
   private FieldResolutionResult getSpathResult(Map<UnresolvedPlan, FieldResolutionResult> results) {
-    for (UnresolvedPlan key : results.keySet()) {
-      if (key instanceof SPath) {
-        return results.get(key);
-      }
-    }
-    fail("Spath result not found");
-    return null;
+    FieldResolutionResult found = null;
+    for (Map.Entry<UnresolvedPlan, FieldResolutionResult> entry : results.entrySet()) {
+      if (entry.getKey() instanceof SPath) {
+        if (found != null) {
+          fail("Expected a single spath result but found multiple");
+        }
+        found = entry.getValue();
+      }
+    }
+    if (found == null) {
+      fail("Spath result not found");
+    }
+    return found;
   }
 
   private void assertMultiRelationFields(
       String query, Map<String, FieldResolutionResult> expectedResultsByTable) {
     UnresolvedPlan plan = parse(query);
     Map<UnresolvedPlan, FieldResolutionResult> results = visitor.analyze(plan);
 
     Set<String> foundTables = new HashSet<>();
     for (Map.Entry<UnresolvedPlan, FieldResolutionResult> entry : results.entrySet()) {
       if (!(entry.getKey() instanceof Relation)) {
         continue;
       }
       String tableName = ((Relation) entry.getKey()).getTableQualifiedName().toString();
       FieldResolutionResult expectedResult = expectedResultsByTable.get(tableName);
 
-      if (expectedResult != null) {
-        assertEquals(expectedResult.getRegularFields(), entry.getValue().getRegularFields());
-        assertEquals(expectedResult.getWildcard(), entry.getValue().getWildcard());
-        foundTables.add(tableName);
-      }
+      if (expectedResult == null) {
+        fail("Unexpected relation result for table: " + tableName);
+      }
+      assertEquals(expectedResult.getRegularFields(), entry.getValue().getRegularFields());
+      assertEquals(expectedResult.getWildcard(), entry.getValue().getWildcard());
+      foundTables.add(tableName);
     }
 
     assertEquals(expectedResultsByTable.size(), foundTables.size());
   }
docs/user/ppl/cmd/spath.md (1)

36-47: Rephrase repeated “Limitation” bullets; hyphenate auto‑extraction.
LanguageTool flagged repeated sentence starts; a small rewording improves readability and fixes the compound adjective.

✍️ Suggested wording tweak
-* **Limitation**: All extracted fields are returned as STRING type.
-* **Limitation**: Field order in the result could be inconsistent with query without `spath` command, and the behavior might change in the future version.
-* **Limitation**: Filter with subquery (`where <field> in/exists [...]`) is not supported with `spath` command.
-* **Limitation**: `fillnull` command requires to specify fields when used with `spath` command.
-* **Limitation**: Following commands cannot be used together with `spath` command: `lookup`.
+* **Limitation**: All extracted fields are returned as STRING type.
+* **Constraint**: Field order in the result could be inconsistent with query without `spath` command, and the behavior might change in the future version.
+* **Restriction**: Filter with subquery (`where <field> in/exists [...]`) is not supported with `spath` command.
+* **Compatibility**: `fillnull` command requires to specify fields when used with `spath` command.
+* **Unsupported**: Following commands cannot be used together with `spath` command: `lookup`.
 
-* **Internal Implementation**: The auto extraction feature uses an internal `_MAP` system column to store dynamic fields during query processing. This column is automatically expanded into individual columns in the final results and users don't need to reference it directly. For more information, see [System Columns](../general/identifiers.md#system-columns).
+* **Internal Implementation**: The auto-extraction feature uses an internal `_MAP` system column to store dynamic fields during query processing. This column is automatically expanded into individual columns in the final results and users don't need to reference it directly. For more information, see [System Columns](../general/identifiers.md#system-columns).
ppl/src/test/java/org/opensearch/sql/ppl/calcite/CalcitePPLTransposeTest.java (1)

120-143: Consider removing the commented-out SQL block.
It adds noise without test value.

integ-test/src/test/java/org/opensearch/sql/calcite/remote/CalcitePPLSpathTestBase.java (1)

55-57: Consider escaping backslashes in addition to double quotes.

The escape() method only escapes double quotes, but JSON strings containing backslashes (e.g., file paths, regex patterns) would also need escaping when embedded in a string field.

Suggested enhancement
   protected String escape(String json) {
-    return json.replace("\"", "\\\"");
+    return json.replace("\\", "\\\\").replace("\"", "\\\"");
   }
core/src/main/java/org/opensearch/sql/calcite/CalciteRelNodeVisitor.java (1)

704-772: Consider extracting visitTranspose steps into helpers.

The method is >50 lines and already organized into multiple steps (row_number, unpivot, trim, pivot, rename). Splitting into small private helpers would improve readability and maintainability.

As per coding guidelines, ...
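A rough outline of what that decomposition could look like; the helper names are hypothetical, and each body would keep the existing RelBuilder logic for its step rather than the empty stubs shown here.

```java
// Hypothetical decomposition of visitTranspose into the five steps named above;
// method names are illustrative only, and each helper would hold the current
// RelBuilder logic for that step.
final class TransposeStepsOutline {

  void visitTranspose(int maxRows, String columnName) {
    addRowNumber(maxRows);           // step 1: number and cap the input rows
    unpivotColumnsToRows();          // step 2: turn user columns into (name, value) pairs
    trimColumnNames();               // step 3: normalize/trim generated column names
    pivotRowsToColumns();            // step 4: pivot values back, one column per input row
    renameOutputColumns(columnName); // step 5: apply the user-supplied first-column name
  }

  private void addRowNumber(int maxRows) {}
  private void unpivotColumnsToRows() {}
  private void trimColumnNames() {}
  private void pivotRowsToColumns() {}
  private void renameOutputColumns(String columnName) {}
}
```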

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

🤖 Fix all issues with AI agents
In
`@integ-test/src/test/java/org/opensearch/sql/security/CrossClusterSearchIT.java`:
- Around line 290-324: There are two identical test methods named
testCrossClusterTranspose() causing a duplicate-symbol compile error; remove the
duplicate method (keep a single testCrossClusterTranspose) and update the
misleading comment to accurately describe the test (e.g., mention testing
transpose on a remote cluster using TEST_INDEX_BANK_REMOTE). Ensure the retained
method still calls executeQuery with the formatted source TEST_INDEX_BANK_REMOTE
and uses verifyDataRows/rows as before.

In `@ppl/src/main/antlr/OpenSearchPPLLexer.g4`:
- Around line 75-76: Remove the duplicate FIELDFORMAT token declaration in the
lexer: keep the new COLUMN_NAME token but delete the second FIELDFORMAT line so
only the original FIELDFORMAT token remains; ensure there is no other duplicate
token name in OpenSearchPPLLexer.g4 to prevent ANTLR warnings.

In `@ppl/src/main/antlr/OpenSearchPPLParser.g4`:
- Line 110: In the commandName parser rule there are two occurrences of the
token FIELDFORMAT; remove the duplicate occurrence so FIELDFORMAT appears only
once in commandName (keep the other entry intact), updating the surrounding
alternation/commas if needed to keep the grammar valid; refer to the commandName
rule and the FIELDFORMAT token when making the change.

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Fix all issues with AI agents
In `@docs/user/ppl/cmd/fieldformat.md`:
- Around line 117-127: The markdown code block in
docs/user/ppl/cmd/fieldformat.md is missing its closing triple backticks; please
add a closing ``` after the example table (after the fetched rows / table block
shown) so the fenced code block is properly terminated and renders correctly.

Signed-off-by: Asif Bashar <[email protected]>
Signed-off-by: Asif Bashar <[email protected]>
…oject#4786 (opensearch-project#5011)

* transpose command implementation

Signed-off-by: Asif Bashar <[email protected]>

* transpose rows to columns

Signed-off-by: Asif Bashar <[email protected]>

* added argument type missing map and hashmap

Signed-off-by: Asif Bashar <[email protected]>

* added tests

Signed-off-by: Asif Bashar <[email protected]>

* added tests

Signed-off-by: Asif Bashar <[email protected]>

* added tests

Signed-off-by: Asif Bashar <[email protected]>

* added tests

Signed-off-by: Asif Bashar <[email protected]>

* added tests

Signed-off-by: Asif Bashar <[email protected]>

* added tests

Signed-off-by: Asif Bashar <[email protected]>

* added tests

Signed-off-by: Asif Bashar <[email protected]>

* added tests

Signed-off-by: Asif Bashar <[email protected]>

* added more validations

Signed-off-by: Asif Bashar <[email protected]>

* added validation

Signed-off-by: Asif Bashar <[email protected]>

* index.md formatting fix

Signed-off-by: Asif Bashar <[email protected]>

* doc format

Signed-off-by: Asif Bashar <[email protected]>

* coderabbit review fixes

Signed-off-by: Asif Bashar <[email protected]>

* added recommended changes

Signed-off-by: Asif Bashar <[email protected]>

* added recommended changes

Signed-off-by: Asif Bashar <[email protected]>

* for cross cluster failure debugging

Signed-off-by: Asif Bashar <[email protected]>

* for cross cluster failure debugging

Signed-off-by: Asif Bashar <[email protected]>

* for cross cluster failure debugging

Signed-off-by: Asif Bashar <[email protected]>

* trim columnName

Signed-off-by: Asif Bashar <[email protected]>

* per review moved to class varialble.

Signed-off-by: Asif Bashar <[email protected]>

* per review moved to class varialble.

Signed-off-by: Asif Bashar <[email protected]>

* added field resolution

Signed-off-by: Asif Bashar <[email protected]>

* fix by removing metadata field

Signed-off-by: Asif Bashar <[email protected]>

* fixed explain test after removing of metadata fields in transpose result

Signed-off-by: Asif Bashar <[email protected]>

---------

Signed-off-by: Asif Bashar <[email protected]>
@asifabashar asifabashar force-pushed the feature-fieldformat branch 2 times, most recently from d04c315 to 78d8956 on January 28, 2026 18:18
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
core/src/main/java/org/opensearch/sql/ast/expression/Let.java (1)

23-49: Add JavaDoc for the new public constructor.

The new Let(Field, UnresolvedExpression, Literal, Literal) constructor is public but lacks JavaDoc. Please add @param (and @throws if applicable) to comply with core module standards. As per coding guidelines, Public methods MUST have JavaDoc with @param, @return, and @throws.

🤖 Fix all issues with AI agents
In
`@integ-test/src/test/java/org/opensearch/sql/calcite/remote/CalciteFieldFormatCommandIT.java`:
- Line 12: The test class CalciteFieldFormatCommandIT currently imports
org.junit.jupiter.api.Test but extends the JUnit4-style PPLIntegTestCase;
replace the JUnit Jupiter import with the JUnit4 annotation import
(org.junit.Test) so the test runner and `@Rule` handling in PPLIntegTestCase are
honored—update the import statement in CalciteFieldFormatCommandIT to reference
org.junit.Test and ensure any `@Test` annotations use that type.
🧹 Nitpick comments (2)
ppl/src/test/java/org/opensearch/sql/ppl/calcite/CalcitePPLFieldFormatTest.java (1)

192-213: Consider adding explicit NULL input handling test.

The test testFieldFormatWithToNumber shows total=null in expected results when COMM is null, which implicitly tests NULL propagation. However, per coding guidelines, NULL input tests should be explicitly included for new functions. Consider adding a dedicated test that explicitly validates fieldformat behavior when source fields contain NULL values.

core/src/main/java/org/opensearch/sql/calcite/CalciteRexNodeVisitor.java (1)

324-334: Minor: Remove extraneous blank lines inside conditionals.

The blank lines at lines 325 and 336 inside the if blocks are unnecessary and reduce readability.

🧹 Suggested cleanup
     if (node.getConcatPrefix() != null) {
-
       expr =
           context.rexBuilder.makeCall(
               SqlStdOperatorTable.CONCAT,

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Fix all issues with AI agents
In `@ppl/src/main/java/org/opensearch/sql/ppl/parser/AstExpressionBuilder.java`:
- Around line 150-153: The default branch of the switch in AstExpressionBuilder
incorrectly logs ctx.getClass() — update the thrown IllegalArgumentException to
reference the actual switched variable ffLogicalExpressionCtx (and print its
class when non-null), and ensure the message clearly handles null (e.g.,
"Unknown ffLogicalExpression context type: null" or include
ffLogicalExpressionCtx.getClass() only when ffLogicalExpressionCtx != null) so
the exception accurately describes the offending value.
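For concreteness, a small sketch of the suggested null-safe message construction; the variable name comes from the comment above, and the surrounding switch in AstExpressionBuilder is omitted.

```java
// Hypothetical helper showing the suggested null-safe message; the real code would
// build this string inside the default branch of the switch in AstExpressionBuilder.
final class UnknownContextMessage {
  static String of(Object ffLogicalExpressionCtx) {
    return "Unknown ffLogicalExpression context type: "
        + (ffLogicalExpressionCtx == null
            ? "null"
            : ffLogicalExpressionCtx.getClass().getName());
  }
}
```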
🧹 Nitpick comments (2)
ppl/src/test/java/org/opensearch/sql/ppl/calcite/CalcitePPLFieldFormatTest.java (1)

16-21: Consider adding NULL handling tests for fieldformat concatenation.

Per coding guidelines, NULL input tests should be included for new functionality. The fieldformat command with prefix/suffix concatenation should be tested when the underlying expression evaluates to NULL (e.g., when COMM is NULL in the EMP table).

Example test case to add:

`@Test`
public void testFieldFormatWithNullValue() {
    // COMM is NULL for EMPNO=7369
    String ppl = "source=EMP | sort EMPNO | head 1 | fieldformat formatted = \"$\".tostring(COMM, \"commas\")";
    RelNode root = getRelNode(ppl);
    // Verify NULL propagation behavior
}

As per coding guidelines: "NULL input tests must be included for all new functions".

integ-test/src/test/java/org/opensearch/sql/ppl/PPLIntegTestCase.java (1)

17-17: Consider adding null guard for defensive programming, though query parameter is declared non-nullable.

The query parameter is typed as String (not nullable), and all call sites pass non-null String values, so immediate risk is low. However, StringEscapeUtils.escapeJson() will throw NullPointerException if passed null. The commons-text version (1.11.0) is confirmed in the build configuration and includes this method. If additional defensive safety is desired, consider the optional refactor below:

♻️ Optional null-safe refactor
-            StringEscapeUtils.escapeJson(query)));
+            StringEscapeUtils.escapeJson(String.valueOf(query))));

@yuancu yuancu added the feature label Jan 29, 2026
The `fieldformat` command has the following syntax:

```syntax
Following are possible syntaxes:
Collaborator


The syntax should be accurate and concise; the word "possible" is misleading. Remove this sentence.

result,
rows("Alice", 25, "Age: 25"),
rows("Bob", 30, "Age: 30"),
rows("Charlie", null, null));
Collaborator


Want to confirm that tostring aligns with concat behaviour: does PPL concat("a", null) return a null value?

Development

Successfully merging this pull request may close these issues.

[RFC] fieldformat command implementation as alias to eval as in PPL roadmap [#4287]
