
feat: Add TEXTFILE custom serde parameters support #27167

Merged
aditi-pandit merged 1 commit into prestodb:master from xin-zhang2:textfile on Feb 23, 2026

Conversation

@xin-zhang2
Contributor

@xin-zhang2 xin-zhang2 commented Feb 19, 2026

Description

Add SerDe parameter support for the TEXTFILE format in both Presto and Prestissimo. The parameters include:

  • textfile_field_delim
  • textfile_collection_delim
  • textfile_mapkey_delim
  • textfile_escape_delim
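As a sketch of how these properties might appear in a statement, a DDL string can be assembled with a format helper much like the ones used in the PR's tests. The table name, columns, and delimiter choices below are hypothetical examples, not taken from the PR:

```java
// Sketch of a CREATE TABLE statement using the new TEXTFILE properties.
// Table name, columns, and delimiter values are illustrative only.
public class TextfileDdlExample
{
    public static String createTableSql()
    {
        return String.format(
                "CREATE TABLE %s (id BIGINT, tags ARRAY(VARCHAR), attrs MAP(VARCHAR, VARCHAR)) " +
                        "WITH (format = 'TEXTFILE', " +
                        "textfile_field_delim = '%s', " +
                        "textfile_collection_delim = '%s', " +
                        "textfile_mapkey_delim = '%s', " +
                        "textfile_escape_delim = '%s')",
                "example_table", "|", ";", ":", "\\");
    }

    public static void main(String[] args)
    {
        System.out.println(createTableSql());
    }
}
```

Each property accepts a single character; validation is handled by the generalized single-character property check described later in this PR.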

Motivation and Context

Impact

Test Plan

Contributor checklist

  • Please make sure your submission complies with our contributing guide, in particular code style and commit standards.
  • PR description addresses the issue accurately and concisely. If the change is non-trivial, a GitHub Issue is referenced.
  • Documented new properties (with their default values), SQL syntax, functions, or other functionality.
  • If release notes are required, they follow the release notes guidelines.
  • Adequate tests were added if applicable.
  • CI passed.
  • If adding new dependencies, verified they have an OpenSSF Scorecard score of 5.0 or higher (or obtained explicit TSC approval for lower scores).

Release Notes

Please follow release notes guidelines and fill in the release notes below.

== RELEASE NOTES ==

Hive Connector Changes
* Add support for custom TEXTFILE SerDe parameters ``textfile_field_delim``, ``textfile_escape_delim``, ``textfile_collection_delim``, and ``textfile_mapkey_delim``.

Summary by Sourcery

Add Hive TEXTFILE SerDe table properties and wire them through Hive metadata so custom delimiters can be used for TEXTFILE tables, with coverage in both Hive and native execution tests.

New Features:

  • Introduce Hive table properties to configure TEXTFILE field, collection, map key, and escape delimiters.
  • Support propagating TEXTFILE SerDe delimiter settings between Presto and the Hive metastore when creating TEXTFILE tables.

Enhancements:

  • Refactor native TEXTFILE table creation and insertion SQL into reusable helpers to simplify tests.
  • Generalize single-character table property validation for reuse across CSV and TEXTFILE properties.

Tests:

  • Add Hive integration tests validating read and write behavior of TEXTFILE tables using custom SerDe delimiters.
  • Extend native execution tests to verify reading TEXTFILE tables with and without custom SerDe parameters.

@prestodb-ci prestodb-ci added the from:IBM PR from IBM label Feb 19, 2026
@sourcery-ai
Contributor

sourcery-ai bot commented Feb 19, 2026

Reviewer's Guide

Adds Hive TEXTFILE SerDe table properties for field, escape, collection, and map-key delimiters; wires them through HiveMetadata as SerDe parameters instead of regular table parameters; and adds native and Hive integration tests (read/write) that exercise custom TEXTFILE delimiters, including some refactoring of existing TEXTFILE tests.

Sequence diagram for creating a TEXTFILE table with custom SerDe delimiters

sequenceDiagram
    actor User
    participant PrestoClient
    participant HiveMetadata
    participant HiveMetastore

    User->>PrestoClient: CREATE TABLE ... WITH (textfile_field_delim = ',')
    PrestoClient->>HiveMetadata: createTable(SchemaTableName, ConnectorTableMetadata)

    HiveMetadata->>HiveMetadata: getEmptyTableProperties(hiveStorageFormat, tableMetadata, tableProperties)
    HiveMetadata->>HiveMetadata: getSingleCharacterProperty(tableProperties, TEXTFILE_FIELD_DELIM)
    HiveMetadata->>HiveMetadata: tableProperties.put(TEXTFILE_FIELD_DELIM_KEY, value)

    HiveMetadata->>HiveMetadata: extractSerdeParameters(additionalTableParameters)
    HiveMetadata->>HiveMetadata: filter out TEXTFILE_SERDE_KEYS from additionalTableParameters

    HiveMetadata->>HiveMetastore: createTable(Table with parameters without TEXTFILE_SERDE_KEYS, serdeParameters containing TEXTFILE_SERDE_KEYS)

    HiveMetastore-->>HiveMetadata: createTable result
    HiveMetadata-->>PrestoClient: table created
    PrestoClient-->>User: success

Sequence diagram for reading TEXTFILE SerDe delimiters into table properties

sequenceDiagram
    participant PrestoClient
    participant HiveMetadata
    participant HiveMetastore

    PrestoClient->>HiveMetadata: getTableMetadata(SchemaTableName)
    HiveMetadata->>HiveMetastore: getTable(SchemaTableName)
    HiveMetastore-->>HiveMetadata: Table with serdeParameters

    HiveMetadata->>HiveMetadata: getSerdeProperty(table, TEXTFILE_FIELD_DELIM_KEY)
    HiveMetadata->>HiveMetadata: properties.put(TEXTFILE_FIELD_DELIM, value)

    HiveMetadata->>HiveMetadata: getSerdeProperty(table, TEXTFILE_ESCAPE_DELIM_KEY)
    HiveMetadata->>HiveMetadata: properties.put(TEXTFILE_ESCAPE_DELIM, value)

    HiveMetadata->>HiveMetadata: getSerdeProperty(table, TEXTFILE_COLLECTION_DELIM_KEY)
    HiveMetadata->>HiveMetadata: properties.put(TEXTFILE_COLLECTION_DELIM, value)

    HiveMetadata->>HiveMetadata: getSerdeProperty(table, TEXTFILE_MAPKEY_DELIM_KEY)
    HiveMetadata->>HiveMetadata: properties.put(TEXTFILE_MAPKEY_DELIM, value)

    HiveMetadata-->>PrestoClient: ConnectorTableMetadata with TEXTFILE properties

Updated class diagram for HiveMetadata and HiveTableProperties TEXTFILE SerDe support

classDiagram
    class HiveTableProperties {
        <<static>>
        +String CSV_SEPARATOR
        +String CSV_QUOTE
        +String CSV_ESCAPE
        +String TEXTFILE_FIELD_DELIM
        +String TEXTFILE_MAPKEY_DELIM
        +String TEXTFILE_COLLECTION_DELIM
        +String TEXTFILE_ESCAPE_DELIM
        +HiveTableProperties(TypeManager typeManager, HiveClientConfig config)
        +static Optional~Character~ getSingleCharacterProperty(Map~String,Object~ tableProperties, String key)
    }

    class HiveMetadata {
        <<static>>
        +String CSV_QUOTE_KEY
        +String CSV_ESCAPE_KEY
        +String TEXTFILE_FIELD_DELIM_KEY
        +String TEXTFILE_ESCAPE_DELIM_KEY
        +String TEXTFILE_COLLECTION_DELIM_KEY
        +String TEXTFILE_MAPKEY_DELIM_KEY
        +Set~String~ TEXTFILE_SERDE_KEYS
        +static Map~String,String~ extractSerdeParameters(Map~String,String~ tableParameters)
    }

    HiveMetadata ..> HiveTableProperties : uses TEXTFILE_* constants
    HiveMetadata ..> HiveTableProperties : uses getSingleCharacterProperty

File-Level Changes

Change Details Files
Introduce Hive table properties for TEXTFILE SerDe delimiters and plumb them through HiveMetadata as SerDe parameters, distinct from general table parameters.
  • Add TEXTFILE_* delimiter property names to HiveTableProperties and expose them as optional string properties validated as single characters via a generalized getSingleCharacterProperty helper (renamed from getCsvProperty).
  • Define corresponding TEXTFILE_* SerDe keys and TEXTFILE_SERDE_KEYS set in HiveMetadata and include them when reading table metadata into connector properties.
  • When creating/updating tables, extract TEXTFILE SerDe parameters from table properties using extractSerdeParameters, remove them from the general table parameters map, and pass them separately via Table.Builder.setSerdeParameters().
  • Extend getEmptyTableProperties to accept the new TEXTFILE_* properties only for TEXTFILE format, mapping them to the Hive SerDe parameter keys.
presto-hive/src/main/java/com/facebook/presto/hive/HiveTableProperties.java
presto-hive/src/main/java/com/facebook/presto/hive/HiveMetadata.java
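The extract-and-filter step described above can be sketched as follows. The serde key strings and the method shape are assumptions for illustration; the actual constants and `extractSerdeParameters` signature live in `HiveMetadata`:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

// Sketch of splitting TEXTFILE serde parameters out of the general table
// parameters. Key names here are illustrative; the real set is
// TEXTFILE_SERDE_KEYS in HiveMetadata.
public class SerdeParameterSplitExample
{
    static final Set<String> TEXTFILE_SERDE_KEYS = Set.of(
            "field.delim", "escape.delim", "collection.delim", "mapkey.delim");

    static Map<String, String> extractSerdeParameters(Map<String, String> tableParameters)
    {
        Map<String, String> serdeParameters = new HashMap<>();
        for (Map.Entry<String, String> entry : tableParameters.entrySet()) {
            if (TEXTFILE_SERDE_KEYS.contains(entry.getKey())) {
                serdeParameters.put(entry.getKey(), entry.getValue());
            }
        }
        return serdeParameters;
    }

    public static void main(String[] args)
    {
        Map<String, String> additionalTableParameters = new HashMap<>();
        additionalTableParameters.put("field.delim", "|");
        additionalTableParameters.put("skip.header.line.count", "1");

        // Serde parameters are passed separately via Table.Builder.setSerdeParameters();
        // everything else stays in the general table parameters map.
        Map<String, String> serdeParameters = extractSerdeParameters(additionalTableParameters);
        Map<String, String> tableParameters = new HashMap<>(additionalTableParameters);
        tableParameters.keySet().removeAll(serdeParameters.keySet());

        System.out.println("serde: " + serdeParameters);
        System.out.println("table: " + tableParameters);
    }
}
```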
Add integration tests for reading and writing TEXTFILE data with custom SerDe delimiters in the Hive connector.
  • Add testSerdeParametersForTextfileRead that creates an external TEXTFILE table over a handcrafted file using custom field, collection, map-key, and escape delimiters, then asserts correct decoding of primitives, arrays, maps, and nested row fields.
  • Add testSerdeParametersForTextfileWrite that creates a TEXTFILE table with custom delimiters, inserts complex data, and then reads it back to validate round-trip correctness.
presto-hive/src/test/java/com/facebook/presto/hive/TestHiveIntegrationSmokeTest.java
Refactor and extend native execution Textfile tests to reuse DDL/DML builders and validate behavior with custom TEXTFILE SerDe parameters.
  • Extract helper methods createTextFileTableSql and insertTextFileTableSql to centralize the TEXTFILE test table DDL and insert statement used by native worker tests.
  • Modify existing testReadTableWithTextfileFormat to use the new helpers, keeping behavior with default TEXTFILE delimiters.
  • Add testReadTableWithCustomSerdeTextfile that creates a TEXTFILE table using custom textfile_field_delim, textfile_escape_delim, textfile_collection_delim, and textfile_mapkey_delim options, then asserts query results to validate native engine compatibility.
presto-native-execution/src/test/java/com/facebook/presto/nativeworker/AbstractTestNativeGeneralQueries.java
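The helper extraction described above might look roughly like this. The helper names follow the PR description, but the signatures, columns, and the `serdeParams` argument convention (a possibly empty prefix of extra WITH properties) are assumptions:

```java
// Rough sketch of the DDL/DML helpers described above. The real helpers in
// AbstractTestNativeGeneralQueries may differ; serdeParams is assumed to be
// either empty or a comma-terminated list of extra WITH properties.
public class TextFileTestSqlHelpers
{
    static String createTextFileTableSql(String tableName, String serdeParams)
    {
        return String.format(
                "CREATE TABLE %s (name VARCHAR, value INTEGER) WITH (%sformat = 'TEXTFILE')",
                tableName, serdeParams);
    }

    static String insertTextFileTableSql(String tableName)
    {
        return String.format("INSERT INTO %s VALUES ('a', 1), ('b', 2)", tableName);
    }

    public static void main(String[] args)
    {
        // Default TEXTFILE delimiters: no extra properties.
        System.out.println(createTextFileTableSql("test_textfile", ""));
        // Custom delimiters, using the trailing-comma convention the review discusses.
        System.out.println(createTextFileTableSql("test_textfile_custom",
                "textfile_field_delim = '|', textfile_collection_delim = ';', "));
        System.out.println(insertTextFileTableSql("test_textfile"));
    }
}
```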

Tips and commands

Interacting with Sourcery

  • Trigger a new review: Comment @sourcery-ai review on the pull request.
  • Continue discussions: Reply directly to Sourcery's review comments.
  • Generate a GitHub issue from a review comment: Ask Sourcery to create an
    issue from a review comment by replying to it. You can also reply to a
    review comment with @sourcery-ai issue to create an issue from it.
  • Generate a pull request title: Write @sourcery-ai anywhere in the pull
    request title to generate a title at any time. You can also comment
    @sourcery-ai title on the pull request to (re-)generate the title at any time.
  • Generate a pull request summary: Write @sourcery-ai summary anywhere in
    the pull request body to generate a PR summary at any time exactly where you
    want it. You can also comment @sourcery-ai summary on the pull request to
    (re-)generate the summary at any time.
  • Generate reviewer's guide: Comment @sourcery-ai guide on the pull
    request to (re-)generate the reviewer's guide at any time.
  • Resolve all Sourcery comments: Comment @sourcery-ai resolve on the
    pull request to resolve all Sourcery comments. Useful if you've already
    addressed all the comments and don't want to see them anymore.
  • Dismiss all Sourcery reviews: Comment @sourcery-ai dismiss on the pull
    request to dismiss all existing Sourcery reviews. Especially useful if you
    want to start fresh with a new review - don't forget to comment
    @sourcery-ai review to trigger a new review!

Customizing Your Experience

Access your dashboard to:

  • Enable or disable review features such as the Sourcery-generated pull request
    summary, the reviewer's guide, and others.
  • Change the review language.
  • Add, remove or edit custom review instructions.
  • Adjust other review settings.

Getting Help

@xin-zhang2 xin-zhang2 changed the title feat(TextReader): Add custom serde parameters support feat: Add TEXTFILE custom serde parameters support Feb 19, 2026
@xin-zhang2 xin-zhang2 marked this pull request as ready for review February 19, 2026 15:08
@xin-zhang2 xin-zhang2 requested review from a team, elharo and steveburnett as code owners February 19, 2026 15:08
@prestodb-ci prestodb-ci requested review from a team, ShahimSharafudeen and pratyakshsharma and removed request for a team February 19, 2026 15:08
Contributor

@sourcery-ai sourcery-ai bot left a comment


Hey - I've found 4 issues, and left some high level feedback:

  • In HiveMetadata.getTableMetadata() the TEXTFILE serde values are put into the properties map using the serde keys (e.g., field.delim via TEXTFILE_FIELD_DELIM_KEY) rather than the connector property keys defined in HiveTableProperties (e.g., textfile_field_delim), which will prevent these options from round‑tripping correctly; consider mapping serde keys back to the HiveTableProperties.TEXTFILE_ names instead.
  • In HiveMetadata.buildTableObject() you compute serdeParameters = extractSerdeParameters(additionalTableParameters) but then call extractSerdeParameters again when setting .setSerdeParameters(...); you can reuse the already computed serdeParameters to avoid duplicate work and keep the parameters consistent.
  • The serdeParams string in testReadTableWithCustomSerdeTextfile manually embeds a trailing comma and space which are coupled to the SQL format string in createTextFileTableSql; consider having createTextFileTableSql handle comma/spacing and passing in only the key–value pairs to make the API less error‑prone.
Prompt for AI Agents
Please address the comments from this code review:

## Overall Comments
- In HiveMetadata.getTableMetadata() the TEXTFILE serde values are put into the properties map using the serde keys (e.g., `field.delim` via TEXTFILE_FIELD_DELIM_KEY) rather than the connector property keys defined in HiveTableProperties (e.g., `textfile_field_delim`), which will prevent these options from round‑tripping correctly; consider mapping serde keys back to the HiveTableProperties.*TEXTFILE_* names instead.
- In HiveMetadata.buildTableObject() you compute `serdeParameters = extractSerdeParameters(additionalTableParameters)` but then call `extractSerdeParameters` again when setting `.setSerdeParameters(...)`; you can reuse the already computed `serdeParameters` to avoid duplicate work and keep the parameters consistent.
- The `serdeParams` string in `testReadTableWithCustomSerdeTextfile` manually embeds a trailing comma and space which are coupled to the SQL format string in `createTextFileTableSql`; consider having `createTextFileTableSql` handle comma/spacing and passing in only the key–value pairs to make the API less error‑prone.

## Individual Comments

### Comment 1
<location> `presto-hive/src/main/java/com/facebook/presto/hive/HiveMetadata.java:786-794` </location>
<code_context>
         getSerdeProperty(table.get(), SKIP_FOOTER_COUNT_KEY)
                         .ifPresent(skipFooterCount -> properties.put(SKIP_FOOTER_LINE_COUNT, Integer.valueOf(skipFooterCount)));

+        // Textfile specific properties
+        getSerdeProperty(table.get(), TEXTFILE_FIELD_DELIM_KEY)
+                .ifPresent(fieldDelim -> properties.put(TEXTFILE_FIELD_DELIM_KEY, fieldDelim));
+        getSerdeProperty(table.get(), TEXTFILE_ESCAPE_DELIM_KEY)
+                        .ifPresent(escapeDelim -> properties.put(TEXTFILE_ESCAPE_DELIM_KEY, escapeDelim));
+        getSerdeProperty(table.get(), TEXTFILE_COLLECTION_DELIM_KEY)
+                        .ifPresent(textCollectionDelim -> properties.put(TEXTFILE_COLLECTION_DELIM_KEY, textCollectionDelim));
+        getSerdeProperty(table.get(), TEXTFILE_MAPKEY_DELIM_KEY)
+                        .ifPresent(textMapKeyDelim -> properties.put(TEXTFILE_MAPKEY_DELIM_KEY, textMapKeyDelim));
+
         // CSV specific property
</code_context>

<issue_to_address>
**issue (bug_risk):** Textfile properties in table metadata use serde keys instead of Presto table property keys.

For CSV and header/footer, `properties` uses the Presto table property keys (e.g., `CSV_SEPARATOR`, `SKIP_HEADER_LINE_COUNT`). The new Textfile logic instead stores values under the serde keys (e.g., `TEXTFILE_FIELD_DELIM_KEY` / `"field.delim"`) rather than the corresponding table property keys (e.g., `TEXTFILE_FIELD_DELIM` / `"textfile_field_delim"`). As a result, `HiveTableProperties.getSingleCharacterProperty` (which looks up `TEXTFILE_FIELD_DELIM`, etc.) will never see these values. Please switch the `properties.put(...)` calls to use the `HiveTableProperties.TEXTFILE_*` table property constants to keep the round‑trip consistent with the CSV handling above.
</issue_to_address>
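A minimal sketch of the fix this comment suggests: read the value via the Hive serde key, but store it under the Presto table property name so it round-trips through `HiveTableProperties`. The constant values and `toTableProperties` wrapper below are illustrative, not the actual `HiveMetadata` code:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Sketch of the suggested round-trip fix. Constant values are illustrative;
// the real constants are defined in HiveMetadata and HiveTableProperties.
public class SerdeKeyRoundTripExample
{
    static final String TEXTFILE_FIELD_DELIM = "textfile_field_delim"; // connector property name
    static final String TEXTFILE_FIELD_DELIM_KEY = "field.delim";      // Hive serde parameter name

    static Optional<String> getSerdeProperty(Map<String, String> serdeParameters, String key)
    {
        return Optional.ofNullable(serdeParameters.get(key));
    }

    // The fix: look up by the serde key, put under the connector property name.
    static Map<String, Object> toTableProperties(Map<String, String> serdeParameters)
    {
        Map<String, Object> properties = new HashMap<>();
        getSerdeProperty(serdeParameters, TEXTFILE_FIELD_DELIM_KEY)
                .ifPresent(fieldDelim -> properties.put(TEXTFILE_FIELD_DELIM, fieldDelim));
        return properties;
    }

    public static void main(String[] args)
    {
        Map<String, String> serdeParameters = new HashMap<>();
        serdeParameters.put(TEXTFILE_FIELD_DELIM_KEY, "|");
        System.out.println(toTableProperties(serdeParameters));
    }
}
```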

### Comment 2
<location> `presto-hive/src/main/java/com/facebook/presto/hive/HiveMetadata.java:1512-1521` </location>
<code_context>
             }
         }

+        Map<String, String> serdeParameters = extractSerdeParameters(additionalTableParameters);
+
         ImmutableMap.Builder<String, String> tableParameters = ImmutableMap.<String, String>builder()
</code_context>

<issue_to_address>
**suggestion:** Avoid recomputing serdeParameters by reusing the local variable in buildTableObject.

`serdeParameters` is already derived from `additionalTableParameters` and used to filter `tableParameters`, but `setSerdeParameters` recomputes it with `extractSerdeParameters(additionalTableParameters)`. To keep the logic single‑sourced and avoid overhead or divergence, pass the existing `serdeParameters` to `.setSerdeParameters(serdeParameters)` instead.

Suggested implementation:

```java
        Map<String, String> serdeParameters = extractSerdeParameters(additionalTableParameters);

        ImmutableMap.Builder<String, String> tableParameters = ImmutableMap.<String, String>builder()
                .put(PRESTO_VERSION_NAME, prestoVersion)
                .put(PRESTO_QUERY_ID_NAME, queryId)
                .putAll(additionalTableParameters.entrySet().stream()
                        .filter(entry -> !serdeParameters.containsKey(entry.getKey()))
                        .collect(toImmutableMap(Map.Entry::getKey, Map.Entry::getValue)));

        if (tableType.equals(EXTERNAL_TABLE)) {
            tableParameters.put("EXTERNAL", "TRUE");
                .setStorageFormat(fromHiveStorageFormat(hiveStorageFormat))

```

```java
                .setTableType(tableType)
                .setDataColumns(columns)
                .setPartitionColumns(partitionColumns.build())
                .setParameters(tableParameters.build())
                .setSerdeParameters(serdeParameters)

```

If there are any other locations in this file where `extractSerdeParameters(additionalTableParameters)` is called purely for passing into `setSerdeParameters(...)` within the same method where `serdeParameters` is already computed, those should similarly be replaced with `serdeParameters` to keep the logic single-sourced.
</issue_to_address>

### Comment 3
<location> `presto-hive/src/test/java/com/facebook/presto/hive/TestHiveIntegrationSmokeTest.java:7131-7145` </location>
<code_context>
+                        ")",
+                catalog, schema, table, new Path(tempDir.toURI().toASCIIString()));
+        try {
+            assertUpdate(createTableSql);
+
+            assertQuery(
+                    format(
+                            "SELECT\n" +
</code_context>

<issue_to_address>
**suggestion (testing):** Consider asserting that TEXTFILE serde properties are actually persisted as table properties/serde parameters

The existing read/write tests show that custom delimiters are honored at runtime. To fully validate the metadata aspect of this change, add an assertion that the configured TEXTFILE properties appear in table metadata (for example via `SHOW CREATE TABLE` or `information_schema.table_properties`, consistent with the rest of the suite). This confirms the serde parameters persist correctly in Hive metadata, not just during parsing/serialization.

```suggestion
        try {
            assertUpdate(createTableSql);

            // Verify TEXTFILE serde properties are persisted as table properties in Hive metadata
            assertQuery(
                    format(
                            "SELECT property_name, property_value " +
                                    "FROM %s.information_schema.table_properties " +
                                    "WHERE table_schema = '%s' " +
                                    "  AND table_name = '%s' " +
                                    "  AND property_name IN (" +
                                    "      'textfile_field_delim', " +
                                    "      'textfile_collection_delim', " +
                                    "      'textfile_mapkey_delim', " +
                                    "      'textfile_escape_delim') " +
                                    "ORDER BY property_name",
                            catalog, schema, table),
                    "VALUES " +
                            "('textfile_collection_delim', ';'), " +
                            "('textfile_escape_delim', '\u0001'), " +
                            "('textfile_field_delim', '|'), " +
                            "('textfile_mapkey_delim', ':')");

            assertQuery(
                    format(
                            "SELECT\n" +
                                    "c1, c2, c3, c4, c5, \n" +
                                    "element_at(c6, 'size'), element_at(c6, 'color'), \n" +
                                    "c7.s_arr, element_at(c7.s_map, 10), element_at(c7.s_map, 20) FROM %s.%s.%s", catalog, schema, table),
                    "VALUES(" +
                            "1001, 'he|llo', true, 88.5, \n" +
                            "ARRAY['alpha', 'beta', 'gamma'], \n" +
                            "'large', 'blue', \n" +
                            "ARRAY[CAST(1.1 AS REAL), CAST(2.2 AS REAL), CAST(3.3 AS REAL)], 'foo', 'bar')");
        }
```
</issue_to_address>

### Comment 4
<location> `presto-docs/src/main/sphinx/connector/hive.rst:315-316` </location>
<code_context>
+======================================================== ============================================================================== =============================
+
+.. note::
+Theses properties are mapped to the corresponding properties in Hive ``LazySerDeParameters`` during serialization and
+follow the same behaviors with ``LazySimpleSerDe``.
+If they are not defined, the Hive defaults are used, which are typically ``\001`` for field delimiter, ``\002`` for
</code_context>

<issue_to_address>
**issue (typo):** Fix typo in "Theses properties" to "These properties".

```suggestion
.. note::
These properties are mapped to the corresponding properties in Hive ``LazySerDeParameters`` during serialization and
```
</issue_to_address>

Sourcery is free for open source - if you like our reviews please consider sharing them ✨
Help me be more useful! Please click 👍 or 👎 on each comment and I'll use the feedback to improve your reviews.

@xin-zhang2 xin-zhang2 force-pushed the textfile branch 2 times, most recently from b6e74fa to 5d044c1 February 19, 2026 15:51
steveburnett previously approved these changes Feb 19, 2026
Contributor

@steveburnett steveburnett left a comment


LGTM! (docs)

Pull branch, local doc build, looks good. Thanks!

Contributor

@aditi-pandit aditi-pandit left a comment


Thanks @xin-zhang2 for this code.

@aditi-pandit
Contributor

aditi-pandit commented Feb 20, 2026

@tdcmeehan : We need this for the loading phase of the TPC-DS benchmark. Please can you take a look at the Hive changes.

tdcmeehan previously approved these changes Feb 21, 2026

@xin-zhang2
Contributor Author

@aditi-pandit
I have verified locally that the tpc-ds queries run successfully with | as the field delimiter.

TestTextReaderWithTpcdsQueriesUsingThrift might not be the right place to add csv usage. It inherits from AbstractTestNativeTpcdsQueries, which is intended to run tpc-ds queries against the default table properties rather than to test table properties.  Adding this would require modifying all the create table methods in AbstractTestNativeTpcdsQueries, which seems outside the scope of this test.

Also, csv here is essentially treated as a special case of textfile with custom delimiters, and we have tested that with testReadTableWithCustomSerdeTextfile in AbstractTestNativeGeneralQueries, so running the full tpc-ds for it may not add much additional coverage.

@xin-zhang2
Contributor Author

xin-zhang2 commented Feb 23, 2026

@tdcmeehan @steveburnett
I pushed an update to resolve conflicts and the previous approval was dismissed. Could you take another look? Thanks!

Contributor

@steveburnett steveburnett left a comment


LGTM! (docs)

Pull updated branch, new local doc build, looks good. Thanks!

@aditi-pandit
Contributor

> @aditi-pandit I have verified locally that the tpc-ds queries run successfully with | as the field delimiter.
>
> TestTextReaderWithTpcdsQueriesUsingThrift might not be the right place to add csv usage. It inherits from AbstractTestNativeTpcdsQueries, which is intended to run tpc-ds queries against the default table properties rather than to test table properties. Adding this would require modifying all the create table methods in AbstractTestNativeTpcdsQueries, which seems outside the scope of this test.
>
> Also, csv here is essentially treated as a special case of textfile with custom delimiters, and we have tested that with testReadTableWithCustomSerdeTextfile in AbstractTestNativeGeneralQueries, so running the full tpc-ds for it may not add much additional coverage.

@xin-zhang2 : I was looking for a way to get an easy test case to reproduce if there were issues in that usage. We do expect the folks running the benchmarks to use it often.

Though we can do that as a follow up. Don't want to block this PR.

Contributor

@aditi-pandit aditi-pandit left a comment


Thanks @xin-zhang2

@aditi-pandit aditi-pandit merged commit ef8463c into prestodb:master Feb 23, 2026
149 of 153 checks passed

Labels

from:IBM PR from IBM

5 participants