feat(ibis): introduce Databricks connector #1361

Merged
douenergy merged 8 commits into Canner:main from goldmedal:feat/databricks-connector
Nov 7, 2025

Conversation

@goldmedal
Contributor

@goldmedal goldmedal commented Oct 31, 2025

Description

This PR introduces a Databricks connector. The connection info looks like this:

    "connectionInfo": {
        "serverHostname": "https://xxxxxxcloud.databricks.com",
        "httpPath": "/sql/1.0/warehouses/4xxxxxx",
        "accessToken": "xxxxxxxxxx"
    }

For more authentication information, see the Databricks SQL Connector for Python documentation.
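As a rough illustration, the connection info above could be adapted into keyword arguments for an ibis Databricks backend. The helper below is a hypothetical sketch, not the PR's actual code; the key names and the scheme-stripping step are assumptions for illustration.

```python
# Hypothetical helper: adapt the connectionInfo JSON above into the kwargs
# an ibis Databricks backend connection would take. The exact kwarg names
# and the scheme-stripping behavior are assumptions, not the PR's code.
from urllib.parse import urlparse


def to_connect_kwargs(connection_info: dict) -> dict:
    host = connection_info["serverHostname"]
    # The example hostname carries an https:// scheme; strip it if present.
    if "://" in host:
        host = urlparse(host).netloc
    return {
        "server_hostname": host,
        "http_path": connection_info["httpPath"],
        "access_token": connection_info["accessToken"],
    }


info = {
    "serverHostname": "https://example.cloud.databricks.com",
    "httpPath": "/sql/1.0/warehouses/abc123",
    "accessToken": "dapi-secret",
}
kwargs = to_connect_kwargs(info)
# With ibis installed, the connection would then be roughly:
#   con = ibis.databricks.connect(**kwargs)
```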

Something worth knowing

  • The metadata API scans every table the connection can access; it is not limited to a specific catalog or schema.
  • ibis-server/tools/update_databricks_functions.py is a tool for updating the function descriptions from Databricks.

Summary by CodeRabbit

  • New Features

    • Added Databricks support: connection configuration, metadata discovery, type normalization, and query execution.
  • Tests

    • Added comprehensive Databricks test suite covering metadata, constraints, functions, queries (including dry-run and error cases).
  • Chores

    • Added Databricks tooling and a utility to refresh Databricks function descriptions; updated project extras/dependencies and CI test exclusions.

@coderabbitai
Contributor

coderabbitai bot commented Oct 31, 2025

Warning

Rate limit exceeded

@goldmedal has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 8 minutes and 13 seconds before requesting another review.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

📥 Commits

Reviewing files that changed from the base of the PR and between aa29a88 and 4b61a9b.

📒 Files selected for processing (1)
  • .github/workflows/ibis-ci.yml (1 hunks)

Walkthrough

Adds Databricks support across Python models, data source routing, metadata provider, tests, tooling, and Rust manifest mapping to enable Databricks connections, metadata extraction, query routing, and function description updates.

Changes

  • Connection Models (ibis-server/app/model/__init__.py): Added DatabricksConnectionInfo and QueryDatabricksDTO; extended the ConnectionInfo union to include DatabricksConnectionInfo.
  • Data Source Routing (ibis-server/app/model/data_source.py): Added DataSource.databricks; wired QueryDatabricksDTO and DatabricksConnectionInfo into _build_connection_info() and DataSourceExtension.get_connection(); added get_databricks_connection() to create an ibis Databricks backend.
  • Metadata Provider (ibis-server/app/model/metadata/databricks.py): New DatabricksMetadata class and DATABRICKS_TYPE_MAPPING; implements get_table_list(), get_constraints(), get_version(), and helpers for type transformation and constraint/table name formatting.
  • Metadata Factory (ibis-server/app/model/metadata/factory.py): Registered DataSource.databricks → DatabricksMetadata in the metadata factory mapping.
  • Dependencies & Config (ibis-server/pyproject.toml): Added the databricks extra to the ibis extras, added databricks-sql-connector ^4.0.1 with [pyarrow], and a databricks pytest marker.
  • Tests — Fixtures (ibis-server/tests/routers/v3/connector/databricks/conftest.py): New conftest with a connection_info fixture (from env) and an init_databricks module-scoped autouse fixture creating schema/tables; a pytest collection hook applies the marker.
  • Tests — Query & Metadata (ibis-server/tests/routers/v3/connector/databricks/test_query.py, ibis-server/tests/routers/v3/connector/databricks/test_metadata.py): Added tests for query execution (including dry-run/validation/limits) and metadata endpoints (tables, constraints, version).
  • Tests — Functions (ibis-server/tests/routers/v3/connector/databricks/test_function.py): Added function-listing and function-execution tests (scalar and aggregate) with manifest fixtures.
  • Tooling (ibis-server/tools/update_databricks_functions.py): New CLI script to fetch and update function descriptions from Databricks DESCRIBE FUNCTION EXTENDED into a CSV; includes parsing, placeholder detection, and options for overwrite/limit/sleep.
  • Rust — Manifest & Mapping (wren-core-base/manifest-macro/src/lib.rs, wren-core-base/src/mdl/manifest.rs, wren-core/core/src/logical_plan/utils.rs): Added a Databricks variant to the DataSource enum (serde alias "databricks"); updated Display to emit "DATABRICKS"; extended map_data_type to map "any" → DataType::Utf8.
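The tooling script above parses DESCRIBE FUNCTION EXTENDED output into structured fields. A minimal sketch of that kind of parsing is shown below; the field layout in the sample is an assumption for illustration, and real Databricks output may differ per function.

```python
# Minimal sketch of parsing DESCRIBE FUNCTION EXTENDED output into a dict,
# in the spirit of tools/update_databricks_functions.py. The sample field
# layout is an assumption; the real tool handles more cases.
def parse_describe_output(lines: list[str]) -> dict[str, str]:
    fields: dict[str, str] = {}
    current = None
    for line in lines:
        if ":" in line and not line.startswith(" "):
            # A new "Key: value" field starts on an unindented line.
            key, _, value = line.partition(":")
            current = key.strip().lower()
            fields[current] = value.strip()
        elif current is not None:
            # Indented continuation lines belong to the last seen field.
            fields[current] = (fields[current] + "\n" + line.strip()).strip()
    return fields


sample = [
    "Function: upper",
    "Usage: upper(str) - Returns str with all characters changed to uppercase.",
    "Extended Usage:",
    "    Examples:",
]
parsed = parse_describe_output(sample)
```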

Sequence Diagram(s)

sequenceDiagram
    participant Client
    participant API as Router/API
    participant DataSrcExt as DataSourceExtension
    participant Ibis as ibis Databricks Backend
    participant Meta as DatabricksMetadata
    participant DB as Databricks SQL

    Client->>API: POST /v2/connector/databricks/metadata/tables\n(connectionInfo)
    API->>DataSrcExt: get_connection(DataSource.databricks, info)
    DataSrcExt->>Ibis: get_databricks_connection(info)
    Ibis-->>DataSrcExt: Backend
    API->>Meta: DatabricksMetadata(connection_info)
    API->>Meta: get_table_list()
    Meta->>DB: SELECT ... FROM INFORMATION_SCHEMA.TABLES/COLUMNS
    DB-->>Meta: rows
    Meta->>Meta: _transform_column_type(...) per column
    Meta-->>API: List[Table]
    API-->>Client: 200 OK with metadata
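The client call at the top of the diagram can be sketched as an HTTP request carrying the connection info. The endpoint path comes from the diagram; the httpx call is illustrative only and assumes a locally running ibis-server.

```python
# Sketch of the Client -> API step in the diagram: POST the connectionInfo
# to the Databricks metadata endpoint. Host/port and credentials here are
# placeholder assumptions.
payload = {
    "connectionInfo": {
        "serverHostname": "https://example.cloud.databricks.com",
        "httpPath": "/sql/1.0/warehouses/abc123",
        "accessToken": "dapi-secret",
    }
}
# import httpx
# resp = httpx.post(
#     "http://localhost:8000/v2/connector/databricks/metadata/tables",
#     json=payload,
# )
# resp.raise_for_status()  # expect 200 OK with a list of tables
```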

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

  • Pay special attention to:
    • SQL queries and column/type parsing in DatabricksMetadata (ibis-server/app/model/metadata/databricks.py).
    • _transform_column_type() mappings, decimal/geography/geometry handling and UNKNOWN fallbacks.
    • DataSource routing and get_databricks_connection() integration (ibis-server/app/model/data_source.py).
    • Test fixtures creating live Databricks objects (tests/.../conftest.py) — idempotence and cleanup.
    • CLI parsing and multiline DESCRIBE parsing robustness in tools/update_databricks_functions.py.
    • Rust enum/display changes alignment with Python DataSource values.
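The _transform_column_type() behavior flagged above (parameterized types such as decimal, plus an UNKNOWN fallback with a warning) can be sketched as follows. The mapping entries and return values below are illustrative placeholders, not the PR's actual DATABRICKS_TYPE_MAPPING.

```python
# Illustrative sketch of the kind of mapping _transform_column_type performs.
# The mapping table and string return values stand in for the real
# RustWrenEngineColumnType enum and are assumptions for illustration.
import logging

TYPE_MAPPING = {
    "int": "INTEGER",
    "bigint": "BIGINT",
    "string": "VARCHAR",
    "decimal": "DECIMAL",
}


def transform_column_type(data_type: str) -> str:
    # Parameterized types such as decimal(10,2) are looked up by base name.
    base = data_type.split("(")[0].strip().lower()
    mapped = TYPE_MAPPING.get(base)
    if mapped is None:
        # Unmapped types fall back to UNKNOWN, with a warning for visibility.
        logging.warning("Unknown Databricks type: %s", data_type)
        return "UNKNOWN"
    return mapped
```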

Possibly related PRs

Suggested reviewers

  • douenergy
  • onlyjackfrost

Poem

🐰 I hopped through schemas at break of day,
I mapped each column and chased types away.
I stitched a Databricks path so neat,
Ran tests and fixtures, tidy and sweet.
Now CSVs and functions dance — hop, code, hooray! 🥕

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
  • Docstring Coverage ⚠️ Warning: Docstring coverage is 18.75%, below the required threshold of 80.00%. Run @coderabbitai generate docstrings to improve coverage.
✅ Passed checks (2 passed)
  • Description Check ✅ Passed: Check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check ✅ Passed: The PR title "feat(ibis): introduce Databricks connector" clearly summarizes the primary objective of the changeset: the changes collectively implement a Databricks connector, including new DTOs and connection info classes, enum extensions, metadata extraction logic, factory updates, test infrastructure, and utility scripts. The title is concise and specific without unnecessary implementation detail.


@github-actions github-actions bot added the core, ibis, dependencies, rust, and python labels Oct 31, 2025
@goldmedal goldmedal requested a review from douenergy October 31, 2025 09:20
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🧹 Nitpick comments (2)
wren-core/core/src/logical_plan/utils.rs (1)

185-185: LGTM – Defensive type mapping follows established pattern.

The mapping of "any" to DataType::Utf8 is consistent with how other unsupported or dynamic types (json, xml, uuid, unknown) are handled in this codebase.

Consider adding a test case for the "any" type mapping in test_map_data_type() (around line 430) for completeness, similar to the existing test cases for other type mappings.

Additionally, you might want to verify whether the "any" type is actually encountered in Databricks metadata discovery, or if this is a preemptive/hypothetical addition. If it's not actively used, you could defer this mapping until it's needed.

// Add to test_map_data_type test cases around line 472:
("any", DataType::Utf8),
ibis-server/tests/routers/v3/connector/databricks/conftest.py (1)

12-16: Consider removing redundant marker application.

The pytest_collection_modifyitems hook adds the databricks marker to tests, but this is already accomplished by the module-level pytestmark = pytest.mark.databricks on line 7. The hook appears unnecessary unless you intend to mark tests in subdirectories.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between ae298f4 and 55011b1.

⛔ Files ignored due to path filters (2)
  • ibis-server/poetry.lock is excluded by !**/*.lock
  • ibis-server/resources/function_list/databricks.csv is excluded by !**/*.csv
📒 Files selected for processing (13)
  • ibis-server/app/model/__init__.py (3 hunks)
  • ibis-server/app/model/data_source.py (6 hunks)
  • ibis-server/app/model/metadata/databricks.py (1 hunks)
  • ibis-server/app/model/metadata/factory.py (2 hunks)
  • ibis-server/pyproject.toml (3 hunks)
  • ibis-server/tests/routers/v3/connector/databricks/conftest.py (1 hunks)
  • ibis-server/tests/routers/v3/connector/databricks/test_function.py (1 hunks)
  • ibis-server/tests/routers/v3/connector/databricks/test_metadata.py (1 hunks)
  • ibis-server/tests/routers/v3/connector/databricks/test_query.py (1 hunks)
  • ibis-server/tools/update_databricks_functions.py (1 hunks)
  • wren-core-base/manifest-macro/src/lib.rs (1 hunks)
  • wren-core-base/src/mdl/manifest.rs (1 hunks)
  • wren-core/core/src/logical_plan/utils.rs (1 hunks)
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2025-08-29T05:49:45.513Z
Learnt from: goldmedal
Repo: Canner/wren-engine PR: 1301
File: ibis-server/pyproject.toml:83-83
Timestamp: 2025-08-29T05:49:45.513Z
Learning: Polars dependency in ibis-server/pyproject.toml is correctly placed in dev dependencies because it's only used by helper tools in the tools/data_source/ directory, not in the main production application runtime.

Applied to files:

  • ibis-server/pyproject.toml
🧬 Code graph analysis (8)
wren-core-base/src/mdl/manifest.rs (1)
ibis-server/app/model/data_source.py (1)
  • DataSource (60-214)
ibis-server/app/model/metadata/factory.py (2)
ibis-server/app/model/metadata/databricks.py (1)
  • DatabricksMetadata (35-181)
ibis-server/app/model/data_source.py (1)
  • DataSource (60-214)
ibis-server/tests/routers/v3/connector/databricks/conftest.py (1)
ibis-server/tools/update_databricks_functions.py (1)
  • connect (43-66)
ibis-server/app/model/data_source.py (2)
ibis-server/app/model/__init__.py (2)
  • DatabricksConnectionInfo (367-382)
  • QueryDatabricksDTO (70-71)
ibis-server/tools/update_databricks_functions.py (1)
  • connect (43-66)
ibis-server/app/model/metadata/databricks.py (4)
ibis-server/app/model/__init__.py (1)
  • DatabricksConnectionInfo (367-382)
ibis-server/app/model/data_source.py (3)
  • DataSource (60-214)
  • get_connection (78-82)
  • get_connection (238-253)
ibis-server/app/model/metadata/dto.py (5)
  • Constraint (94-100)
  • ConstraintType (88-91)
  • RustWrenEngineColumnType (13-58)
  • Table (80-85)
  • TableProperties (70-77)
ibis-server/app/model/metadata/metadata.py (1)
  • Metadata (7-21)
ibis-server/tests/routers/v3/connector/databricks/test_metadata.py (2)
ibis-server/tests/conftest.py (1)
  • client (18-23)
ibis-server/tests/routers/v3/connector/databricks/conftest.py (1)
  • connection_info (55-60)
ibis-server/tests/routers/v3/connector/databricks/test_query.py (4)
wren-core-base/manifest-macro/src/lib.rs (1)
  • manifest (26-56)
ibis-server/tests/routers/v3/connector/databricks/test_function.py (1)
  • manifest_str (32-33)
ibis-server/tests/conftest.py (1)
  • client (18-23)
ibis-server/tests/routers/v3/connector/databricks/conftest.py (1)
  • connection_info (55-60)
ibis-server/tests/routers/v3/connector/databricks/test_function.py (3)
ibis-server/app/config.py (1)
  • get_config (98-99)
ibis-server/tests/conftest.py (2)
  • file_path (10-11)
  • client (18-23)
ibis-server/tests/routers/v3/connector/databricks/conftest.py (1)
  • connection_info (55-60)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: ci
🔇 Additional comments (22)
wren-core-base/manifest-macro/src/lib.rs (1)

111-112: LGTM!

The Databricks variant addition follows the established pattern and is positioned consistently with other data source variants.

wren-core-base/src/mdl/manifest.rs (1)

122-122: LGTM!

The Display implementation for Databricks is consistent with other data source variants.

ibis-server/pyproject.toml (3)

23-23: LGTM!

The databricks extra is correctly added to the ibis-framework dependency list.


52-52: LGTM!

The databricks-sql-connector dependency is correctly configured with pyarrow extras for efficient data transfer.


103-103: LGTM!

The pytest marker for databricks tests follows the established pattern.

ibis-server/app/model/metadata/factory.py (2)

6-6: LGTM!

The import statement is correctly placed and follows the alphabetical ordering.


39-39: LGTM!

The metadata factory mapping for Databricks is correctly configured.

ibis-server/app/model/__init__.py (3)

70-71: LGTM!

The QueryDatabricksDTO follows the established pattern for query DTOs.


367-383: LGTM!

The DatabricksConnectionInfo model is well-structured with appropriate use of SecretStr for sensitive fields and clear field documentation.


510-510: LGTM!

The ConnectionInfo union is correctly updated to include DatabricksConnectionInfo.

ibis-server/tests/routers/v3/connector/databricks/conftest.py (3)

1-9: LGTM!

The module setup correctly imports dependencies and configures the pytest marker for all tests.


19-51: LGTM!

The fixture properly sets up test schema and tables with appropriate constraints. The connection is correctly closed in the finally block. Using CREATE OR REPLACE ensures idempotency.


54-60: Verify environment variables are set before running tests.

The fixture retrieves environment variables without validation. Tests will fail with unclear errors if variables are missing.

Consider adding validation:

 @pytest.fixture(scope="module")
 def connection_info() -> dict[str, str]:
+    required_vars = ["DATABRICKS_SERVER_HOSTNAME", "DATABRICKS_HTTP_PATH", "DATABRICKS_TOKEN"]
+    missing = [var for var in required_vars if not os.getenv(var)]
+    if missing:
+        pytest.skip(f"Databricks tests require environment variables: {', '.join(missing)}")
+    
     return {
         "serverHostname": os.getenv("DATABRICKS_SERVER_HOSTNAME"),
         "httpPath": os.getenv("DATABRICKS_HTTP_PATH"),
         "accessToken": os.getenv("DATABRICKS_TOKEN"),
     }
ibis-server/tests/routers/v3/connector/databricks/test_metadata.py (4)

1-1: Verify the API version mismatch.

The test file is located in v3/connector/databricks/ but uses v2_base_url = "/v2/connector/databricks". Confirm whether metadata endpoints are intentionally still on v2 or if this should use v3.


4-30: LGTM!

The test comprehensively validates table metadata including structure, properties, and column details.


32-49: LGTM!

The test properly validates foreign key constraint metadata, confirming the constraint relationship between t2 and t1.


51-57: LGTM!

The test validates version retrieval. Consider adding validation of the version format if there's a predictable pattern.

ibis-server/app/model/data_source.py (5)

23-23: LGTM!

The Databricks imports are correctly placed and alphabetically ordered.

Also applies to: 35-35


76-76: LGTM!

The databricks data source is correctly added to the DataSource enum.


182-183: LGTM!

The connection info building for Databricks follows the established pattern.


233-233: LGTM!

The DataSourceExtension mapping for Databricks is correctly configured.


417-423: Verify timeout configuration for Databricks.

The connection method correctly implements the ibis.databricks.connect call. However, note that other data sources (postgres, clickhouse, trino, bigquery) have timeout configuration in the get_connection_info method (lines 90-135), but Databricks does not. Confirm whether Databricks requires query timeout configuration or if it's handled differently.

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🧹 Nitpick comments (1)
ibis-server/app/model/metadata/databricks.py (1)

41-60: Consider filtering additional system schemas.

The WHERE clause excludes information_schema, but Databricks also has system catalogs (such as system) that typically contain internal metadata tables. Consider whether these should also be filtered out to avoid exposing system internals in the metadata API.

             WHERE
-                c.TABLE_SCHEMA NOT IN ('information_schema')
+                c.TABLE_SCHEMA NOT IN ('information_schema')
+                AND c.TABLE_CATALOG NOT IN ('system')
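The suggested filter above can also be composed programmatically so the exclusion lists stay in one place. This is a hedged sketch of that idea; the excluded names follow the review comment and are assumptions, not the PR's implementation.

```python
# Sketch of composing the schema/catalog exclusion filter suggested above.
# The excluded names mirror the review comment; treat them as assumptions.
EXCLUDED_SCHEMAS = ("information_schema",)
EXCLUDED_CATALOGS = ("system",)


def build_table_list_filter() -> str:
    # Quote each name and join into SQL NOT IN lists.
    schemas = ", ".join(f"'{s}'" for s in EXCLUDED_SCHEMAS)
    catalogs = ", ".join(f"'{c}'" for c in EXCLUDED_CATALOGS)
    return (
        f"c.TABLE_SCHEMA NOT IN ({schemas})\n"
        f"AND c.TABLE_CATALOG NOT IN ({catalogs})"
    )
```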
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 55011b1 and b595c06.

📒 Files selected for processing (1)
  • ibis-server/app/model/metadata/databricks.py (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
ibis-server/app/model/metadata/databricks.py (4)
ibis-server/app/model/__init__.py (1)
  • DatabricksConnectionInfo (367-382)
ibis-server/app/model/data_source.py (3)
  • DataSource (60-214)
  • get_connection (78-82)
  • get_connection (238-253)
ibis-server/app/model/metadata/dto.py (5)
  • Constraint (94-100)
  • ConstraintType (88-91)
  • RustWrenEngineColumnType (13-58)
  • Table (80-85)
  • TableProperties (70-77)
ibis-server/app/model/metadata/metadata.py (1)
  • Metadata (7-21)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
  • GitHub Check: check Cargo.toml formatting
  • GitHub Check: cargo test (amd64)
  • GitHub Check: cargo test (arm64)
  • GitHub Check: clippy
  • GitHub Check: ci
  • GitHub Check: test
🔇 Additional comments (4)
ibis-server/app/model/metadata/databricks.py (4)

35-38: LGTM!

The initialization properly extends the base Metadata class and establishes the Databricks connection using the standardized DataSource pattern.


61-99: LGTM!

The result processing correctly builds Table objects with proper handling for complex types (array, map, struct) and standard type transformation. The logic for grouping columns by table and handling nullability is sound.


148-153: LGTM!

The version retrieval correctly uses Databricks' current_version().dbsql_version function and extracts the scalar result appropriately.


155-184: LGTM!

The helper methods correctly format table/constraint names and transform Databricks data types to the internal enum representation. The special handling for parameterized types (decimal, geography, geometry) and logging for unknown types are appropriate.

@github-actions github-actions bot added the ci label Oct 31, 2025
@goldmedal
Contributor Author

This PR has been tested locally:

poetry run pytest -m 'databricks'
========================================================================================================== test session starts ==========================================================================================================
platform darwin -- Python 3.11.11, pytest-8.4.2, pluggy-1.6.0
rootdir: /Users/jax/git/wren-engine/ibis-server
configfile: pyproject.toml
plugins: anyio-4.10.0
collected 396 items / 382 deselected / 14 selected                                                                                                                                                                                      

tests/routers/v3/connector/databricks/test_function.py ...                                                                                                                                                                    [ 21%]
tests/routers/v3/connector/databricks/test_metadata.py ...                                                                                                                                                                        [ 42%]
tests/routers/v3/connector/databricks/test_query.py ........                                                                                                                                                                      [100%]

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (1)
ibis-server/app/model/metadata/databricks.py (1)

15-32: Add missing interval type mapping.

The DATABRICKS_TYPE_MAPPING is missing the interval entry. A previous review comment on these lines was marked as addressed in commit b595c06, but the code still lacks this mapping. Databricks SQL supports INTERVAL types (year-month and day-time forms), and RustWrenEngineColumnType.INTERVAL exists in the enum. Without this mapping, interval columns will be classified as UNKNOWN by _transform_column_type (line 181).

Add the missing entry:

     "tinyint": RustWrenEngineColumnType.TINYINT,
     "variant": RustWrenEngineColumnType.VARIANT,
     "object": RustWrenEngineColumnType.JSON,
+    "interval": RustWrenEngineColumnType.INTERVAL,
 }
🧹 Nitpick comments (1)
ibis-server/app/model/metadata/databricks.py (1)

166-187: Consider documenting the fallback behavior.

The method logs a warning when encountering unmapped types (lines 184-185) and returns UNKNOWN. This is good defensive coding. Consider adding a docstring or inline comment explaining that array, map, and struct types are handled separately by the caller and should not be passed to this method, clarifying the expected input domain.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between b595c06 and aa29a88.

📒 Files selected for processing (2)
  • .github/workflows/ibis-ci.yml (1 hunks)
  • ibis-server/app/model/metadata/databricks.py (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
ibis-server/app/model/metadata/databricks.py (4)
ibis-server/app/model/__init__.py (1)
  • DatabricksConnectionInfo (367-382)
ibis-server/app/model/data_source.py (3)
  • DataSource (60-214)
  • get_connection (78-82)
  • get_connection (238-253)
ibis-server/app/model/metadata/dto.py (5)
  • Constraint (94-100)
  • ConstraintType (88-91)
  • RustWrenEngineColumnType (13-58)
  • Table (80-85)
  • TableProperties (70-77)
ibis-server/app/model/metadata/metadata.py (1)
  • Metadata (7-21)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
  • GitHub Check: check Cargo.toml formatting
  • GitHub Check: cargo test (amd64)
  • GitHub Check: clippy
  • GitHub Check: cargo test (arm64)
  • GitHub Check: test
  • GitHub Check: ci
🔇 Additional comments (2)
ibis-server/app/model/metadata/databricks.py (2)

85-88: Verify the approach for complex types.

Complex types (array, map, struct) are stored as raw string representations (e.g., "array<int>"), while scalar types are normalized to RustWrenEngineColumnType enum values. This inconsistency means downstream consumers must handle two different type representations.

Confirm whether:

  1. The string representation for complex types is the intended design.
  2. Downstream code (Rust engine, query processing) correctly handles both enum values and type strings.

If both representations are supported by design, consider adding a comment explaining this dual approach for future maintainers.
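The dual representation described above can be sketched as a single branch: complex types keep their raw string form while scalar types are normalized. The mapping values below are illustrative placeholders for RustWrenEngineColumnType members, not the PR's actual table.

```python
# Sketch of the dual type representation discussed above: array/map/struct
# types pass through as raw strings; scalar types are normalized. The
# mapping values are illustrative assumptions.
SCALAR_MAPPING = {"int": "INTEGER", "string": "VARCHAR"}


def resolve_type(data_type: str) -> str:
    lowered = data_type.lower()
    if lowered.startswith(("array", "map", "struct")):
        return data_type  # keep the raw string, e.g. "array<int>"
    return SCALAR_MAPPING.get(lowered, "UNKNOWN")
```

Downstream consumers then need to accept either an enum-like value or a raw type string, which is why the review asks for the design to be documented.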


101-149: LGTM! Constraint query correctly handles catalog and schema boundaries.

The constraint extraction query now includes comprehensive join conditions on constraint_schema, table_catalog, table_schema, table_name (lines 115-118), and constraint_catalog, constraint_schema (lines 121-122). This prevents cross-catalog or cross-schema constraint mismatches, addressing all concerns raised in previous reviews.

@douenergy
Contributor

Thanks @goldmedal

@douenergy douenergy merged commit 774e6fd into Canner:main Nov 7, 2025
10 of 11 checks passed
