21 commits
f5428bc
feat(oracle): Implement views-only architecture with quick wins
jsatlien Oct 8, 2025
69f277d
Phase 4.3: Auto-detect view relationships via base table FK constraints
jsatlien Oct 9, 2025
7851d5f
fix: Oracle type mapper - remove unsupported precision/scale parameters
jsatlien Nov 17, 2025
f7f7ed1
feat(oracle): Add Oracle ADB integration with quoted identifiers and …
jsatlien Nov 18, 2025
6104a1b
feat(SCAIS-23): P1.001 - Create Oracle 19c dialect file structure
jsatlien Nov 21, 2025
ea9e372
feat(SCAIS-23): P1.002 - Register custom Oracle dialect
jsatlien Nov 21, 2025
11a01f6
feat(SCAIS-23): P1.004 - Add logging for dialect usage
jsatlien Nov 21, 2025
2e693f9
feat(oracle): add TYPE_MAPPING override for Oracle 19c boolean compat…
jsatlien Nov 24, 2025
5ae340d
feat(oracle): implement date add transform for Oracle 19c DAY unit
jsatlien Nov 24, 2025
23c2736
feat(oracle): implement date subtract transform for Oracle 19c DAY unit
jsatlien Nov 24, 2025
3b12b8b
feat(oracle): extend date arithmetic transforms for MONTH unit
jsatlien Nov 24, 2025
9473142
feat(oracle): extend date arithmetic transforms for YEAR unit
jsatlien Nov 24, 2025
6dc9f87
feat(oracle): register date arithmetic transforms in TRANSFORMS dicti…
jsatlien Nov 24, 2025
d947dd3
test(oracle): add pagination syntax validation tests
jsatlien Nov 24, 2025
cb8e8bb
fix(oracle): Extract raw Literal values in date arithmetic transforms
jsatlien Nov 24, 2025
6ff4401
test(oracle): Add comprehensive unit tests for Oracle 19c date arithm…
jsatlien Nov 25, 2025
511b15a
test(oracle): Improve test infrastructure and add type mapping tests
jsatlien Nov 25, 2025
6884dd7
style: Format test files (whitespace cleanup)
jsatlien Nov 25, 2025
b419f18
test(oracle): Add integration tests for Oracle 19c dialect
jsatlien Nov 25, 2025
398dacd
docs(oracle): Add comprehensive Oracle 19c dialect documentation
jsatlien Nov 25, 2025
bb07ffe
Fix Oracle 19c TIMESTAMPTZ compatibility
jsatlien Dec 19, 2025
22 changes: 16 additions & 6 deletions ibis-server/Dockerfile
@@ -1,4 +1,4 @@
FROM python:3.11-buster AS builder
FROM python:3.11-bookworm AS builder

ARG ENV
ENV ENV=$ENV
@@ -37,13 +37,13 @@ COPY . .
RUN just install --without dev


FROM python:3.11-slim-buster AS runtime
FROM python:3.11-slim-bookworm AS runtime

# Add microsoft package list
RUN apt-get update \
&& apt-get install -y curl gnupg \
&& curl https://packages.microsoft.com/keys/microsoft.asc | tee /etc/apt/trusted.gpg.d/microsoft.asc \
&& curl https://packages.microsoft.com/config/debian/11/prod.list | tee /etc/apt/sources.list.d/mssql-release.list \
&& curl -sSL https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor -o /usr/share/keyrings/microsoft.gpg \
&& echo "deb [arch=amd64,arm64,armhf signed-by=/usr/share/keyrings/microsoft.gpg] https://packages.microsoft.com/debian/12/prod bookworm main" | tee /etc/apt/sources.list.d/mssql-release.list \
&& apt-get update

# Install msodbcsql 18 driver for mssql
@@ -53,12 +53,22 @@ RUN ACCEPT_EULA=Y apt-get -y install unixodbc-dev msodbcsql18
RUN apt-get install -y default-libmysqlclient-dev

# libpq-dev is required for psycopg2
RUN apt-get -y install libpq-dev \
RUN apt-get -y install libpq-dev

# Install Oracle Instant Client for Oracle database connections
RUN apt-get install -y wget unzip libaio1 \
&& wget https://download.oracle.com/otn_software/linux/instantclient/1923000/instantclient-basic-linux.x64-19.23.0.0.0dbru.zip \
&& unzip instantclient-basic-linux.x64-19.23.0.0.0dbru.zip -d /opt/ \
&& rm instantclient-basic-linux.x64-19.23.0.0.0dbru.zip \
&& echo "/opt/instantclient_19_23" > /etc/ld.so.conf.d/oracle-instantclient.conf \
&& ldconfig \
&& rm -rf /var/lib/apt/lists/*
Comment on lines +58 to 65
⚠️ Potential issue | 🔴 Critical

Version mismatch: Oracle Instant Client 19.23.0.0.0 does not exist; should be 19.3.0.0.0 or consider upgrading to 21+ or 26ai.

The Dockerfile references version 19.23.0.0.0, which conflates Oracle Database 19c Release Update 19.23 with the Instant Client version. The correct Instant Client version in the 19c family is 19.3.0.0.0. Additionally, using a pinned version makes the Docker build fragile if Oracle removes access to older releases. Consider either:

  1. Upgrading to a newer Instant Client version (21 or 26ai are recommended)
  2. Correcting to 19.3.0.0.0 and making it configurable via build argument
  3. Migrating to RPM-based installation from yum.oracle.com for easier version management
🤖 Prompt for AI Agents
In ibis-server/Dockerfile around lines 58-65 the Dockerfile downloads a
non-existent Instant Client "19.23.0.0.0"; update the Dockerfile to use a valid
Instant Client version and make it configurable: add a build ARG like
INSTANT_CLIENT_VERSION (default to a supported release such as 19.3.0.0.0 or a
newer recommended release like 21 or 26ai), replace the hard-coded wget URL/zip
name with one constructed from that ARG, update the /opt/instantclient_<version>
path and filename usages accordingly, and optionally document or switch to the
yum.oracle.com RPM install flow if you prefer RPM-based management for easier
version upgrades and stability.
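As a rough illustration of the reviewer's second option, the hard-coded version can be lifted into build arguments. This is a sketch only: the defaults below keep the exact values already in this PR (19.23.0.0.0, URL directory 1923000, install path /opt/instantclient_19_23), and the URL layout is assumed to follow the pattern of the existing download line; substituting 19.3.0.0.0 or a newer release would require the matching URL directory and unpack path for that release.

```dockerfile
# Sketch: parameterize the Instant Client version via build arguments.
# Defaults reproduce this PR's values; override with --build-arg to change.
ARG IC_VERSION=19.23.0.0.0
ARG IC_URLDIR=1923000
ARG IC_HOME=/opt/instantclient_19_23
RUN apt-get install -y wget unzip libaio1 \
    && wget "https://download.oracle.com/otn_software/linux/instantclient/${IC_URLDIR}/instantclient-basic-linux.x64-${IC_VERSION}dbru.zip" \
    && unzip "instantclient-basic-linux.x64-${IC_VERSION}dbru.zip" -d /opt/ \
    && rm "instantclient-basic-linux.x64-${IC_VERSION}dbru.zip" \
    && echo "${IC_HOME}" > /etc/ld.so.conf.d/oracle-instantclient.conf \
    && ldconfig
```

A build would then pin a different release with, e.g., `docker build --build-arg IC_VERSION=... --build-arg IC_URLDIR=...`, without editing the Dockerfile.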


ENV VIRTUAL_ENV=/app/.venv \
PATH="/app/.venv/bin:$PATH" \
REMOTE_FUNCTION_LIST_PATH=/resources/function_list
REMOTE_FUNCTION_LIST_PATH=/resources/function_list \
LD_LIBRARY_PATH="/opt/instantclient_19_23:$LD_LIBRARY_PATH" \
ORACLE_HOME="/opt/instantclient_19_23"

COPY --from=builder ${VIRTUAL_ENV} ${VIRTUAL_ENV}
COPY app app
16 changes: 16 additions & 0 deletions ibis-server/README.md
@@ -81,6 +81,22 @@ OpenTelemetry zero-code instrumentation is highly configurable. You can set the

[Metrics we are tracing right now](./Metrics.md)

## Oracle 19c Compatibility

WrenAI includes a custom Oracle 19c dialect override for generating compatible SQL syntax. See [Oracle 19c Dialect Documentation](docs/oracle_19c_dialect.md) for details.

### Testing Oracle 19c Dialect

Run the Oracle 19c test suite (34 tests):
```bash
# Using alias (if configured)
wren-test-oracle
Comment on lines +92 to +93
⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

# Search for the wren-test-oracle alias and related configuration
rg 'wren-test-oracle' --type-list
# Also search in common config files
fd -type f \( -name 'Makefile' -o -name 'justfile' -o -name '*.sh' -o -name '.*rc' \) | head -20

Repository: Canner/wren-engine

Length of output: 622


🏁 Script executed:

# Check if ibis-server/README.md exists and read the relevant section
if [ -f "ibis-server/README.md" ]; then
  echo "=== Lines 85-100 of ibis-server/README.md ==="
  sed -n '85,100p' ibis-server/README.md
  echo ""
  echo "=== Searching for wren-test-oracle in entire README ==="
  grep -n 'wren-test-oracle' ibis-server/README.md || echo "Not found in README"
fi

Repository: Canner/wren-engine

Length of output: 643


🏁 Script executed:

# Search for wren-test-oracle definition
rg 'wren-test-oracle' -A 3 -B 3

# Also check common configuration files
echo "=== Checking for Makefile ===" 
find . -name 'Makefile' -o -name 'justfile' | head -10

echo ""
echo "=== Checking for shell scripts with alias definitions ===" 
find . -type f -name '*.sh' | xargs grep -l 'alias' 2>/dev/null | head -10

echo ""
echo "=== Checking for .bashrc or .zshrc patterns ===" 
find . -type f \( -name '.bashrc' -o -name '.zshrc' -o -name '.aliases' \) 2>/dev/null

Repository: Canner/wren-engine

Length of output: 2008


🏁 Script executed:

# Check the justfile for alias or test commands
echo "=== ibis-server/justfile ==="
cat ibis-server/justfile

echo ""
echo "=== oracle_19c_dialect.md - Check configuration section ==="
sed -n '1,100p' ibis-server/docs/oracle_19c_dialect.md | tail -60

Repository: Canner/wren-engine

Length of output: 3308


Add setup instructions or link to oracle_19c_dialect.md documentation.

The wren-test-oracle alias requires user-level configuration in ~/.bashrc and is documented in ibis-server/docs/oracle_19c_dialect.md. The README should either:

  1. Link to docs/oracle_19c_dialect.md for setup instructions, or
  2. Document the alternative: cd ibis-server && just test oracle19c (no alias required)
🤖 Prompt for AI Agents
In ibis-server/README.md around lines 92-93, the reference to using the alias
"wren-test-oracle" lacks setup instructions; update the README to either add a
brief instruction or link to docs/oracle_19c_dialect.md explaining how to
configure the alias in ~/.bashrc, or provide the alternative command (cd
ibis-server && just test oracle19c) that avoids requiring the alias; ensure the
chosen option clearly states where to find more details and any required
environment variables or steps.


# Full command
cd ibis-server
PYTHONPATH=. poetry run pytest tests/custom_sqlglot/ -m oracle19c -v --confcutdir=tests/custom_sqlglot
```

## Contributing
Please see [CONTRIBUTING.md](docs/CONTRIBUTING.md) for more information.

1 change: 1 addition & 0 deletions ibis-server/app/custom_sqlglot/dialects/__init__.py
@@ -1,3 +1,4 @@
# ruff: noqa: F401

from app.custom_sqlglot.dialects.mysql import MySQL
from app.custom_sqlglot.dialects.oracle import Oracle
93 changes: 93 additions & 0 deletions ibis-server/app/custom_sqlglot/dialects/oracle.py
@@ -0,0 +1,93 @@
from loguru import logger
from sqlglot import exp
from sqlglot.dialects.oracle import Oracle as OriginalOracle


class Oracle(OriginalOracle):
"""
Custom Oracle dialect for Oracle 19c compatibility.

Overrides SQLGlot's default Oracle dialect to fix specific Oracle 19c issues:
- TIMESTAMPTZ → TIMESTAMP type mapping (avoids timezone format issues)
- CAST timestamp literals with explicit TO_TIMESTAMP format (fixes ORA-01843)
- BOOLEAN → CHAR(1) type mapping (user's boolean representation pattern)

Note: INTERVAL syntax is fully supported in Oracle 19c and does not need transformation.

Based on SQLGlot version >=23.4,<26.5
"""

class Generator(OriginalOracle.Generator):
"""Custom generator for Oracle 19c SQL syntax."""

TYPE_MAPPING = {
**OriginalOracle.Generator.TYPE_MAPPING,
# Oracle 19c doesn't have native BOOLEAN type (21c+ feature)
# Map to CHAR(1) to match our 'Y'/'N' boolean representation pattern
exp.DataType.Type.BOOLEAN: "CHAR(1)",
# Map TIMESTAMPTZ to TIMESTAMP (without timezone) for Oracle 19c
# Avoids format conversion issues with TIMESTAMP WITH TIME ZONE
exp.DataType.Type.TIMESTAMPTZ: "TIMESTAMP",
}

TRANSFORMS = {
**OriginalOracle.Generator.TRANSFORMS,
# Handle CAST to TIMESTAMP with explicit format for string literals (ORA-01843 fix)
exp.Cast: lambda self, e: self._handle_cast_oracle19c(e),
}

def __init__(self, *args, **kwargs):
"""Initialize Oracle 19c generator with logging."""
super().__init__(*args, **kwargs)
logger.debug("Using custom Oracle 19c dialect for SQL generation")

def _handle_cast_oracle19c(self, expression: exp.Cast) -> str:
"""
Handle CAST expressions for Oracle 19c timestamp compatibility.

Oracle 19c cannot implicitly convert string literals like '2025-11-24 00:00:00'
when casting to TIMESTAMP. This transform converts:

CAST('2025-11-24 00:00:00' AS TIMESTAMP)
→ TO_TIMESTAMP('2025-11-24 00:00:00', 'YYYY-MM-DD HH24:MI:SS')

Only applies when:
- Source is a string literal
- Target type is TIMESTAMP or DATE
- Literal matches YYYY-MM-DD pattern

Args:
expression: Cast expression node

Returns:
Oracle 19c-compatible SQL string
"""
source = expression.this
target_type = expression.to

# Check if we're casting to TIMESTAMP or DATE
if target_type and target_type.this in (exp.DataType.Type.TIMESTAMP, exp.DataType.Type.DATE):
# Check if source is a string literal
if isinstance(source, exp.Literal) and source.is_string:
literal_value = source.this

# Check if it matches YYYY-MM-DD pattern (with or without time)
# Pattern: YYYY-MM-DD or YYYY-MM-DD HH:MI:SS
if literal_value and len(literal_value) >= 10 and literal_value[4] == '-' and literal_value[7] == '-':
# Determine format based on length
if len(literal_value) == 10:
# Just date: YYYY-MM-DD
format_mask = 'YYYY-MM-DD'
elif len(literal_value) == 19:
# Date with time: YYYY-MM-DD HH:MI:SS
format_mask = 'YYYY-MM-DD HH24:MI:SS'
else:
# Other length, use default CAST
return self.cast_sql(expression)

# Use TO_TIMESTAMP or TO_DATE with explicit format
func_name = "TO_TIMESTAMP" if target_type.this == exp.DataType.Type.TIMESTAMP else "TO_DATE"
return f"{func_name}('{literal_value}', '{format_mask}')"

# For all other cases, use default CAST behavior
return self.cast_sql(expression)
19 changes: 12 additions & 7 deletions ibis-server/app/model/__init__.py
@@ -203,19 +203,24 @@ class PostgresConnectionInfo(BaseConnectionInfo):


class OracleConnectionInfo(BaseConnectionInfo):
host: SecretStr = Field(
examples=["localhost"], description="the hostname of your database"
host: SecretStr | None = Field(
examples=["localhost"], description="the hostname of your database", default=None
)
port: SecretStr = Field(examples=[1521], description="the port of your database")
database: SecretStr = Field(
examples=["orcl"], description="the database name of your database"
port: SecretStr | None = Field(examples=[1521], description="the port of your database", default=None)
database: SecretStr | None = Field(
examples=["orcl"], description="the database name of your database", default=None
)
user: SecretStr = Field(
examples=["admin"], description="the username of your database"
)
password: SecretStr | None = Field(
examples=["password"], description="the password of your database", default=None
)
dsn: SecretStr | None = Field(
examples=["(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=host)(PORT=port))(CONNECT_DATA=(SERVICE_NAME=service)))"],
description="Oracle Data Source Name (DSN) - Alternative to host/port/database configuration",
default=None
)
Comment on lines +206 to +223
⚠️ Potential issue | 🟠 Major

Missing validation: either DSN or host/port/database must be provided.

The current model allows all connection parameters (dsn, host, port, database) to be None simultaneously, which would cause AttributeError when accessing .get_secret_value() on None in data_source.py.

Consider adding a Pydantic model_validator to ensure at least one valid connection method is configured:

🔎 Proposed fix using Pydantic model validator
+from pydantic import model_validator

 class OracleConnectionInfo(BaseConnectionInfo):
     host: SecretStr | None = Field(
         examples=["localhost"], description="the hostname of your database", default=None
     )
     port: SecretStr | None = Field(examples=[1521], description="the port of your database", default=None)
     database: SecretStr | None = Field(
         examples=["orcl"], description="the database name of your database", default=None
     )
     user: SecretStr = Field(
         examples=["admin"], description="the username of your database"
     )
     password: SecretStr | None = Field(
         examples=["password"], description="the password of your database", default=None
     )
     dsn: SecretStr | None = Field(
         examples=["(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=host)(PORT=port))(CONNECT_DATA=(SERVICE_NAME=service)))"],
         description="Oracle Data Source Name (DSN) - Alternative to host/port/database configuration",
         default=None
     )
+
+    @model_validator(mode='after')
+    def validate_connection_params(self):
+        if not self.dsn and not (self.host and self.port and self.database):
+            raise ValueError("Either 'dsn' or all of 'host', 'port', and 'database' must be provided")
+        return self

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
In ibis-server/app/model/__init__.py around lines 206 to 223, the model
currently allows dsn, host, port and database to all be None which later causes
AttributeError when calling .get_secret_value(); add a Pydantic model_validator
(root/model-level validator) that checks: if dsn is not provided (None or empty)
then host, port and database must all be provided and non-null (and optionally
non-empty); if the check fails raise a ValueError with a clear message like
"Either dsn or host, port and database must be provided". Ensure the validator
runs after field parsing and returns the validated values unchanged when valid.
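The either/or rule the proposed `model_validator` enforces can be sketched framework-free. The dataclass below is a hypothetical stand-in (plain strings instead of `SecretStr`, no Pydantic dependency) that applies the same check at construction time:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OracleConnParams:
    """Illustrative stand-in for OracleConnectionInfo's validation rule."""
    user: str
    dsn: Optional[str] = None
    host: Optional[str] = None
    port: Optional[str] = None
    database: Optional[str] = None

    def __post_init__(self) -> None:
        # Same condition as the proposed validator: a DSN alone is enough,
        # otherwise host, port, and database must all be present.
        if not self.dsn and not (self.host and self.port and self.database):
            raise ValueError(
                "Either 'dsn' or all of 'host', 'port', and 'database' must be provided"
            )

OracleConnParams(user="admin", dsn="localhost:1521/orcl")   # accepted
try:
    OracleConnParams(user="admin", host="localhost")        # incomplete → rejected
except ValueError as exc:
    print(exc)
```

Doing this at model construction (rather than in `data_source.py`) surfaces the error at request-validation time with a clear message, before any connection attempt.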



class SnowflakeConnectionInfo(BaseConnectionInfo):
@@ -261,10 +266,10 @@ class TrinoConnectionInfo(BaseConnectionInfo):

class LocalFileConnectionInfo(BaseConnectionInfo):
url: SecretStr = Field(
description="the root path of the local file", default="/", examples=["/data"]
description="the root path of the local file", examples=["/data"]
)
format: str = Field(
description="File format", default="csv", examples=["csv", "parquet", "json"]
description="File format", examples=["csv", "parquet", "json"]
)


13 changes: 13 additions & 0 deletions ibis-server/app/model/connector.py
@@ -1,5 +1,6 @@
import base64
import importlib
import logging
from functools import cache
from json import loads
from typing import Any
@@ -17,6 +18,8 @@
from ibis.backends.sql.compilers.postgres import compiler as postgres_compiler
from opentelemetry import trace

logger = logging.getLogger(__name__)

from app.model import (
ConnectionInfo,
GcsFileConnectionInfo,
@@ -31,6 +34,10 @@
# Override datatypes of ibis
importlib.import_module("app.custom_ibis.backends.sql.datatypes")

# Apply Oracle backend patch to fix ORA-00923 error
from app.model.oracle_backend_patch import patch_oracle_backend
patch_oracle_backend()

tracer = trace.get_tracer(__name__)


@@ -70,10 +77,16 @@ def __init__(self, data_source: DataSource, connection_info: ConnectionInfo):

@tracer.start_as_current_span("connector_query", kind=trace.SpanKind.CLIENT)
def query(self, sql: str, limit: int) -> pd.DataFrame:
import sys
print(f"🔍 CONNECTOR QUERY: {sql}", file=sys.stderr, flush=True)
print(f"🔍 LIMIT: {limit}", file=sys.stderr, flush=True)
logger.info(f"🔍 CONNECTOR QUERY: {sql}")
logger.info(f"🔍 LIMIT: {limit}")
Comment on lines +80 to +84
⚠️ Potential issue | 🟠 Major

Remove temporary debug logging before merging.

These debug print statements appear to be temporary debugging code that should be removed before production:

  • Redundant logging (both print to stderr AND logger.info)
  • Uses emoji (🔍) which may not render correctly in all environments
  • Verbose logging for every query could impact performance and clutter logs
Suggested cleanup
     @tracer.start_as_current_span("connector_query", kind=trace.SpanKind.CLIENT)
     def query(self, sql: str, limit: int) -> pd.DataFrame:
-        import sys
-        print(f"🔍 CONNECTOR QUERY: {sql}", file=sys.stderr, flush=True)
-        print(f"🔍 LIMIT: {limit}", file=sys.stderr, flush=True)
-        logger.info(f"🔍 CONNECTOR QUERY: {sql}")
-        logger.info(f"🔍 LIMIT: {limit}")
+        logger.debug(f"Executing query with limit {limit}")
         return self.connection.sql(sql).limit(limit).to_pandas()

If query logging is needed for debugging, consider:

  1. Using logger.debug() instead of logger.info() to avoid cluttering production logs
  2. Removing the emoji
  3. Removing redundant stderr prints
  4. Only logging the SQL at debug level if truly needed
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
import sys
print(f"🔍 CONNECTOR QUERY: {sql}", file=sys.stderr, flush=True)
print(f"🔍 LIMIT: {limit}", file=sys.stderr, flush=True)
logger.info(f"🔍 CONNECTOR QUERY: {sql}")
logger.info(f"🔍 LIMIT: {limit}")
logger.debug(f"Executing query with limit {limit}")
🤖 Prompt for AI Agents
In ibis-server/app/model/connector.py around lines 80 to 84, remove the
temporary debug prints to stderr and the emoji-laden info logs; replace them
with an optional debug-level logger call without emojis and without duplicate
printing. Specifically, delete the two print(..., file=sys.stderr) lines, change
logger.info to logger.debug, remove the emoji from the message, and only log the
SQL and limit at debug level (or remove entirely if not needed) so production
logs aren’t cluttered and performance isn’t impacted.

return self.connection.sql(sql).limit(limit).to_pandas()

@tracer.start_as_current_span("connector_dry_run", kind=trace.SpanKind.CLIENT)
def dry_run(self, sql: str) -> None:
logger.info(f"🔍 CONNECTOR DRY_RUN: {sql}")
self.connection.sql(sql)


25 changes: 18 additions & 7 deletions ibis-server/app/model/data_source.py
@@ -181,13 +181,24 @@ def get_postgres_connection(info: PostgresConnectionInfo) -> BaseBackend:

@staticmethod
def get_oracle_connection(info: OracleConnectionInfo) -> BaseBackend:
return ibis.oracle.connect(
host=info.host.get_secret_value(),
port=int(info.port.get_secret_value()),
database=info.database.get_secret_value(),
user=info.user.get_secret_value(),
password=(info.password and info.password.get_secret_value()),
)
# Build connection parameters
connect_params = {
"user": info.user.get_secret_value(),
"password": (info.password and info.password.get_secret_value()),
}

# If DSN is provided, use it directly
if info.dsn:
connect_params["dsn"] = info.dsn.get_secret_value()
else:
# Otherwise use individual host/port/database parameters
connect_params.update({
"host": info.host.get_secret_value(),
"port": int(info.port.get_secret_value()),
"database": info.database.get_secret_value(),
})
Comment on lines +193 to +199

⚠️ Potential issue | 🔴 Critical

Potential AttributeError when host, port, or database are None.

When dsn is not provided, the code assumes host, port, and database are valid SecretStr objects. However, since these fields are now optional (SecretStr | None) in OracleConnectionInfo, calling .get_secret_value() on None will raise an AttributeError.

🔎 Proposed fix with explicit validation
         # If DSN is provided, use it directly
         if info.dsn:
             connect_params["dsn"] = info.dsn.get_secret_value()
         else:
             # Otherwise use individual host/port/database parameters
+            if not info.host or not info.port or not info.database:
+                raise ValueError(
+                    "Either 'dsn' or all of 'host', 'port', and 'database' must be provided"
+                )
             connect_params.update({
                 "host": info.host.get_secret_value(),
                 "port": int(info.port.get_secret_value()),
                 "database": info.database.get_secret_value(),
             })
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
else:
# Otherwise use individual host/port/database parameters
connect_params.update({
"host": info.host.get_secret_value(),
"port": int(info.port.get_secret_value()),
"database": info.database.get_secret_value(),
})
else:
# Otherwise use individual host/port/database parameters
if not info.host or not info.port or not info.database:
raise ValueError(
"Either 'dsn' or all of 'host', 'port', and 'database' must be provided"
)
connect_params.update({
"host": info.host.get_secret_value(),
"port": int(info.port.get_secret_value()),
"database": info.database.get_secret_value(),
})
🤖 Prompt for AI Agents
In ibis-server/app/model/data_source.py around lines 193 to 199, the code calls
.get_secret_value() on info.host/info.port/info.database but those fields are
now Optional (SecretStr | None) and can be None; update the branch that builds
connect_params to first validate that host, port, and database are not None,
raising a clear ValueError (or custom exception) if any are missing, then call
.get_secret_value() and convert port to int; ensure the error message names the
missing field(s) so callers know which parameter is absent.
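The branching in `get_oracle_connection`, with the reviewer's validation folded in, can be expressed as a small pure function. This is a sketch under assumptions: `build_connect_params` is a hypothetical name, and it takes already-unwrapped plain strings rather than `SecretStr` values:

```python
def build_connect_params(user, password=None, dsn=None,
                         host=None, port=None, database=None):
    """Build Oracle connection kwargs: DSN takes priority over host/port/database."""
    params = {"user": user, "password": password}
    if dsn:
        params["dsn"] = dsn
    else:
        # Validate before dereferencing, so a missing field raises a clear
        # ValueError instead of an AttributeError deeper in the stack.
        if not (host and port and database):
            raise ValueError(
                "Either 'dsn' or all of 'host', 'port', and 'database' must be provided"
            )
        params.update({"host": host, "port": int(port), "database": database})
    return params

build_connect_params(user="admin", dsn="mydsn")
# → {'user': 'admin', 'password': None, 'dsn': 'mydsn'}
```

Keeping the branch as a pure function also makes it trivially unit-testable without a live Oracle instance.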


return ibis.oracle.connect(**connect_params)

@staticmethod
def get_snowflake_connection(info: SnowflakeConnectionInfo) -> BaseBackend: