fix: reinstate harlequin-databricks as dependency & update mypy/ruff target python version to 3.9 #724
Closes #XYZ (an existing open issue)
What are the key elements of this solution?
#714 removed harlequin-databricks as a dependency, probably due to a conflict caused by the databricks-sql-connector package placing a low upper bound on the pyarrow version.
The latest releases of databricks-sql-connector and harlequin-databricks remove upper-bound specifications on dependencies as far as possible, which avoids these sorts of dependency conflicts.
This PR reinstates harlequin-databricks as a dependency of this project in pyproject.toml and re-locks with the latest Poetry release (see the sketch below).
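A rough sketch of the shape of the pyproject.toml change, assuming the adapter is wired up as an optional extra the way the other adapters are; the version constraint shown is illustrative, not the exact value in this PR:

```toml
# Sketch only: reinstating the adapter as an optional Poetry dependency.
# The version constraint below is a placeholder, not the one in this PR.
[tool.poetry.dependencies]
harlequin-databricks = { version = ">=0.3.0", optional = true }

[tool.poetry.extras]
databricks = ["harlequin-databricks"]
```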
Also bumps the target Python version for ruff and mypy to 3.9, in line with #714.
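For reference, the target-version bump amounts to config changes roughly like the following, assuming both tools are configured in pyproject.toml:

```toml
# Sketch of the target Python version bumps; section placement assumes
# ruff and mypy are both configured in pyproject.toml.
[tool.ruff]
target-version = "py39"

[tool.mypy]
python_version = "3.9"
```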
Why did you design your solution this way? Did you assess any alternatives? Are there tradeoffs?
Does this PR require a change to Harlequin's docs?
Did you add or update tests for this change?
Please complete the following checklist:
CHANGELOG.md, under the [Unreleased] section heading. That entry references the issue closed by this PR.