[FEATURE]: Migrate spark.table("db.table") to spark.table("catalog.db.table") #1082
Comments
As the list of candidate function calls eligible for migration grows, our currently minimalistic approach (checking just the name and arguments of the function being called) might increase the risk of unwanted migrations.
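To illustrate the risk described above, here is a minimal sketch (the `ReportBuilder` class and its `table` method are hypothetical, invented for this example) of a call that matches the `spark.table("db.table")` pattern purely syntactically but has nothing to do with Spark:

```python
class ReportBuilder:
    """Hypothetical helper unrelated to Spark whose method
    happens to be named `table`."""

    def table(self, name: str) -> str:
        # Renders an HTML table element; the argument is an element
        # id, not a Spark table name.
        return f"<table id={name!r}></table>"


# A check that looks only at the method name and the argument shape
# would flag this call and rewrite its string, which would be an
# unwanted migration.
spark = ReportBuilder()
html = spark.table("db.table")
```

This is why matching on the receiver's inferred type (or at least on import provenance), rather than only on the call's name and arguments, becomes more important as the candidate list grows.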
That's correct.
Created a separate issue to track this:
@jimidle ^
My suggestion is for improved linting, i.e. #1205, not for fixing. It's now merged.
@jimidle @ericvergnaud sys.path manipulation and this feature are orthogonal. We already have the fixer framework, and the example uses SQL queries. An implementation that only looks at string constants is trivial and should not take more than a few hours to implement and test.
Fixed in #1210
Is there an existing issue for this?
Problem statement
Every table in Unity Catalog (UC) must be addressed with a three-part name that includes a catalog (catalog.db.table).
Proposed Solution
Transform AST/CST with the migrated table index using the fixer framework declared in #1067
spark.table(...)
spark.read.table(...)
...write.saveAsTable(...)
Add another Linter/Fixer to https://github.com/databrickslabs/ucx/blob/main/src/databricks/labs/ucx/code/pyspark.py
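A minimal sketch of the string-constant-only approach mentioned in the comments, using Python's `ast` module. The `TABLE_INDEX` mapping and the `fix_source` helper are illustrative assumptions, not the actual ucx fixer-framework API from #1067; a real fixer would build the mapping from the migrated-table index:

```python
import ast

# Hypothetical stand-in for the migrated-table index:
# maps (db, table) to the target UC catalog.
TABLE_INDEX = {("db", "table"): "catalog"}


class TableNameFixer(ast.NodeTransformer):
    """Rewrite two-part table names to three-part UC names in calls
    such as spark.table(...), spark.read.table(...), and
    ...write.saveAsTable(...), but only when the first argument is a
    plain string constant (dynamic names are left untouched)."""

    def visit_Call(self, node: ast.Call) -> ast.Call:
        self.generic_visit(node)
        if (
            isinstance(node.func, ast.Attribute)
            and node.func.attr in {"table", "saveAsTable"}
            and node.args
            and isinstance(node.args[0], ast.Constant)
            and isinstance(node.args[0].value, str)
        ):
            parts = node.args[0].value.split(".")
            if len(parts) == 2:  # db.table -> catalog.db.table
                catalog = TABLE_INDEX.get((parts[0], parts[1]))
                if catalog:
                    node.args[0].value = f"{catalog}.{parts[0]}.{parts[1]}"
        return node


def fix_source(source: str) -> str:
    """Parse, transform, and unparse a code snippet (Python 3.9+)."""
    tree = TableNameFixer().visit(ast.parse(source))
    return ast.unparse(tree)
```

For example, `fix_source('spark.table("db.table")')` yields `spark.table('catalog.db.table')`, while three-part names and non-constant arguments pass through unchanged. Note this deliberately ignores the receiver, so it inherits the false-positive risk discussed in the comments above.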
Additional Context
No response