chore: Leverage api extensions for partitioning #823
Conversation
Walkthrough

This change refactors how partition columns and partition specifications are referenced throughout several Spark modules. Direct property accesses to partition fields are replaced with lookups through `PartitionSpec`, e.g. `partitionSpec.column`.
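As a rough, self-contained sketch of that pattern (the types and values below are assumptions for illustration, not the PR's actual code), reading partition details through a single spec looks like this:

```scala
// Minimal sketch (assumed names, not the PR's code): partition details are
// bundled in a single PartitionSpec value, so callers read spec.column
// instead of a standalone partition-column property.
object PartitionSpecAccessSketch {
  case class PartitionSpec(column: String, format: String, intervalDays: Int)

  // Stand-in for TableUtils: exposes one spec rather than separate fields.
  val tablePartitionSpec: PartitionSpec = PartitionSpec("ds", "yyyy-MM-dd", 1)

  def main(args: Array[String]): Unit = {
    // Before (hypothetical): a dedicated partitionColumn property.
    // After: everything goes through the spec.
    val column = tablePartitionSpec.column
    val format = tablePartitionSpec.format
    println(s"partition column = $column, format = $format")
  }
}
```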
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Query
    participant TableUtils
    participant SparkComponent
    SparkComponent->>Query: partitionSpec(tableUtils.partitionSpec)
    Query->>TableUtils: Use tableUtils.partitionSpec as fallback
    Query-->>SparkComponent: Return PartitionSpec (with .column)
    SparkComponent->>SparkComponent: Use PartitionSpec.column for partition operations
```
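A minimal Scala sketch of that fallback flow, assuming simplified stand-in types rather than the real Chronon `Query`/`TableUtils` API:

```scala
// Hypothetical sketch of the diagram's flow: a query-level override wins;
// otherwise the table-level spec is used. All names are illustrative.
object PartitionSpecFallbackSketch {
  case class PartitionSpec(column: String, format: String, intervalDays: Int)
  case class Query(partitionColumn: Option[String], partitionFormat: Option[String])

  def resolveSpec(query: Query, tableSpec: PartitionSpec): PartitionSpec =
    PartitionSpec(
      column = query.partitionColumn.getOrElse(tableSpec.column),
      format = query.partitionFormat.getOrElse(tableSpec.format),
      intervalDays = tableSpec.intervalDays
    )

  def main(args: Array[String]): Unit = {
    // A query that only overrides the column keeps the table's format.
    val resolved = resolveSpec(
      Query(partitionColumn = Some("dt"), partitionFormat = None),
      PartitionSpec("ds", "yyyy-MM-dd", 1)
    )
    println(resolved) // PartitionSpec(dt,yyyy-MM-dd,1)
  }
}
```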
Force-pushed from dca80b4 to 98dacff (Compare)
Force-pushed from 402b27b to d060704 (Compare)
Actionable comments posted: 0
🧹 Nitpick comments (1)
api/src/main/scala/ai/chronon/api/Extensions.scala (1)
1191-1193: Simplify lambda syntax. Remove unnecessary parentheses around lambda parameters.

```diff
- val column = Option(query).flatMap((q) => Option(q.partitionColumn)).getOrElse(defaultSpec.column)
- val format = Option(query).flatMap((q) => Option(q.partitionFormat)).getOrElse(defaultSpec.format)
- val interval = Option(query).flatMap((q) => Option(q.partitionInterval)).getOrElse(WindowUtils.Day)
+ val column = Option(query).flatMap(q => Option(q.partitionColumn)).getOrElse(defaultSpec.column)
+ val format = Option(query).flatMap(q => Option(q.partitionFormat)).getOrElse(defaultSpec.format)
+ val interval = Option(query).flatMap(q => Option(q.partitionInterval)).getOrElse(WindowUtils.Day)
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
api/src/main/scala/ai/chronon/api/Extensions.scala (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (1)
api/src/main/scala/ai/chronon/api/Extensions.scala (1)
api/src/main/java/ai/chronon/api/thrift/Option.java (1)
Option (25-143)
⏰ Context from checks skipped due to timeout of 90000ms (17)
- GitHub Check: spark_tests
- GitHub Check: batch_tests
- GitHub Check: streaming_tests
- GitHub Check: analyzer_tests
- GitHub Check: join_tests
- GitHub Check: groupby_tests
- GitHub Check: fetcher_tests
- GitHub Check: cloud_aws_tests
- GitHub Check: service_commons_tests
- GitHub Check: cloud_gcp_tests
- GitHub Check: service_tests
- GitHub Check: api_tests
- GitHub Check: online_tests
- GitHub Check: aggregator_tests
- GitHub Check: flink_tests
- GitHub Check: scala_compile_fmt_fix
- GitHub Check: enforce_triggered_workflows
🔇 Additional comments (1)
api/src/main/scala/ai/chronon/api/Extensions.scala (1)
1191-1194: Good null safety enhancement. The flatMap pattern correctly prevents NPEs when `query` is null.
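For illustration, a minimal sketch of why the `Option(...).flatMap` chain is NPE-safe (the `Query` type below is a simplified stand-in, not the real Thrift class):

```scala
// Option(query) is None when query is null, so flatMap never dereferences it
// and the default column wins.
object PartitionNullSafetySketch {
  class Query(val partitionColumn: String) // the field itself may also be null

  def resolveColumn(query: Query, defaultColumn: String): String =
    Option(query)                              // None if query is null
      .flatMap(q => Option(q.partitionColumn)) // None if the field is null
      .getOrElse(defaultColumn)                // fall back to the default spec

  def main(args: Array[String]): Unit = {
    println(resolveColumn(null, "ds"))            // ds  (no NPE on a null query)
    println(resolveColumn(new Query(null), "ds")) // ds  (null field handled)
    println(resolveColumn(new Query("dt"), "ds")) // dt
  }
}
```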
Force-pushed from 53ca504 to d060704 (Compare)