
Update metadata export logic for join derivation #879

Open · yuli-han wants to merge 6 commits into main

Conversation

@yuli-han (Collaborator) commented Nov 20, 2024

Summary

Update the metadata export logic for joins with derivations: use the derived columns as the exported features.
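
Roughly, when a join defines derivations, the exported feature metadata should be built from the derived output schema instead of from the raw aggregation columns. A minimal sketch of the idea (deriveOutputSchema and preDerivedSchema are hypothetical placeholders; only toAggregationMetadata and joinConf.hasDerivations appear in the actual diff):

// Sketch only: export derived columns as features when derivations exist,
// otherwise keep the existing per-aggregation metadata.
val exportedFeatures =
  if (joinConf.hasDerivations) {
    // hypothetical helper that applies the join's derivations to the pre-derived schema
    val derivedSchema = deriveOutputSchema(joinConf, preDerivedSchema)
    derivedSchema.map { case (name, dataType) => toAggregationMetadata(name, dataType, joinConf.hasDerivations) }
  } else {
    aggregationsMetadata
  }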

Why / Goal

Test Plan

  • Added Unit Tests
  • Covered by existing CI
  • Integration tested

Checklist

  • Documentation update

Reviewers

@hzding621 @pengyu-hou @SophieYu41

tableUtils.createDatabase(namespace)
val viewsGroupBy = getViewsGroupBy(suffix = "cumulative", makeCumulative = true, namespace)
val joinConf = getEventsEventsTemporal("cumulative", namespace)
joinConf.setJoinParts(Seq(Builders.JoinPart(groupBy = viewsGroupBy)).asJava)
Collaborator:
can you add a test case that contains external parts?
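
For reference, a sketch of one way the setup above could attach an external (contextual) part; the builder and type names here (Builders.ContextualSource, Builders.ExternalPart, api.StructField, setOnlineExternalParts) are assumptions about the Builders surface, not code from this PR:

// Sketch: add a contextual external part so the metadata export path also
// sees ext_contextual_* columns alongside the group-by features.
val contextualPart = Builders.ExternalPart(
  Builders.ContextualSource(
    fields = Array(StructField("event_id", StringType))
  )
)
joinConf.setOnlineExternalParts(Seq(contextualPart).asJava)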

StructType(derivedDummyOutputDf.schema.filterNot(f => keyAndPartitionFields.map(_.name).contains(f.name))))
ListBuffer(columns.map { tup => toAggregationMetadata(tup._1, tup._2, joinConf.hasDerivations) }: _*)
} else {
aggregationsMetadata
@hzding621 (Collaborator) commented Nov 20, 2024
nit: rename agg, since "agg" is specific to group_by, but now we also have external parts and derivations (see the sketch after this list):

  • aggMetadata => joinOutputFieldsMetadata
  • aggregationsMetadata => joinIntermediateFieldsMetadata
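
A sketch of how the quoted branch might read after that rename (paraphrased from the snippet above, not the full method):

// joinIntermediateFieldsMetadata: per-group_by aggregation fields, pre-derivation.
// joinOutputFieldsMetadata: fields of the final join output, covering external
// parts and derived columns as well.
val joinOutputFieldsMetadata =
  if (joinConf.hasDerivations) {
    ListBuffer(columns.map { case (name, dataType) =>
      toAggregationMetadata(name, dataType, joinConf.hasDerivations)
    }: _*)
  } else {
    joinIntermediateFieldsMetadata
  }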

expression = "*"
), Derivation(
name = "test_feature_name",
expression = f"${viewsGroupBy.metaData.name}_time_spent_ms_average"
Collaborator:
Can you add some derivations that use ts and ds as inputs?
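
A sketch of such derivations, following the style of the snippet above (output names are illustrative, and ts is assumed to be epoch milliseconds):

// Sketch: derivations that read the time and partition columns directly.
Derivation(
  name = "event_ds",
  expression = "ds"
), Derivation(
  name = "event_hour",
  expression = "from_unixtime(ts / 1000, 'yyyy-MM-dd HH')"
)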

@hzding621 (Collaborator) commented Nov 20, 2024
Similarly, can we add a test case for key columns as output, such as:

Derivation(
  name = "event_id",
  expression = "ext_contextual_event_id"
)

val finalOutputColumns = joinConf.derivationsScala.finalOutputColumn(dummyOutputDf.columns).toSeq
val derivedDummyOutputDf = dummyOutputDf.select(finalOutputColumns: _*)
val columns = SparkConversions.toChrononSchema(
StructType(derivedDummyOutputDf.schema.filterNot(f => keyAndPartitionFields.map(_.name).contains(f.name))))
Collaborator:
Is the filterNot necessary here? I think we should keep everything that users included in derivations. For example, we should allow key columns to be in the output if users explicitly included them in derivations.
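
For illustration, one shape that alternative could take (a sketch only, reusing names from the quoted snippet; Constants.TimeColumn and tableUtils.partitionColumn are assumed to identify the time and partition fields):

// Sketch: drop only the time/partition fields, so any key column that a user
// explicitly surfaces via a derivation stays in the exported schema.
val reservedFieldNames = Set(Constants.TimeColumn, tableUtils.partitionColumn)
val exportedFields = derivedDummyOutputDf.schema.filterNot(f => reservedFieldNames.contains(f.name))
val columns = SparkConversions.toChrononSchema(StructType(exportedFields))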
