[HUDI-3859] Fix spark profiles and utilities-slim dep #5297
yihua merged 3 commits into apache:master
Conversation
```diff
-# Build against Spark 3.2.x (the default build shipped with the public Spark 3 bundle)
-mvn clean package -DskipTests -Pspark3.2
+# Build against Spark 3.2.x
+mvn clean package -DskipTests -Dspark3.2 -Dscala-2.12
```
For spark3.2 and spark3.1, scala-2.12 is used by default, so there is no need to provide that.
spark3.2 sets the Scala 2.12 dependencies, but scala-2.12 additionally runs the enforcer plugin to guard the artifacts' suffix, so I think it's better to activate both profiles when building.
Should we have the enforcer plugin enabled for spark3.2 and related profiles as well? That'll make the command shorter for local builds. This can be a follow-up.
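For context, a minimal sketch of what the follow-up could look like: embedding an enforcer check directly in the spark3.2 profile so that `-Pspark3.2` alone also guards the Scala suffix. The profile layout and rule below are illustrative assumptions, not Hudi's actual pom.xml configuration.

```xml
<!-- Hypothetical sketch: the spark3.2 profile runs the enforcer plugin itself,
     so a separate -Pscala-2.12 activation is no longer needed for the check. -->
<profile>
  <id>spark3.2</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-enforcer-plugin</artifactId>
        <executions>
          <execution>
            <id>enforce-scala-suffix</id>
            <goals>
              <goal>enforce</goal>
            </goals>
            <configuration>
              <rules>
                <!-- Fail the build unless the Scala binary version is 2.12,
                     which in turn determines the _2.12 artifact suffix. -->
                <requireProperty>
                  <property>scala.binary.version</property>
                  <regex>2\.12</regex>
                  <regexMessage>spark3.2 builds must use Scala 2.12</regexMessage>
                </requireProperty>
              </rules>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>
```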
yihua
left a comment
LGTM with one nit. I'll merge this now and any change addressing the nit can be a follow-up.
| Maven build options | Expected Spark bundle jar name              | Notes                                            |
|:--------------------|:--------------------------------------------|:-------------------------------------------------|
| (empty)             | hudi-spark-bundle_2.11 (legacy bundle name) | For Spark 2.4.4 and Scala 2.11 (default options) |
| `-Dspark2.4`        | hudi-spark2.4-bundle_2.11                   | For Spark 2.4.4 and Scala 2.11 (same as default) |
@xushiyan Hi, may I ask why we chose to use `-D` to specify the profile? I know there is such an activation-by-property block in one profile:

```xml
<activation>
  <property>
    <name>spark2.4</name>
  </property>
</activation>
```

Why not just use `-PprofileName`?
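A minimal sketch of how the two activation paths relate (the profile body below is illustrative, not Hudi's actual pom.xml): a profile with property-based activation fires whenever the matching system property is set, so `-Dspark3.2` can activate it together with any other profile keyed on the same property, while `-Pspark3.2` would select only the profile with that exact id.

```xml
<!-- Hypothetical sketch: this profile activates on -Dspark3.2 (property
     activation) and can also be selected explicitly by id with -Pspark3.2.
     Setting the property lets several related profiles that watch the same
     property activate at once, without listing each id on the command line. -->
<profile>
  <id>spark3.2</id>
  <activation>
    <property>
      <!-- activated whenever the build runs with -Dspark3.2 -->
      <name>spark3.2</name>
    </property>
  </activation>
  <properties>
    <!-- illustrative property the profile might pin -->
    <scala.binary.version>2.12</scala.binary.version>
  </properties>
</profile>
```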