[SPARK-3397] Bump pom.xml version number of master branch to 1.2.0-SNAPSHOT #2268
Conversation
@witgo The release process does this, I believe. I don't think you need to open a PR for this, especially before 1.1.0 is released.
@srowen I agree with you, but SparkContext.SPARK_VERSION has already been modified to …
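For reference, a rough sketch of the constant being referred to. The truncated comment doesn't preserve the exact value, so the string below is an assumption based on this PR's title (1.2.0-SNAPSHOT), and the snippet is illustrative rather than the actual Spark source:

    // Illustrative sketch, not the actual Spark source: the version constant in
    // SparkContext is assumed here to already carry the new snapshot string,
    // ahead of the pom.xml <version> entries that this PR updates.
    object SparkContext {
      val SPARK_VERSION = "1.2.0-SNAPSHOT"
    }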
When we resolve and target a JIRA for master it's against 1.2, so why isn't the pom version updated? If it's a release process detail, can we change it? I find it confusing that when I build against master it's older than when I build against branch-1.1.
Actually, I think this is the right fix. Commits that are targeted for 1.1.0 go into …
Yeah, I think it's reasonable to bump the versions in master now rather than wait for the release.
@pwendell do you just want to run the set-version step and commit it, or do you want to do it through this JIRA?
Checked with Patrick and this looks good, so I'm going to merge it. Thanks!
It looks like this somehow broke the MiMa binary compatibility tests. I noticed that I merged this without first checking whether Jenkins had run; sorry about that! I'm going to dig in and see if I can figure out why this is causing failures...
Aha! There was a "1.1" string in MimaExcludes:

    def excludes(version: String) =
      version match {
        case v if v.startsWith("1.1") =>
          Seq(
            MimaBuild.excludeSparkPackage("deploy"),
            MimaBuild.excludeSparkPackage("graphx")
          ) ++
          ...

I guess this logic serves to exclude certain incompatibilities introduced in Spark 1.1. What's the right fix here (beyond just rolling back the version numbers, which I'll probably do in the interim)? At the time of release, 1.1.0 should be binary-compatible with 1.0.0. Therefore, when testing 1.2.0-SNAPSHOT, is it safe to just update previousSparkVersion to 1.1.0 and start with an empty set of excludes? /cc @pwendell
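For readers following along, a minimal sketch of what the eventual change to MimaExcludes could look like, assuming the file simply grows a new version branch. The deploy and graphx excludes for 1.2 come from the merged fix's commit log below; the overall shape, the elided 1.1 entries, and the fallback case are illustrative:

    // Illustrative sketch only, not the actual project/MimaExcludes.scala.
    // A new "1.2" branch mirrors the existing "1.1" one; the two excludes shown
    // for 1.2 are the ones named in the follow-up fix's commit log, and the
    // remaining 1.1-specific excludes are elided.
    def excludes(version: String) =
      version match {
        case v if v.startsWith("1.2") =>
          Seq(
            MimaBuild.excludeSparkPackage("deploy"),
            MimaBuild.excludeSparkPackage("graphx")
          )
        case v if v.startsWith("1.1") =>
          Seq(
            MimaBuild.excludeSparkPackage("deploy"),
            MimaBuild.excludeSparkPackage("graphx")
          ) ++
          Seq() // ... 1.1-specific excludes elided
        case _ => Seq()
      }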
By merging #2268, which bumped the Spark version to 1.2.0-SNAPSHOT, I inadvertently broke the MiMa binary compatibility tests. The issue is that we were comparing 1.2.0-SNAPSHOT against Spark 1.0.0 without using any MiMa excludes. The right long-term fix for this is probably to publish nightly snapshots on Maven Central and change the master branch to test binary compatibility against the current release candidate branch's snapshots until that release is finalized. As a short-term fix until 1.1.0 is published on Maven Central, I've configured the build to test the master branch for binary compatibility against the 1.1.0-RC4 jars. I'll loop back and remove the Apache staging repo as soon as 1.1.0 final is available.

Author: Josh Rosen <[email protected]>

Closes #2315 from JoshRosen/mima-fix and squashes the following commits:

776bc2c [Josh Rosen] Add two excludes to workaround Mima annotation issues.
ec90e21 [Josh Rosen] Add deploy and graphx to 1.2 MiMa excludes.
57569be [Josh Rosen] Fix MiMa tests in master branch; test against 1.1.0 RC.
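As a rough illustration of the short-term fix described above (not the actual Spark build change), here is an sbt-style sketch using the sbt-mima-plugin's previousArtifact key from that era. The staging-repository URL, the module coordinates, and the version string are placeholders, not the values used in the real fix:

    // Hypothetical build fragment illustrating the approach: add a staging
    // resolver so the 1.1.0 release-candidate jars can be resolved before they
    // reach Maven Central, and point MiMa's comparison at them.
    import com.typesafe.tools.mima.plugin.MimaKeys.previousArtifact

    // Placeholder URL; the real Apache staging repository id is not shown here.
    resolvers += "Apache staging" at "https://repository.apache.org/content/repositories/<staging-repo-id>/"

    // Placeholder coordinates; compare the current build against the staged
    // 1.1.0 release-candidate artifacts.
    previousArtifact := Some("org.apache.spark" %% "spark-core" % "1.1.0")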
No description provided.