@witgo (Contributor) commented Sep 4, 2014

No description provided.

@srowen (Member) commented Sep 4, 2014

@witgo The release process does this, I believe. I don't think you need to open a PR for this, especially before 1.1.0 is released.

@witgo (Contributor, Author) commented Sep 4, 2014

@srowen I agree with you, but SparkContext.SPARK_VERSION has already been changed to 1.2.0-SNAPSHOT.

@tgravescs (Contributor)

When we resolve and target a JIRA for master, it's against 1.2, so why isn't the pom version updated? If it's a release-process detail, can we change it? I find it confusing that when I build against master it's older than when I build against branch-1.1.

@JoshRosen (Contributor)

Actually, I think this is the right fix. Commits targeted for 1.1.0 go into branch-1.1, while development for 1.2.0 continues on master. We probably should have updated the pom.xml versions when we created branch-1.1 itself, not when we released 1.1.0, so that the first commit/revision that's in 1.2.0 but not in 1.1.0 has the right version.

@pwendell (Contributor) commented Sep 4, 2014

Yeah I think it's reasonable to bump the versions in master now rather than wait for the release.

@tgravescs (Contributor)

@pwendell do you just want to run the set-version step and commit it, or do you want to do it through this JIRA?
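For context, a hypothetical sketch of what the "set version" step looks like mechanically. The throwaway `pom.xml` below is a stand-in, not Spark's actual pom, and Spark's real release scripts may do this differently:

```shell
# Create a throwaway pom to demonstrate the substitution (illustrative only).
cd "$(mktemp -d)"
cat > pom.xml <<'EOF'
<project>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-parent</artifactId>
  <version>1.1.0-SNAPSHOT</version>
</project>
EOF

# Bump every module pom from 1.1.0-SNAPSHOT to 1.2.0-SNAPSHOT in place.
# (With Maven installed, `mvn versions:set -DnewVersion=1.2.0-SNAPSHOT`
# from the versions-maven-plugin does this across the whole reactor.)
find . -name pom.xml -exec sed -i 's/1\.1\.0-SNAPSHOT/1.2.0-SNAPSHOT/g' {} +

grep '<version>' pom.xml
```

Whichever mechanism is used, the point is that every module pom must be updated consistently in a single commit.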

@JoshRosen (Contributor)

Checked with Patrick and this looks good, so I'm going to merge it. Thanks!

@asfgit asfgit closed this in 607ae39 Sep 6, 2014
@witgo witgo deleted the SPARK-3397 branch September 7, 2014 02:21
@JoshRosen (Contributor)

It looks like this somehow broke the MiMa binary compatibility tests. I noticed that I merged this without first checking whether Jenkins had run; sorry about that! I'm going to dig in and see if I can figure out why this is causing failures...

@JoshRosen (Contributor)

Aha! There was a "1.1" string in MimaExcludes:

    def excludes(version: String) =
      version match {
        case v if v.startsWith("1.1") =>
          Seq(
            MimaBuild.excludeSparkPackage("deploy"),
            MimaBuild.excludeSparkPackage("graphx")
          ) ++
          ...

I guess this logic serves to exclude certain incompatibilities introduced in Spark 1.1.

What's the right fix here (beyond just rolling back the version numbers, which I'll probably do in the interim)?

At the time of release, 1.1.0 should be binary-compatible with 1.0.0. Therefore, when testing 1.2.0-SNAPSHOT, is it safe to just update previousSparkVersion to 1.1.0 and start with an empty set of excludes for `v.startsWith("1.2")`? This might have to wait until 1.1.0 is published on Maven.

/cc @pwendell
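A rough sketch of the proposed shape of the fix, adding a new `"1.2"` case to the match above (hypothetical; `MimaBuild` is Spark's build code, and the eventual commit also carried over the deploy/graphx excludes rather than starting fully empty):

```scala
  def excludes(version: String) =
    version match {
      case v if v.startsWith("1.2") =>
        // Compare 1.2.0-SNAPSHOT against the 1.1.0 artifacts, starting
        // with a near-empty set of excludes for the new dev cycle.
        Seq(
          MimaBuild.excludeSparkPackage("deploy"),
          MimaBuild.excludeSparkPackage("graphx")
        )
      case v if v.startsWith("1.1") =>
        Seq(
          MimaBuild.excludeSparkPackage("deploy"),
          MimaBuild.excludeSparkPackage("graphx")
        ) ++
        ...
      case _ => Seq()
    }
```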

asfgit pushed a commit that referenced this pull request Sep 8, 2014
By merging #2268, which bumped the Spark version to 1.2.0-SNAPSHOT, I inadvertently broke the Mima binary compatibility tests.  The issue is that we were comparing 1.2.0-SNAPSHOT against Spark 1.0.0 without using any Mima excludes.  The right long-term fix for this is probably to publish nightly snapshots on Maven central and change the master branch to test binary compatibility against the current release candidate branch's snapshots until that release is finalized.

As a short-term fix until 1.1.0 is published on Maven central, I've configured the build to test the master branch for binary compatibility against the 1.1.0-RC4 jars.  I'll loop back and remove the Apache staging repo as soon as 1.1.0 final is available.

Author: Josh Rosen <[email protected]>

Closes #2315 from JoshRosen/mima-fix and squashes the following commits:

776bc2c [Josh Rosen] Add two excludes to workaround Mima annotation issues.
ec90e21 [Josh Rosen] Add deploy and graphx to 1.2 MiMa excludes.
57569be [Josh Rosen] Fix MiMa tests in master branch; test against 1.1.0 RC.