Commit 1e3331e

Merge branch 'master' into SPARK-1149

Author: liguoqiang
2 parents 3348619 + f65c1f3

File tree: 36 files changed, +756 −201 lines

LICENSE

Lines changed: 32 additions & 0 deletions

@@ -396,3 +396,35 @@ INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
 CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
 ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
 POSSIBILITY OF SUCH DAMAGE.
+
+
+========================================================================
+For sbt and sbt-launch-lib.bash in sbt/:
+========================================================================
+
+// Generated from http://www.opensource.org/licenses/bsd-license.php
+Copyright (c) 2011, Paul Phillips.
+All rights reserved.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are met:
+
+* Redistributions of source code must retain the above copyright notice,
+  this list of conditions and the following disclaimer.
+* Redistributions in binary form must reproduce the above copyright notice,
+  this list of conditions and the following disclaimer in the documentation
+  and/or other materials provided with the distribution.
+* Neither the name of the author nor the names of its contributors may be
+  used to endorse or promote products derived from this software without
+  specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
+ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
+LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
+LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
+NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE,
+EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

assembly/pom.xml

Lines changed: 2 additions & 2 deletions

@@ -21,14 +21,14 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.0.0-incubating-SNAPSHOT</version>
+    <version>1.0.0-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>

   <groupId>org.apache.spark</groupId>
   <artifactId>spark-assembly_2.10</artifactId>
   <name>Spark Project Assembly</name>
-  <url>http://spark.incubator.apache.org/</url>
+  <url>http://spark.apache.org/</url>

   <properties>
     <spark.jar>${project.build.directory}/scala-${scala.binary.version}/${project.artifactId}-${project.version}-hadoop${hadoop.version}.jar</spark.jar>

bagel/pom.xml

Lines changed: 16 additions & 2 deletions

@@ -21,15 +21,29 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.0.0-incubating-SNAPSHOT</version>
+    <version>1.0.0-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>

   <groupId>org.apache.spark</groupId>
   <artifactId>spark-bagel_2.10</artifactId>
   <packaging>jar</packaging>
   <name>Spark Project Bagel</name>
-  <url>http://spark.incubator.apache.org/</url>
+  <url>http://spark.apache.org/</url>
+
+  <profiles>
+    <profile>
+      <!-- SPARK-1121: Adds an explicit dependency on Avro to work around
+           a Hadoop 0.23.X issue -->
+      <id>yarn-alpha</id>
+      <dependencies>
+        <dependency>
+          <groupId>org.apache.avro</groupId>
+          <artifactId>avro</artifactId>
+        </dependency>
+      </dependencies>
+    </profile>
+  </profiles>

   <dependencies>
     <dependency>

core/pom.xml

Lines changed: 25 additions & 2 deletions

@@ -21,15 +21,29 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.0.0-incubating-SNAPSHOT</version>
+    <version>1.0.0-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>

   <groupId>org.apache.spark</groupId>
   <artifactId>spark-core_2.10</artifactId>
   <packaging>jar</packaging>
   <name>Spark Project Core</name>
-  <url>http://spark.incubator.apache.org/</url>
+  <url>http://spark.apache.org/</url>
+
+  <!-- SPARK-1121: Adds an explicit dependency on Avro to work around
+       a Hadoop 0.23.X issue -->
+  <profiles>
+    <profile>
+      <id>yarn-alpha</id>
+      <dependencies>
+        <dependency>
+          <groupId>org.apache.avro</groupId>
+          <artifactId>avro</artifactId>
+        </dependency>
+      </dependencies>
+    </profile>
+  </profiles>

   <dependencies>
     <dependency>

@@ -125,6 +139,15 @@
       <groupId>org.json4s</groupId>
       <artifactId>json4s-jackson_${scala.binary.version}</artifactId>
       <version>3.2.6</version>
+      <!-- see also exclusion for lift-json; this is necessary since it depends on
+           scala-library and scalap 2.10.0, but we use 2.10.3, and only override
+           scala-library -->
+      <exclusions>
+        <exclusion>
+          <groupId>org.scala-lang</groupId>
+          <artifactId>scalap</artifactId>
+        </exclusion>
+      </exclusions>
     </dependency>
     <dependency>
       <groupId>it.unimi.dsi</groupId>

core/src/main/scala/org/apache/spark/deploy/master/ZooKeeperPersistenceEngine.scala

Lines changed: 13 additions & 5 deletions

@@ -64,11 +64,11 @@ class ZooKeeperPersistenceEngine(serialization: Serialization, conf: SparkConf)
   override def readPersistedData(): (Seq[ApplicationInfo], Seq[DriverInfo], Seq[WorkerInfo]) = {
     val sortedFiles = zk.getChildren().forPath(WORKING_DIR).toList.sorted
     val appFiles = sortedFiles.filter(_.startsWith("app_"))
-    val apps = appFiles.map(deserializeFromFile[ApplicationInfo])
+    val apps = appFiles.map(deserializeFromFile[ApplicationInfo]).flatten
     val driverFiles = sortedFiles.filter(_.startsWith("driver_"))
-    val drivers = driverFiles.map(deserializeFromFile[DriverInfo])
+    val drivers = driverFiles.map(deserializeFromFile[DriverInfo]).flatten
     val workerFiles = sortedFiles.filter(_.startsWith("worker_"))
-    val workers = workerFiles.map(deserializeFromFile[WorkerInfo])
+    val workers = workerFiles.map(deserializeFromFile[WorkerInfo]).flatten
     (apps, drivers, workers)
   }

@@ -78,10 +78,18 @@ class ZooKeeperPersistenceEngine(serialization: Serialization, conf: SparkConf)
     zk.create().withMode(CreateMode.PERSISTENT).forPath(path, serialized)
   }

-  def deserializeFromFile[T](filename: String)(implicit m: Manifest[T]): T = {
+  def deserializeFromFile[T](filename: String)(implicit m: Manifest[T]): Option[T] = {
     val fileData = zk.getData().forPath(WORKING_DIR + "/" + filename)
     val clazz = m.runtimeClass.asInstanceOf[Class[T]]
     val serializer = serialization.serializerFor(clazz)
-    serializer.fromBinary(fileData).asInstanceOf[T]
+    try {
+      Some(serializer.fromBinary(fileData).asInstanceOf[T])
+    } catch {
+      case e: Exception => {
+        logWarning("Exception while reading persisted file, deleting", e)
+        zk.delete().forPath(WORKING_DIR + "/" + filename)
+        None
+      }
+    }
   }
 }
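The change above makes `deserializeFromFile` return `Option[T]` and flattens the mapped results, so a corrupt ZooKeeper node is logged, deleted, and skipped instead of aborting the whole recovery. A minimal standalone sketch of that map-then-flatten pattern (all names here are hypothetical stand-ins, not the Spark API):

```scala
// Sketch of the Option-plus-flatten recovery pattern: a deserialization
// failure becomes None, and flatten drops the Nones so one corrupt entry
// no longer fails the entire read.
object RecoverySketch {
  // Stand-in for deserializeFromFile: may fail on corrupt input.
  def deserialize(data: String): Option[Int] =
    try {
      Some(data.toInt)
    } catch {
      case _: NumberFormatException => None // corrupt entry: skip it
    }

  // Stand-in for readPersistedData: keep only entries that deserialized.
  def recover(files: Seq[String]): Seq[Int] =
    files.map(deserialize).flatten

  def main(args: Array[String]): Unit = {
    println(recover(Seq("1", "corrupt", "3"))) // List(1, 3)
  }
}
```

`Option` acts as a zero-or-one-element collection here, which is why `flatten` (or equivalently `flatMap`) removes failed entries cleanly.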

dev/audit-release/audit_release.py

Lines changed: 2 additions & 6 deletions

@@ -31,10 +31,10 @@
 import urllib2

 ## Fill in release details here:
-RELEASE_URL = "http://people.apache.org/~pwendell/spark-0.9.0-incubating-rc5/"
+RELEASE_URL = "http://people.apache.org/~pwendell/spark-1.0.0-rc1/"
 RELEASE_KEY = "9E4FE3AF"
 RELEASE_REPOSITORY = "https://repository.apache.org/content/repositories/orgapachespark-1006/"
-RELEASE_VERSION = "0.9.0-incubating"
+RELEASE_VERSION = "1.0.0"
 SCALA_VERSION = "2.10.3"
 SCALA_BINARY_VERSION = "2.10"
 ##

@@ -191,10 +191,6 @@ def ensure_path_not_present(x):
 test("NOTICE" in base_files, "Tarball contains NOTICE file")
 test("LICENSE" in base_files, "Tarball contains LICENSE file")

-os.chdir(os.path.join(WORK_DIR, dir_name))
-readme = "".join(open("README.md").readlines())
-disclaimer_part = "is an effort undergoing incubation"
-test(disclaimer_part in readme, "README file contains disclaimer")
 os.chdir(WORK_DIR)

 for artifact in artifacts:

dev/create-release/create-release.sh

Lines changed: 1 addition & 1 deletion

@@ -120,7 +120,7 @@ scp spark* \
 # Docs
 cd spark
 cd docs
-jekyll build
+PRODUCTION=1 jekyll build
 echo "Copying release documentation"
 rc_docs_folder=${rc_folder}-docs
 rsync -r _site/* $USER_NAME@people.apache.org:/home/$USER_NAME/public_html/$rc_docs_folder

docs/README.md

Lines changed: 16 additions & 3 deletions

@@ -10,9 +10,22 @@ We include the Spark documentation as part of the source (as opposed to using a

 In this directory you will find textfiles formatted using Markdown, with an ".md" suffix. You can read those text files directly if you want. Start with index.md.

-To make things quite a bit prettier and make the links easier to follow, generate the html version of the documentation based on the src directory by running `jekyll build` in the docs directory. Use the command `SKIP_SCALADOC=1 jekyll build` to skip building and copying over the scaladoc which can be timely. To use the `jekyll` command, you will need to have Jekyll installed, the easiest way to do this is via a Ruby Gem, see the [jekyll installation instructions](http://jekyllrb.com/docs/installation). This will create a directory called _site containing index.html as well as the rest of the compiled files. Read more about Jekyll at https://github.com/mojombo/jekyll/wiki.
-
-In addition to generating the site as html from the markdown files, jekyll can serve up the site via a webserver. To build and run a local webserver use the command `jekyll serve` (or the faster variant `SKIP_SCALADOC=1 jekyll serve`), which runs the webserver on port 4000, then visit the site at http://localhost:4000.
+The markdown code can be compiled to HTML using the
+[Jekyll tool](http://jekyllrb.com).
+To use the `jekyll` command, you will need to have Jekyll installed.
+The easiest way to do this is via a Ruby Gem, see the
+[jekyll installation instructions](http://jekyllrb.com/docs/installation).
+Compiling the site with Jekyll will create a directory called
+_site containing index.html as well as the rest of the compiled files.
+
+You can modify the default Jekyll build as follows:
+
+    # Skip generating API docs (which takes a while)
+    $ SKIP_SCALADOC=1 jekyll build
+    # Serve content locally on port 4000
+    $ jekyll serve --watch
+    # Build the site with extra features used on the live page
+    $ PRODUCTION=1 jekyll build

 ## Pygments

docs/_layouts/global.html

Lines changed: 2 additions & 2 deletions

@@ -24,9 +24,9 @@

 <link rel="stylesheet" href="css/pygments-default.css">

+{% production %}
 <!-- Google analytics script -->
 <script type="text/javascript">
-/*
 var _gaq = _gaq || [];
 _gaq.push(['_setAccount', 'UA-32518208-1']);
 _gaq.push(['_trackPageview']);

@@ -36,8 +36,8 @@
 ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
 var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
 })();
-*/
 </script>
+{% endproduction %}

 </head>
 <body>

docs/_plugins/production_tag.rb

Lines changed: 14 additions & 0 deletions

@@ -0,0 +1,14 @@
+module Jekyll
+  class ProductionTag < Liquid::Block
+
+    def initialize(tag_name, markup, tokens)
+      super
+    end
+
+    def render(context)
+      if ENV['PRODUCTION'] then super else "" end
+    end
+  end
+end
+
+Liquid::Template.register_tag('production', Jekyll::ProductionTag)
