
Commit ac046b3

Merge remote-tracking branch 'upstream/master' into security-branch-0.9-with-client-rebase
Conflicts:
	core/src/main/scala/org/apache/spark/broadcast/HttpBroadcast.scala
	core/src/main/scala/org/apache/spark/deploy/client/TestClient.scala
	core/src/main/scala/org/apache/spark/deploy/master/Master.scala
	core/src/main/scala/org/apache/spark/deploy/master/ui/MasterWebUI.scala
	core/src/main/scala/org/apache/spark/deploy/worker/ui/WorkerWebUI.scala
	core/src/main/scala/org/apache/spark/executor/CoarseGrainedExecutorBackend.scala
	core/src/main/scala/org/apache/spark/metrics/MetricsSystem.scala
	core/src/main/scala/org/apache/spark/metrics/sink/ConsoleSink.scala
	core/src/main/scala/org/apache/spark/metrics/sink/CsvSink.scala
	core/src/main/scala/org/apache/spark/metrics/sink/JmxSink.scala
	core/src/main/scala/org/apache/spark/metrics/sink/MetricsServlet.scala
	core/src/main/scala/org/apache/spark/network/Connection.scala
	core/src/main/scala/org/apache/spark/network/ConnectionManager.scala
	core/src/main/scala/org/apache/spark/network/ReceiverTest.scala
	core/src/main/scala/org/apache/spark/network/SenderTest.scala
	core/src/main/scala/org/apache/spark/storage/BlockManager.scala
	core/src/main/scala/org/apache/spark/storage/ThreadingTest.scala
	core/src/main/scala/org/apache/spark/ui/JettyUtils.scala
	core/src/main/scala/org/apache/spark/ui/SparkUI.scala
	core/src/main/scala/org/apache/spark/ui/jobs/JobProgressUI.scala
	core/src/main/scala/org/apache/spark/util/Utils.scala
	core/src/test/scala/org/apache/spark/metrics/MetricsSystemSuite.scala
	core/src/test/scala/org/apache/spark/storage/BlockManagerSuite.scala
	core/src/test/scala/org/apache/spark/ui/UISuite.scala
	project/SparkBuild.scala
	yarn/alpha/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala
	yarn/stable/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala
2 parents: 13733e1 + cda381f


370 files changed: +4455 additions, -2284 deletions

LICENSE

Lines changed: 32 additions & 0 deletions
@@ -396,3 +396,35 @@ INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
 CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
 ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
 POSSIBILITY OF SUCH DAMAGE.
+
+
+========================================================================
+For sbt and sbt-launch-lib.bash in sbt/:
+========================================================================
+
+// Generated from http://www.opensource.org/licenses/bsd-license.php
+Copyright (c) 2011, Paul Phillips.
+All rights reserved.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are met:
+
+  * Redistributions of source code must retain the above copyright notice,
+    this list of conditions and the following disclaimer.
+  * Redistributions in binary form must reproduce the above copyright notice,
+    this list of conditions and the following disclaimer in the documentation
+    and/or other materials provided with the distribution.
+  * Neither the name of the author nor the names of its contributors may be
+    used to endorse or promote products derived from this software without
+    specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
+ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
+LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
+LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
+NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE,
+EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

README.md

Lines changed: 3 additions & 14 deletions
@@ -1,12 +1,12 @@
 # Apache Spark
 
-Lightning-Fast Cluster Computing - <http://spark.incubator.apache.org/>
+Lightning-Fast Cluster Computing - <http://spark.apache.org/>
 
 
 ## Online Documentation
 
 You can find the latest Spark documentation, including a programming
-guide, on the project webpage at <http://spark.incubator.apache.org/documentation.html>.
+guide, on the project webpage at <http://spark.apache.org/documentation.html>.
 This README file only contains basic setup instructions.
 
 
@@ -92,21 +92,10 @@ If your project is built with Maven, add this to your POM file's `<dependencies>
 
 ## Configuration
 
-Please refer to the [Configuration guide](http://spark.incubator.apache.org/docs/latest/configuration.html)
+Please refer to the [Configuration guide](http://spark.apache.org/docs/latest/configuration.html)
 in the online documentation for an overview on how to configure Spark.
 
 
-## Apache Incubator Notice
-
-Apache Spark is an effort undergoing incubation at The Apache Software
-Foundation (ASF), sponsored by the Apache Incubator. Incubation is required of
-all newly accepted projects until a further review indicates that the
-infrastructure, communications, and decision making process have stabilized in
-a manner consistent with other successful ASF projects. While incubation status
-is not necessarily a reflection of the completeness or stability of the code,
-it does indicate that the project has yet to be fully endorsed by the ASF.
-
-
 ## Contributing to Spark
 
 Contributions via GitHub pull requests are gladly accepted from their original

assembly/pom.xml

Lines changed: 6 additions & 3 deletions
@@ -21,17 +21,20 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.0.0-incubating-SNAPSHOT</version>
+    <version>1.0.0-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-assembly_2.10</artifactId>
   <name>Spark Project Assembly</name>
-  <url>http://spark.incubator.apache.org/</url>
+  <url>http://spark.apache.org/</url>
+  <packaging>pom</packaging>
 
   <properties>
-    <spark.jar>${project.build.directory}/scala-${scala.binary.version}/${project.artifactId}-${project.version}-hadoop${hadoop.version}.jar</spark.jar>
+    <spark.jar.dir>scala-${scala.binary.version}</spark.jar.dir>
+    <spark.jar.basename>${project.artifactId}-${project.version}-hadoop${hadoop.version}.jar</spark.jar.basename>
+    <spark.jar>${project.build.directory}/${spark.jar.dir}/${spark.jar.basename}</spark.jar>
     <deb.pkg.name>spark</deb.pkg.name>
     <deb.install.path>/usr/share/spark</deb.install.path>
     <deb.user>root</deb.user>

assembly/src/main/assembly/assembly.xml

Lines changed: 11 additions & 0 deletions
@@ -55,6 +55,15 @@
       <include>**/*</include>
     </includes>
   </fileSet>
+  <fileSet>
+    <directory>
+      ${project.parent.basedir}/assembly/target/${spark.jar.dir}
+    </directory>
+    <outputDirectory>/</outputDirectory>
+    <includes>
+      <include>${spark.jar.basename}</include>
+    </includes>
+  </fileSet>
 </fileSets>
 
 <dependencySets>
@@ -75,6 +84,8 @@
     <excludes>
       <exclude>org.apache.hadoop:*:jar</exclude>
      <exclude>org.apache.spark:*:jar</exclude>
+      <exclude>org.apache.zookeeper:*:jar</exclude>
+      <exclude>org.apache.avro:*:jar</exclude>
     </excludes>
   </dependencySet>
 </dependencySets>

bagel/pom.xml

Lines changed: 16 additions & 2 deletions
@@ -21,15 +21,29 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.0.0-incubating-SNAPSHOT</version>
+    <version>1.0.0-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-bagel_2.10</artifactId>
   <packaging>jar</packaging>
   <name>Spark Project Bagel</name>
-  <url>http://spark.incubator.apache.org/</url>
+  <url>http://spark.apache.org/</url>
+
+  <profiles>
+    <profile>
+      <!-- SPARK-1121: Adds an explicit dependency on Avro to work around
+        a Hadoop 0.23.X issue -->
+      <id>yarn-alpha</id>
+      <dependencies>
+        <dependency>
+          <groupId>org.apache.avro</groupId>
+          <artifactId>avro</artifactId>
+        </dependency>
+      </dependencies>
+    </profile>
+  </profiles>
 
   <dependencies>
     <dependency>

bagel/src/main/scala/org/apache/spark/bagel/Bagel.scala

Lines changed: 7 additions & 7 deletions
@@ -27,7 +27,7 @@ object Bagel extends Logging {
 
   /**
    * Runs a Bagel program.
-   * @param sc [[org.apache.spark.SparkContext]] to use for the program.
+   * @param sc org.apache.spark.SparkContext to use for the program.
    * @param vertices vertices of the graph represented as an RDD of (Key, Vertex) pairs. Often the
    *                 Key will be the vertex id.
    * @param messages initial set of messages represented as an RDD of (Key, Message) pairs. Often
@@ -38,10 +38,10 @@ object Bagel extends Logging {
    * @param aggregator [[org.apache.spark.bagel.Aggregator]] performs a reduce across all vertices
    *                   after each superstep and provides the result to each vertex in the next
    *                   superstep.
-   * @param partitioner [[org.apache.spark.Partitioner]] partitions values by key
+   * @param partitioner org.apache.spark.Partitioner partitions values by key
    * @param numPartitions number of partitions across which to split the graph.
    *                      Default is the default parallelism of the SparkContext
-   * @param storageLevel [[org.apache.spark.storage.StorageLevel]] to use for caching of
+   * @param storageLevel org.apache.spark.storage.StorageLevel to use for caching of
    *                     intermediate RDDs in each superstep. Defaults to caching in memory.
    * @param compute function that takes a Vertex, optional set of (possibly combined) messages to
    *                the Vertex, optional Aggregator and the current superstep,
@@ -131,7 +131,7 @@ object Bagel extends Logging {
 
   /**
    * Runs a Bagel program with no [[org.apache.spark.bagel.Aggregator]], default
-   * [[org.apache.spark.HashPartitioner]] and default storage level
+   * org.apache.spark.HashPartitioner and default storage level
    */
   def run[K: Manifest, V <: Vertex : Manifest, M <: Message[K] : Manifest, C: Manifest](
     sc: SparkContext,
@@ -146,7 +146,7 @@ object Bagel extends Logging {
 
   /**
    * Runs a Bagel program with no [[org.apache.spark.bagel.Aggregator]] and the
-   * default [[org.apache.spark.HashPartitioner]]
+   * default org.apache.spark.HashPartitioner
    */
   def run[K: Manifest, V <: Vertex : Manifest, M <: Message[K] : Manifest, C: Manifest](
     sc: SparkContext,
@@ -166,7 +166,7 @@ object Bagel extends Logging {
 
   /**
    * Runs a Bagel program with no [[org.apache.spark.bagel.Aggregator]],
-   * default [[org.apache.spark.HashPartitioner]],
+   * default org.apache.spark.HashPartitioner,
    * [[org.apache.spark.bagel.DefaultCombiner]] and the default storage level
    */
   def run[K: Manifest, V <: Vertex : Manifest, M <: Message[K] : Manifest](
@@ -180,7 +180,7 @@ object Bagel extends Logging {
 
   /**
    * Runs a Bagel program with no [[org.apache.spark.bagel.Aggregator]],
-   * the default [[org.apache.spark.HashPartitioner]]
+   * the default org.apache.spark.HashPartitioner
    * and [[org.apache.spark.bagel.DefaultCombiner]]
    */
   def run[K: Manifest, V <: Vertex : Manifest, M <: Message[K] : Manifest](
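The scaladoc edited above documents the `Bagel.run` API. For orientation, here is a minimal sketch of driving the simplest overload (no Aggregator, default HashPartitioner and DefaultCombiner). The `CountVertex`/`CountMessage` types, the halting rule, and the exact parameter order of `run` are assumptions made for illustration, not code from this commit:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.bagel.{Bagel, Message, Vertex}

// Hypothetical vertex and message types; Bagel only needs the
// Vertex.active flag and the Message.targetId field.
class CountVertex(val value: Int, val active: Boolean)
  extends Vertex with Serializable
class CountMessage(val targetId: String, val inc: Int)
  extends Message[String] with Serializable

object BagelSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext("local", "BagelSketch")
    val vertices = sc.parallelize(Seq(("a", new CountVertex(0, true))))
    val messages = sc.parallelize(Seq(("a", new CountMessage("a", 1))))

    // compute receives the vertex, the (combined) inbound messages and the
    // superstep number, and returns the updated vertex plus outbound messages.
    val result = Bagel.run(sc, vertices, messages, 2) {
      (v: CountVertex, msgs: Option[Array[CountMessage]], superstep: Int) =>
        val sum = msgs.map(_.map(_.inc).sum).getOrElse(0)
        // Deactivate after 3 supersteps so the program terminates.
        (new CountVertex(v.value + sum, superstep < 3), Array[CountMessage]())
    }
    result.collect().foreach { case (k, v) => println(k + " -> " + v.value) }
    sc.stop()
  }
}
```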

conf/spark-env.sh.template

Lines changed: 1 addition & 0 deletions
@@ -19,3 +19,4 @@
 # - SPARK_WORKER_PORT / SPARK_WORKER_WEBUI_PORT
 # - SPARK_WORKER_INSTANCES, to set the number of worker processes per node
 # - SPARK_WORKER_DIR, to set the working directory of worker processes
+# - SPARK_PUBLIC_DNS, to set the public dns name of the master

core/pom.xml

Lines changed: 54 additions & 25 deletions
@@ -21,15 +21,29 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.0.0-incubating-SNAPSHOT</version>
+    <version>1.0.0-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-core_2.10</artifactId>
   <packaging>jar</packaging>
   <name>Spark Project Core</name>
-  <url>http://spark.incubator.apache.org/</url>
+  <url>http://spark.apache.org/</url>
+
+  <!-- SPARK-1121: Adds an explicit dependency on Avro to work around
+    a Hadoop 0.23.X issue -->
+  <profiles>
+    <profile>
+      <id>yarn-alpha</id>
+      <dependencies>
+        <dependency>
+          <groupId>org.apache.avro</groupId>
+          <artifactId>avro</artifactId>
+        </dependency>
+      </dependencies>
+    </profile>
+  </profiles>
 
   <dependencies>
     <dependency>
@@ -39,18 +53,16 @@
     <dependency>
       <groupId>net.java.dev.jets3t</groupId>
       <artifactId>jets3t</artifactId>
+      <exclusions>
+        <exclusion>
+          <groupId>commons-logging</groupId>
+          <artifactId>commons-logging</artifactId>
+        </exclusion>
+      </exclusions>
     </dependency>
     <dependency>
-      <groupId>org.apache.avro</groupId>
-      <artifactId>avro</artifactId>
-    </dependency>
-    <dependency>
-      <groupId>org.apache.avro</groupId>
-      <artifactId>avro-ipc</artifactId>
-    </dependency>
-    <dependency>
-      <groupId>org.apache.zookeeper</groupId>
-      <artifactId>zookeeper</artifactId>
+      <groupId>org.apache.curator</groupId>
+      <artifactId>curator-recipes</artifactId>
     </dependency>
     <dependency>
       <groupId>org.eclipse.jetty</groupId>
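The hunk above swaps the raw ZooKeeper client for Curator's recipe library (Spark's standalone Master uses ZooKeeper for leader election and recovery). As a rough illustration of what the recipes buy over hand-written watch/reconnect code, here is a minimal, self-contained leader-election sketch; the connect string and znode path are placeholders, and the code is not taken from Spark:

```scala
import org.apache.curator.framework.CuratorFrameworkFactory
import org.apache.curator.framework.recipes.leader.LeaderLatch
import org.apache.curator.retry.ExponentialBackoffRetry

object LeaderElectionSketch {
  def main(args: Array[String]) {
    // Placeholder connect string; retries with exponential backoff.
    val client = CuratorFrameworkFactory.newClient(
      "localhost:2181", new ExponentialBackoffRetry(1000, 3))
    client.start()

    // The latch manages ephemeral nodes, watches and reconnects internally,
    // which is exactly the boilerplate a raw ZooKeeper client forces on you.
    val latch = new LeaderLatch(client, "/sketch/leader")
    latch.start()
    latch.await() // blocks until this participant is elected leader
    println("elected leader: " + latch.hasLeadership)

    latch.close()
    client.close()
  }
}
```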
@@ -80,6 +92,22 @@
       <groupId>org.slf4j</groupId>
       <artifactId>slf4j-api</artifactId>
     </dependency>
+    <dependency>
+      <groupId>org.slf4j</groupId>
+      <artifactId>jul-to-slf4j</artifactId>
+    </dependency>
+    <dependency>
+      <groupId>org.slf4j</groupId>
+      <artifactId>jcl-over-slf4j</artifactId>
+    </dependency>
+    <dependency>
+      <groupId>log4j</groupId>
+      <artifactId>log4j</artifactId>
+    </dependency>
+    <dependency>
+      <groupId>org.slf4j</groupId>
+      <artifactId>slf4j-log4j12</artifactId>
+    </dependency>
     <dependency>
       <groupId>com.ning</groupId>
      <artifactId>compress-lzf</artifactId>
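The jul-to-slf4j and jcl-over-slf4j artifacts added above are bridge jars that route java.util.logging and commons-logging calls into slf4j (note the matching commons-logging exclusion on jets3t earlier in this file; jcl-over-slf4j is a drop-in jar and needs no code). A minimal sketch of activating the JUL bridge, not taken from this commit and assuming an slf4j version that provides removeHandlersForRootLogger:

```scala
import org.slf4j.bridge.SLF4JBridgeHandler

object JulBridgeSketch {
  def main(args: Array[String]) {
    // Remove JUL's default console handlers, then install the slf4j bridge
    // so java.util.logging records flow through slf4j (and on to log4j).
    SLF4JBridgeHandler.removeHandlersForRootLogger()
    SLF4JBridgeHandler.install()

    java.util.logging.Logger.getLogger("sketch").info("routed via slf4j")
  }
}
```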
@@ -124,8 +152,18 @@
       <artifactId>scala-library</artifactId>
     </dependency>
     <dependency>
-      <groupId>net.liftweb</groupId>
-      <artifactId>lift-json_${scala.binary.version}</artifactId>
+      <groupId>org.json4s</groupId>
+      <artifactId>json4s-jackson_${scala.binary.version}</artifactId>
+      <version>3.2.6</version>
+      <!-- see also exclusion for lift-json; this is necessary since it depends on
+        scala-library and scalap 2.10.0, but we use 2.10.3, and only override
+        scala-library -->
+      <exclusions>
+        <exclusion>
+          <groupId>org.scala-lang</groupId>
+          <artifactId>scalap</artifactId>
+        </exclusion>
+      </exclusions>
     </dependency>
     <dependency>
       <groupId>it.unimi.dsi</groupId>
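Since lift-json is swapped for json4s-jackson here, this is roughly the json4s 3.2.x API that dependent code would migrate to. A minimal sketch with an invented JSON sample, not code from this commit:

```scala
import org.json4s._
import org.json4s.jackson.JsonMethods._

object Json4sSketch {
  def main(args: Array[String]) {
    implicit val formats: Formats = DefaultFormats

    // Parse a JSON string into the json4s AST, navigate it, render it back.
    val ast: JValue = parse("""{"name": "spark", "version": "1.0.0"}""")
    val name = (ast \ "name").extract[String]

    println(name)                 // spark
    println(compact(render(ast))) // {"name":"spark","version":"1.0.0"}
  }
}
```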
@@ -143,10 +181,6 @@
       <groupId>io.netty</groupId>
       <artifactId>netty-all</artifactId>
     </dependency>
-    <dependency>
-      <groupId>log4j</groupId>
-      <artifactId>log4j</artifactId>
-    </dependency>
     <dependency>
       <groupId>com.clearspring.analytics</groupId>
       <artifactId>stream</artifactId>
@@ -206,11 +240,6 @@
       <artifactId>junit-interface</artifactId>
       <scope>test</scope>
     </dependency>
-    <dependency>
-      <groupId>org.slf4j</groupId>
-      <artifactId>slf4j-log4j12</artifactId>
-      <scope>test</scope>
-    </dependency>
   </dependencies>
   <build>
     <outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
@@ -227,7 +256,7 @@
             </goals>
             <configuration>
               <exportAntProperties>true</exportAntProperties>
-              <tasks>
+              <target>
                 <property name="spark.classpath" refid="maven.test.classpath" />
                 <property environment="env" />
                 <fail message="Please set the SCALA_HOME (or SCALA_LIBRARY_PATH if scala is on the path) environment variables and retry.">
@@ -240,7 +269,7 @@
                   </not>
                 </condition>
               </fail>
-              </tasks>
+              </target>
             </configuration>
           </execution>
         </executions>

core/src/main/scala/org/apache/spark/api/java/StorageLevels.java renamed to core/src/main/java/org/apache/spark/api/java/StorageLevels.java

Lines changed: 11 additions & 11 deletions
@@ -23,17 +23,17 @@
  * Expose some commonly useful storage level constants.
  */
 public class StorageLevels {
-  public static final StorageLevel NONE = new StorageLevel(false, false, false, 1);
-  public static final StorageLevel DISK_ONLY = new StorageLevel(true, false, false, 1);
-  public static final StorageLevel DISK_ONLY_2 = new StorageLevel(true, false, false, 2);
-  public static final StorageLevel MEMORY_ONLY = new StorageLevel(false, true, true, 1);
-  public static final StorageLevel MEMORY_ONLY_2 = new StorageLevel(false, true, true, 2);
-  public static final StorageLevel MEMORY_ONLY_SER = new StorageLevel(false, true, false, 1);
-  public static final StorageLevel MEMORY_ONLY_SER_2 = new StorageLevel(false, true, false, 2);
-  public static final StorageLevel MEMORY_AND_DISK = new StorageLevel(true, true, true, 1);
-  public static final StorageLevel MEMORY_AND_DISK_2 = new StorageLevel(true, true, true, 2);
-  public static final StorageLevel MEMORY_AND_DISK_SER = new StorageLevel(true, true, false, 1);
-  public static final StorageLevel MEMORY_AND_DISK_SER_2 = new StorageLevel(true, true, false, 2);
+  public static final StorageLevel NONE = create(false, false, false, 1);
+  public static final StorageLevel DISK_ONLY = create(true, false, false, 1);
+  public static final StorageLevel DISK_ONLY_2 = create(true, false, false, 2);
+  public static final StorageLevel MEMORY_ONLY = create(false, true, true, 1);
+  public static final StorageLevel MEMORY_ONLY_2 = create(false, true, true, 2);
+  public static final StorageLevel MEMORY_ONLY_SER = create(false, true, false, 1);
+  public static final StorageLevel MEMORY_ONLY_SER_2 = create(false, true, false, 2);
+  public static final StorageLevel MEMORY_AND_DISK = create(true, true, true, 1);
+  public static final StorageLevel MEMORY_AND_DISK_2 = create(true, true, true, 2);
+  public static final StorageLevel MEMORY_AND_DISK_SER = create(true, true, false, 1);
+  public static final StorageLevel MEMORY_AND_DISK_SER_2 = create(true, true, false, 2);
 
   /**
    * Create a new StorageLevel object.
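Replacing the `new StorageLevel(...)` constructor calls with the static `create(...)` factory (whose Javadoc is visible at the end of the hunk) routes all construction through one place; one plausible benefit is that equal levels can then be interned and shared. A minimal sketch of that pattern, not Spark's actual implementation:

```scala
import scala.collection.concurrent.TrieMap

// Immutable value describing how a partition is stored.
case class Level(useDisk: Boolean, useMemory: Boolean,
                 deserialized: Boolean, replication: Int)

object Level {
  private val cache = TrieMap.empty[Level, Level]

  // Factory entry point: callers always receive the canonical cached
  // instance for a given combination of flags.
  def create(useDisk: Boolean, useMemory: Boolean,
             deserialized: Boolean, replication: Int): Level = {
    val level = Level(useDisk, useMemory, deserialized, replication)
    cache.getOrElseUpdate(level, level)
  }
}

object LevelSketch {
  def main(args: Array[String]) {
    val a = Level.create(true, true, false, 1)
    val b = Level.create(true, true, false, 1)
    println(a eq b) // true: same interned instance
  }
}
```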
