
Commit 912563a

sryza authored and Andrew Or committed
SPARK-4338. [YARN] Ditch yarn-alpha.
Sorry if this is a little premature with 1.2 still not out the door, but it will make other work like SPARK-4136 and SPARK-2089 a lot easier.

Author: Sandy Ryza <[email protected]>

Closes #3215 from sryza/sandy-spark-4338 and squashes the following commits:

1c5ac08 [Sandy Ryza] Update building Spark docs and remove unnecessary newline
9c1421c [Sandy Ryza] SPARK-4338. Ditch yarn-alpha.
1 parent 383c555 commit 912563a

37 files changed: +96 -928 lines changed
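
In practical terms, the commit collapses the two YARN build paths into one. A before/after sketch of the Maven invocation, using the same versions that appear in the doc changes below (other Hadoop versions follow the same pattern):

    # Before this commit: YARN 0.23.x through 2.1.x went through the now-removed yarn-alpha profile
    mvn -Pyarn-alpha -Phadoop-0.23 -Dhadoop.version=0.23.7 -DskipTests clean package

    # After this commit: the single yarn profile, which targets YARN 2.2.0 and later
    mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package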

assembly/pom.xml

Lines changed: 0 additions & 10 deletions
@@ -169,16 +169,6 @@
   </build>
 
   <profiles>
-    <profile>
-      <id>yarn-alpha</id>
-      <dependencies>
-        <dependency>
-          <groupId>org.apache.spark</groupId>
-          <artifactId>spark-yarn-alpha_${scala.binary.version}</artifactId>
-          <version>${project.version}</version>
-        </dependency>
-      </dependencies>
-    </profile>
     <profile>
       <id>yarn</id>
       <dependencies>

dev/scalastyle

Lines changed: 1 addition & 4 deletions
@@ -18,11 +18,8 @@
 #
 
 echo -e "q\n" | sbt/sbt -Phive -Phive-thriftserver scalastyle > scalastyle.txt
-# Check style with YARN alpha built too
-echo -e "q\n" | sbt/sbt -Pyarn-alpha -Phadoop-0.23 -Dhadoop.version=0.23.9 yarn-alpha/scalastyle \
-  >> scalastyle.txt
 # Check style with YARN built too
-echo -e "q\n" | sbt/sbt -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 yarn/scalastyle \
+echo -e "q\n" | sbt/sbt -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 scalastyle \
   >> scalastyle.txt
 
 ERRORS=$(cat scalastyle.txt | awk '{if($1~/error/)print}')
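
To reproduce the updated check locally, a minimal sketch, assuming it is run from the repository root as dev/scalastyle itself is:

    # Base style check, then the YARN build, appending to the same report
    echo -e "q\n" | sbt/sbt -Phive -Phive-thriftserver scalastyle > scalastyle.txt
    echo -e "q\n" | sbt/sbt -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 scalastyle >> scalastyle.txt
    # Surface any reported errors, as the remainder of the script does
    cat scalastyle.txt | awk '{if($1~/error/)print}'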

docs/building-spark.md

Lines changed: 2 additions & 23 deletions
@@ -60,32 +60,11 @@ mvn -Dhadoop.version=2.0.0-mr1-cdh4.2.0 -DskipTests clean package
 mvn -Phadoop-0.23 -Dhadoop.version=0.23.7 -DskipTests clean package
 {% endhighlight %}
 
-For Apache Hadoop 2.x, 0.23.x, Cloudera CDH, and other Hadoop versions with YARN, you can enable the "yarn-alpha" or "yarn" profile and optionally set the "yarn.version" property if it is different from "hadoop.version". The additional build profile required depends on the YARN version:
-
-<table class="table">
-  <thead>
-    <tr><th>YARN version</th><th>Profile required</th></tr>
-  </thead>
-  <tbody>
-    <tr><td>0.23.x to 2.1.x</td><td>yarn-alpha (Deprecated.)</td></tr>
-    <tr><td>2.2.x and later</td><td>yarn</td></tr>
-  </tbody>
-</table>
-
-Note: Support for YARN-alpha API's will be removed in Spark 1.3 (see SPARK-3445).
+For Apache Hadoop 2.x, 0.23.x, Cloudera CDH, and other Hadoop versions with YARN, you can enable the "yarn" profile and optionally set the "yarn.version" property if it is different from "hadoop.version". As of Spark 1.3, Spark only supports YARN versions 2.2.0 and later.
 
 Examples:
 
 {% highlight bash %}
-# Apache Hadoop 2.0.5-alpha
-mvn -Pyarn-alpha -Dhadoop.version=2.0.5-alpha -DskipTests clean package
-
-# Cloudera CDH 4.2.0
-mvn -Pyarn-alpha -Dhadoop.version=2.0.0-cdh4.2.0 -DskipTests clean package
-
-# Apache Hadoop 0.23.x
-mvn -Pyarn-alpha -Phadoop-0.23 -Dhadoop.version=0.23.7 -DskipTests clean package
-
 # Apache Hadoop 2.2.X
 mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package
 
@@ -99,7 +78,7 @@ Versions of Hadoop after 2.5.X may or may not work with the -Phadoop-2.4 profile
 released after this version of Spark).
 
 # Different versions of HDFS and YARN.
-mvn -Pyarn-alpha -Phadoop-2.3 -Dhadoop.version=2.3.0 -Dyarn.version=0.23.7 -DskipTests clean package
+mvn -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -Dyarn.version=2.2.0 -DskipTests clean package
 {% endhighlight %}
 
 # Building With Hive and JDBC Support
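
The second hunk keeps the mixed-version example in step with the change; restated as a standalone command (the 2.3.0/2.2.0 pairing is the illustrative one from the doc, not a requirement):

    # Build against Hadoop 2.3 HDFS while targeting a YARN 2.2.0 cluster
    mvn -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -Dyarn.version=2.2.0 -DskipTests clean package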

docs/running-on-yarn.md

Lines changed: 1 addition & 1 deletion
@@ -216,7 +216,7 @@ If you need a reference to the proper location to put log files in the YARN so t
 
 # Important notes
 
-- Before Hadoop 2.2, YARN does not support cores in container resource requests. Thus, when running against an earlier version, the numbers of cores given via command line arguments cannot be passed to YARN. Whether core requests are honored in scheduling decisions depends on which scheduler is in use and how it is configured.
+- Whether core requests are honored in scheduling decisions depends on which scheduler is in use and how it is configured.
 - The local directories used by Spark executors will be the local directories configured for YARN (Hadoop YARN config `yarn.nodemanager.local-dirs`). If the user specifies `spark.local.dir`, it will be ignored.
 - The `--files` and `--archives` options support specifying file names with the # similar to Hadoop. For example you can specify: `--files localtest.txt#appSees.txt` and this will upload the file you have locally named localtest.txt into HDFS but this will be linked to by the name `appSees.txt`, and your application should use the name as `appSees.txt` to reference it when running on YARN.
 - The `--jars` option allows the `SparkContext.addJar` function to work if you are using it with local files and running in `yarn-cluster` mode. It does not need to be used if you are using it with HDFS, HTTP, HTTPS, or FTP files.

pom.xml

Lines changed: 0 additions & 7 deletions
@@ -1293,13 +1293,6 @@
       </properties>
     </profile>
 
-    <profile>
-      <id>yarn-alpha</id>
-      <modules>
-        <module>yarn</module>
-      </modules>
-    </profile>
-
     <profile>
       <id>yarn</id>
       <modules>
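
With the alpha profile gone, the yarn module in the root pom is reachable only through the surviving yarn profile. A hedged sketch of building just that module, using standard Maven reactor flags (-pl/-am) against the module path shown in the hunk above:

    # Build only the YARN module plus the modules it depends on
    mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests -pl yarn -am package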

project/SparkBuild.scala

Lines changed: 7 additions & 13 deletions
@@ -38,9 +38,9 @@ object BuildCommons {
     "streaming-flume", "streaming-kafka", "streaming-mqtt", "streaming-twitter",
     "streaming-zeromq").map(ProjectRef(buildLocation, _))
 
-  val optionallyEnabledProjects@Seq(yarn, yarnStable, yarnAlpha, java8Tests,
-    sparkGangliaLgpl, sparkKinesisAsl) = Seq("yarn", "yarn-stable", "yarn-alpha",
-    "java8-tests", "ganglia-lgpl", "kinesis-asl").map(ProjectRef(buildLocation, _))
+  val optionallyEnabledProjects@Seq(yarn, yarnStable, java8Tests, sparkGangliaLgpl,
+    sparkKinesisAsl) = Seq("yarn", "yarn-stable", "java8-tests", "ganglia-lgpl",
+    "kinesis-asl").map(ProjectRef(buildLocation, _))
 
   val assemblyProjects@Seq(assembly, examples, networkYarn) =
     Seq("assembly", "examples", "network-yarn").map(ProjectRef(buildLocation, _))
@@ -79,14 +79,8 @@ object SparkBuild extends PomBuild {
       case None =>
     }
     if (Properties.envOrNone("SPARK_YARN").isDefined) {
-      if(isAlphaYarn) {
-        println("NOTE: SPARK_YARN is deprecated, please use -Pyarn-alpha flag.")
-        profiles ++= Seq("yarn-alpha")
-      }
-      else {
-        println("NOTE: SPARK_YARN is deprecated, please use -Pyarn flag.")
-        profiles ++= Seq("yarn")
-      }
+      println("NOTE: SPARK_YARN is deprecated, please use -Pyarn flag.")
+      profiles ++= Seq("yarn")
     }
     profiles
   }
@@ -335,9 +329,9 @@ object Unidoc {
     publish := {},
 
     unidocProjectFilter in(ScalaUnidoc, unidoc) :=
-      inAnyProject -- inProjects(OldDeps.project, repl, examples, tools, catalyst, streamingFlumeSink, yarn, yarnAlpha),
+      inAnyProject -- inProjects(OldDeps.project, repl, examples, tools, catalyst, streamingFlumeSink, yarn),
     unidocProjectFilter in(JavaUnidoc, unidoc) :=
-      inAnyProject -- inProjects(OldDeps.project, repl, bagel, examples, tools, catalyst, streamingFlumeSink, yarn, yarnAlpha),
+      inAnyProject -- inProjects(OldDeps.project, repl, bagel, examples, tools, catalyst, streamingFlumeSink, yarn),
 
     // Skip class names containing $ and some internal packages in Javadocs
     unidocAllSources in (JavaUnidoc, unidoc) := {
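
The second SparkBuild.scala hunk leaves the deprecated SPARK_YARN environment variable pointing at the single remaining profile. A hedged sketch of the two sbt invocation styles it implies (the assembly target is illustrative; any sbt task works the same way):

    # Deprecated route: prints the NOTE above and enables the yarn profile
    SPARK_YARN=true sbt/sbt assembly

    # Preferred route after this commit
    sbt/sbt -Pyarn assembly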

yarn/README.md

Lines changed: 0 additions & 12 deletions
This file was deleted.

yarn/alpha/pom.xml

Lines changed: 0 additions & 35 deletions
This file was deleted.

yarn/alpha/src/main/scala/org/apache/spark/deploy/yarn/Client.scala

Lines changed: 0 additions & 145 deletions
This file was deleted.
