Commit 918e878

Document "one SparkContext per JVM" limitation.
1 parent: afaa7e3

3 files changed, 8 additions and 0 deletions

core/src/main/scala/org/apache/spark/SparkContext.scala

Lines changed: 3 additions & 0 deletions
@@ -57,6 +57,9 @@ import org.apache.spark.util.{CallSite, ClosureCleaner, MetadataCleaner, Metadat
  * Main entry point for Spark functionality. A SparkContext represents the connection to a Spark
  * cluster, and can be used to create RDDs, accumulators and broadcast variables on that cluster.
  *
+ * Only one SparkContext may be active per JVM. You must `stop()` the active SparkContext before
+ * creating a new one. This limitation will eventually be removed; see SPARK-2243 for more details.
+ *
  * @param config a Spark Config object describing the application configuration. Any settings in
  * this config overrides the default configs as well as system properties.
  */

core/src/main/scala/org/apache/spark/api/java/JavaSparkContext.scala

Lines changed: 3 additions & 0 deletions
@@ -46,6 +46,9 @@ import org.apache.spark.rdd.{EmptyRDD, HadoopRDD, NewHadoopRDD, RDD}
 /**
  * A Java-friendly version of [[org.apache.spark.SparkContext]] that returns
  * [[org.apache.spark.api.java.JavaRDD]]s and works with Java collections instead of Scala ones.
+ *
+ * Only one SparkContext may be active per JVM. You must `stop()` the active SparkContext before
+ * creating a new one. This limitation will eventually be removed; see SPARK-2243 for more details.
  */
 class JavaSparkContext(val sc: SparkContext)
   extends JavaSparkContextVarargsWorkaround with Closeable {
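Because JavaSparkContext simply wraps a SparkContext (the `sc` constructor parameter above), the same one-per-JVM rule applies when using the Java-friendly API; a brief sketch, again not part of this commit and assuming local mode with a hypothetical application name:

    import org.apache.spark.SparkConf
    import org.apache.spark.api.java.JavaSparkContext

    // The wrapper constructs the single active SparkContext for this JVM.
    val jsc = new JavaSparkContext(new SparkConf().setAppName("java-api-app").setMaster("local[2]"))
    println(jsc.sc.parallelize(1 to 5).count())   // the underlying SparkContext is exposed as jsc.sc

    // Stopping the wrapper stops the underlying SparkContext, so a new one may be created afterwards.
    jsc.stop()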

docs/programming-guide.md

Lines changed: 2 additions & 0 deletions
@@ -117,6 +117,8 @@ The first thing a Spark program must do is to create a [SparkContext](api/scala/
 how to access a cluster. To create a `SparkContext` you first need to build a [SparkConf](api/scala/index.html#org.apache.spark.SparkConf) object
 that contains information about your application.
 
+Only one SparkContext may be active per JVM. You must `stop()` the active SparkContext before creating a new one.
+
 {% highlight scala %}
 val conf = new SparkConf().setAppName(appName).setMaster(master)
 new SparkContext(conf)
