12 changes: 10 additions & 2 deletions sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
@@ -117,6 +117,12 @@ class SparkSession private(
existingSharedState.getOrElse(new SharedState(sparkContext))
}

/**
* Initial options for the session. These options are applied once when sessionState is created.
*/
@transient
private[sql] val initialSessionOptions = new scala.collection.mutable.HashMap[String, String]

/**
* State isolated across sessions, including SQL configurations, temporary tables, registered
* functions, and everything else that accepts a [[org.apache.spark.sql.internal.SQLConf]].
@@ -132,9 +138,11 @@ class SparkSession private(
parentSessionState
.map(_.clone(this))
.getOrElse {
SparkSession.instantiateSessionState(
val state = SparkSession.instantiateSessionState(
SparkSession.sessionStateClassName(sparkContext.conf),
self)
initialSessionOptions.foreach { case (k, v) => state.conf.setConfString(k, v) }
state
}
}

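With the hunk above, builder options buffered in the new initialSessionOptions map are replayed onto the session's SQLConf only when sessionState is lazily instantiated, so they survive until that point instead of requiring sessionState to exist at builder time. A minimal usage sketch of the intended behavior; the option key below is hypothetical, and only the standard builder/config/getOrCreate/conf.get calls are assumed:

    import org.apache.spark.sql.SparkSession

    // Hypothetical SQL option key, purely for illustration.
    val spark = SparkSession.builder()
      .master("local[1]")
      .config("spark.sql.some.option", "some-value")
      .getOrCreate()

    // sessionState is created lazily; the change above applies the buffered
    // initialSessionOptions at that point, so the builder option is still
    // visible through the session's runtime configuration.
    assert(spark.conf.get("spark.sql.some.option") == "some-value")
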
@@ -940,7 +948,7 @@ object SparkSession
}

session = new SparkSession(sparkContext, None, None, extensions)
options.foreach { case (k, v) => session.sessionState.conf.setConfString(k, v) }
Member:
options are not set by mergeSparkConf. Removing this line is wrong.

Member:
The original fix from @cloud-fan is to set them to sparkContext.conf.

Member Author (@dongjoon-hyun):
Thank you for the review, @gatorsmile.
I see. Then @cloud-fan meant the whole #18172, instead of that line.

options.foreach { case (k, v) => session.initialSessionOptions.put(k, v) }
Contributor (@cloud-fan):
I tried to fix this problem in https://github.com/apache/spark/pull/18172/files#diff-d91c284798f1c98bf03a31855e26d71cL938, maybe you can pick it up? Your approach adds a new concept (initialSessionOptions), while SparkSession initialization is already very complex...

Member Author (@dongjoon-hyun, Jul 2, 2017):
Thank you for the review, @cloud-fan. You mean simply deleting this line, right?
That would be great.
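
For contrast with the approach in this diff, the alternative discussed in the threads above (attributed to @cloud-fan's earlier fix) is to push the builder options into sparkContext.conf before the session is constructed, so that the later merge of Spark settings into the session's SQLConf picks them up. A rough sketch of that idea, not the exact change proposed in #18172:

    // Illustration only: fold the builder options into the SparkContext conf
    // so that session-state construction, which merges settings from
    // sparkContext.conf, can see them without an initialSessionOptions map.
    options.foreach { case (k, v) => sparkContext.conf.set(k, v) }
    session = new SparkSession(sparkContext, None, None, extensions)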

defaultSession.set(session)

// Register a successfully instantiated context to the singleton. This should be at the