[SPARK-31949][SQL] Add spark.default.parallelism in SQLConf for isolated across session #28778
```diff
@@ -180,6 +180,15 @@ class SparkSession private(
    */
   @transient lazy val conf: RuntimeConfig = new RuntimeConfig(sessionState.conf)
 
+  /**
+   * Same as `spark.default.parallelism`, can be isolated across sessions.
+   *
+   * @since 3.1.0
+   */
+  def defaultParallelism: Int = {
+    sessionState.conf.defaultParallelism.getOrElse(sparkContext.defaultParallelism)
+  }
+
   /**
    * An interface to register custom [[org.apache.spark.sql.util.QueryExecutionListener]]s
    * that listen for execution metrics.
```

Review comments on this hunk:

> **Contributor** (on `def defaultParallelism`): I'd like to not have this API, as …
>
> **Contributor** (on the `getOrElse` line): So we add a config whose only usage is to let users get the config value?
>
> **Author**: As I said above. If we add this config, I will move the exists …
>
> **Author**: Just do this in this PR?
>
> **Contributor**: Please do, otherwise it's a useless config.
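The fallback behavior of the new method can be modeled in isolation. A minimal sketch in plain Scala with no Spark dependency (`resolveParallelism` and `ParallelismFallback` are hypothetical names for illustration, not part of the PR):

```scala
// Models the fallback in the new SparkSession.defaultParallelism:
// a session-local value, when set, wins over the context-wide default.
object ParallelismFallback {
  def resolveParallelism(sessionValue: Option[Int], contextDefault: Int): Int =
    sessionValue.getOrElse(contextDefault)

  def main(args: Array[String]): Unit = {
    // Session A has set the session-local config; session B has not.
    println(resolveParallelism(Some(8), contextDefault = 200))  // 8
    println(resolveParallelism(None, contextDefault = 200))     // 200
  }
}
```

Because the session value is an `Option[Int]`, two sessions sharing one `SparkContext` can resolve different parallelism values, which is the isolation this PR is after.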
|
```diff
@@ -513,7 +522,7 @@ class SparkSession private(
    * @since 2.0.0
    */
   def range(start: Long, end: Long): Dataset[java.lang.Long] = {
-    range(start, end, step = 1, numPartitions = sparkContext.defaultParallelism)
+    range(start, end, step = 1, numPartitions = defaultParallelism)
   }
 
   /**
```

```diff
@@ -523,7 +532,7 @@ class SparkSession private(
    * @since 2.0.0
    */
   def range(start: Long, end: Long, step: Long): Dataset[java.lang.Long] = {
-    range(start, end, step, numPartitions = sparkContext.defaultParallelism)
+    range(start, end, step, numPartitions = defaultParallelism)
   }
 
   /**
```
Review comments on the config name:

> **Contributor**: `spark.sql.default.parallelism` -> `spark.sql.sessionLocalDefaultParallelism`?
>
> **Author**: Hmm, is it better to keep it similar to `spark.default.parallelism`, so we can set this config easily? `sessionLocalDefaultParallelism` seems complex.
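For context, SQL configs in Spark are declared with the `buildConf` DSL in `SQLConf`. A hedged sketch of what the entry under discussion might look like, using the `spark.sql.default.parallelism` name debated above (the doc string, version, and exact placement are assumptions, not the PR's actual text):

```scala
// Hypothetical SQLConf entry (sketch only, not the exact code of the PR).
// createOptional yields an Option[Int], which is what lets SparkSession fall
// back to sparkContext.defaultParallelism when the session never sets it.
val DEFAULT_PARALLELISM = buildConf("spark.sql.default.parallelism")
  .doc("Session-local default parallelism. Falls back to " +
    "spark.default.parallelism when not set.")
  .version("3.1.0")
  .intConf
  .createOptional
```

An optional config (rather than one with a hard default) is what makes the reviewer's naming question matter: the session-level name should signal that it shadows, not replaces, the cluster-wide `spark.default.parallelism`.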