[SPARK-32160][CORE][PYSPARK][3.0] Add a config to switch allow/disallow to create SparkContext in executors. #29294
Conversation
Test build #126794 has finished for PR 29294 at commit

Test build #126809 has finished for PR 29294 at commit

Jenkins, retest this please.

Test build #126833 has finished for PR 29294 at commit

Jenkins, retest this please.

Test build #126861 has finished for PR 29294 at commit

retest this please

retest this please

Test build #126885 has finished for PR 29294 at commit

Merged to branch-3.0.
…ow to create SparkContext in executors

### What changes were proposed in this pull request?
This is a backport of #29278, but allowing the creation of `SparkContext` in executors by default. This PR adds a config to allow/disallow creating a `SparkContext` in executors:
- `spark.driver.allowSparkContextInExecutors`

### Why are the changes needed?
Some users or libraries actually create `SparkContext` in executors. We shouldn't break their workloads.

### Does this PR introduce _any_ user-facing change?
Yes, users will be able to disallow creating a `SparkContext` in executors by disabling the config.

### How was this patch tested?
More tests were added.

Closes #29294 from ueshin/issues/SPARK-32160/3.0/add_configs.

Authored-by: Takuya UESHIN <[email protected]>
Signed-off-by: HyukjinKwon <[email protected]>
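The guard described above can be sketched in plain Python. Note this is an illustrative sketch only: the names below (`assert_on_driver`, `SparkContextCreationError`) are hypothetical stand-ins, not Spark's actual internals. It mirrors the behavior this PR adds: creating a `SparkContext` on an executor fails unless the `spark.driver.allowSparkContextInExecutors` config permits it.

```python
class SparkContextCreationError(RuntimeError):
    """Hypothetical error type for the sketch below."""
    pass


def assert_on_driver(is_driver: bool, allow_in_executors: bool) -> None:
    """Raise if a SparkContext would be created on an executor while the
    allow-in-executors config is disabled."""
    if not is_driver and not allow_in_executors:
        raise SparkContextCreationError(
            "SparkContext should only be created and accessed on the driver.")


# Default in this backport: creation in executors is allowed,
# so existing workloads are not broken.
assert_on_driver(is_driver=False, allow_in_executors=True)  # no error
```

With the config disabled, the same call on an executor would raise instead, which is the user-facing switch this backport introduces.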
I have a question about the configuration namespace. Please see the original PR.

Yeah, it makes sense. Thanks for pointing that out.
… switch allow/disallow SparkContext in executors

### What changes were proposed in this pull request?
This is a follow-up of #29294. This PR changes the name of the config that allows/disallows a `SparkContext` in executors, as per the comment on #29278 (review).

### Why are the changes needed?
The config name `spark.executor.allowSparkContext` is more reasonable.

### Does this PR introduce _any_ user-facing change?
Yes, the config name is changed.

### How was this patch tested?
Updated tests.

Closes #29341 from ueshin/issues/SPARK-32160/3.0/change_config_name.

Authored-by: Takuya UESHIN <[email protected]>
Signed-off-by: HyukjinKwon <[email protected]>
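As a sketch of how the renamed config might be set after this change (the file path and application name here are illustrative, not from the PR), either in `spark-defaults.conf` or on the `spark-submit` command line:

```
# conf/spark-defaults.conf -- disallow creating a SparkContext in executors
spark.executor.allowSparkContext  false
```

```shell
# Equivalent per-job override via spark-submit (app.py is a placeholder)
spark-submit --conf spark.executor.allowSparkContext=false app.py
```

Leaving the config at its default keeps the backport's permissive behavior, so workloads that create a `SparkContext` in executors continue to run.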