
Conversation

@WweiL
Contributor

@WweiL WweiL commented Aug 4, 2023

What changes were proposed in this pull request?

Follow-up to this comment: #42283 (comment)
Set the Spark conf PYTHON_USE_DAEMON back to its previous value after creating the streaming Python worker process.
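For illustration, a minimal sketch of the save-and-restore pattern this follow-up applies around worker creation; `conf`, `env`, `pythonExec`, `workerModule`, and `envVars` stand in for the runner's actual fields, so treat the exact names as assumptions rather than the merged code:

// Sketch only: fork the streaming Python worker without the daemon, then
// restore the previous PYTHON_USE_DAEMON value so other workloads are unaffected.
val prevUseDaemon = conf.get(PYTHON_USE_DAEMON)
conf.set(PYTHON_USE_DAEMON, false)
try {
  // Worker creation consults PYTHON_USE_DAEMON, hence it is set to false first.
  env.createPythonWorker(pythonExec, workerModule, envVars.asScala.toMap)
} finally {
  conf.set(PYTHON_USE_DAEMON, prevUseDaemon)
}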

Why are the changes needed?

Bug fix

Does this PR introduce any user-facing change?

No

How was this patch tested?

Config-only change.

@WweiL WweiL changed the title from "finish" to "[SPARK-44433][PYTHON][CONNECT][SS][FOLLOWUP] Set back USE_DAEMON after creating streaming python processes" on Aug 4, 2023
@WweiL
Contributor Author

WweiL commented Aug 4, 2023

cc @ueshin Can you take a look? Thank you so much!

This doesn't need to go to 3.5. The change was added with the backport PR: https://github.com/apache/spark/pull/42340/files

Comment on lines 109 to 115
val prevConf = conf.get(PYTHON_USE_DAEMON)
conf.set(PYTHON_USE_DAEMON, false)
try {
  SparkEnv.get.destroyPythonWorker(pythonExec, workerModule, envVars.asScala.toMap, worker)
} finally {
  conf.set(PYTHON_USE_DAEMON, prevConf)
}
Member


I don't think this change is needed, as env.destroyPythonWorker doesn't refer to conf.
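In other words, the teardown path can presumably call the destroy helper directly, without saving and restoring the conf; a sketch of that simplification, reusing the fields from the diff above (the merged change may differ in detail):

// If destroyPythonWorker never reads PYTHON_USE_DAEMON, the save/restore
// wrapper above is unnecessary and the call can stand on its own.
SparkEnv.get.destroyPythonWorker(pythonExec, workerModule, envVars.asScala.toMap, worker)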

Member

@ueshin ueshin left a comment


LGTM, pending tests.

@ueshin
Member

ueshin commented Aug 5, 2023

Thanks! merging to master.

@ueshin ueshin closed this in cf64008 Aug 5, 2023