[SPARK-36566][K8S] Add Spark appname as a label to pods #34460
Conversation
cc those who marked SPARK-36566: @holdenk @HyukjinKwon
Test build #144813 has finished for PR 34460 at commit
Kubernetes integration test starting
Kubernetes integration test status failure
Thank you, @Yikun.
We had better make the PR description and commit log as independent as possible. Could you elaborate a little more on the difference between spark-app-selector and spark-app-name? It will become a helpful commit log for the other community members.
Sure, done!
Test build #144828 has finished for PR 34460 at commit
Kubernetes integration test starting
Kubernetes integration test status failure
Need to address invalid label name...
Test build #144840 has finished for PR 34460 at commit
Kubernetes integration test starting
Kubernetes integration test status failure
Label value length limit: at most 63 characters; pod name limit: at most 253 characters.
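Because of this mismatch, an app name that is valid as (part of) a pod name may still be too long or contain characters that are invalid in a label value. A minimal Python sketch of the kind of sanitization needed (a hypothetical helper for illustration, not Spark's actual implementation):

```python
import re

LABEL_VALUE_MAX = 63   # Kubernetes label values: at most 63 characters
POD_NAME_MAX = 253     # pod names (DNS subdomains): at most 253 characters

def sanitize_label_value(name: str) -> str:
    """Turn an arbitrary app name into a valid Kubernetes label value.

    Valid non-empty values consist of alphanumerics, '-', '_' and '.',
    and must begin and end with an alphanumeric character.
    """
    # Replace characters that are not allowed in label values with '-'.
    value = re.sub(r"[^A-Za-z0-9._-]", "-", name)
    # Enforce the 63-character limit.
    value = value[:LABEL_VALUE_MAX]
    # Strip leading/trailing non-alphanumeric characters.
    return value.strip("-_.")

print(sanitize_label_value("My Spark App! " + "x" * 200))
```

This is only a sketch of the constraint the reviewer is pointing at; the actual validation and truncation logic in the merged patch lives in Spark's Kubernetes utilities.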
Test build #144848 has finished for PR 34460 at commit
Kubernetes integration test starting
Kubernetes integration test status failure
@dongjoon-hyun Looks like the failure is unrelated; could you take a look again? Thanks!
Sure, thank you for updating, @Yikun.
dongjoon-hyun left a comment
+1, LGTM. Thank you, @Yikun .
Merged to master for Apache Spark 3.3.
…as application name than pod name

### Why are the changes needed?
After apache/spark#34460 (since Spark 3.3.0), the `spark-app-name` label is available. We shall use it as the application name if it exists.

### How was this patch tested?
Minor change.

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes #7034 from turboFei/k8s_app_name.
Closes #7034
bfa88a4 [Wang, Fei] Get pod app name
Authored-by: Wang, Fei <[email protected]>
Signed-off-by: Wang, Fei <[email protected]>
What changes were proposed in this pull request?
Add the Spark app name as a label to pods.
Note that:
- SPARK_APP_ID_LABEL is the unique Spark application ID with a "spark-" prefix (like "spark-{applicationId}").
- SPARK_APP_NAME_LABEL, added in this patch, is the Spark app name; it is more friendly and readable for a Kubernetes cluster maintainer who wants to figure out the Spark app name of specific pods.

Why are the changes needed?
We can then list all of an application's pods (driver/executor) with:
kubectl get pods -l spark-app-name=xxx
and also figure out the Spark app name of specific pods.

Does this PR introduce any user-facing change?
Yes, a new label is added to pods.

How was this patch tested?
Added a unit test.
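To illustrate what the label selector above buys you, here is a small Python sketch that filters a hypothetical pod list by label equality, mimicking what `kubectl get pods -l spark-app-name=myapp` does on the API server side (the pod names, app names, and IDs below are made up for the example):

```python
# Hypothetical pod metadata for illustration, shaped like the labels Spark
# attaches to pods on Kubernetes (names and IDs are invented).
pods = [
    {"name": "myapp-driver",
     "labels": {"spark-app-selector": "spark-abc123", "spark-app-name": "myapp"}},
    {"name": "myapp-exec-1",
     "labels": {"spark-app-selector": "spark-abc123", "spark-app-name": "myapp"}},
    {"name": "other-driver",
     "labels": {"spark-app-selector": "spark-def456", "spark-app-name": "other"}},
]

def select_pods(pods, selector):
    """Return pods whose labels match every key=value pair in the selector,
    the same equality matching that `kubectl get pods -l key=value` performs."""
    return [p for p in pods
            if all(p["labels"].get(k) == v for k, v in selector.items())]

# Equivalent of: kubectl get pods -l spark-app-name=myapp
matches = select_pods(pods, {"spark-app-name": "myapp"})
print([p["name"] for p in matches])
```

The app-ID label selects pods of one specific run, while the app-name label groups all pods by the human-chosen application name, which is the readability benefit the PR describes.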