
Conversation

dongjoon-hyun (Member) commented on Feb 12, 2025

What changes were proposed in this pull request?

This PR aims to remove the TaskState.LOST handling logic from TaskSchedulerImpl in Apache Spark 4.1.0.

if (state == TaskState.LOST) {
  // TaskState.LOST is only used by the deprecated Mesos fine-grained scheduling mode,
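For context, the removed branch continued roughly as in the sketch below. This is a paraphrase from memory, not the verbatim Spark source; names such as `taskIdToExecutorId`, `executorIdToRunningTaskIds`, `removeExecutor`, and `ExecutorProcessLost` are assumed to match the surrounding scheduler code and may differ in detail.

```scala
// Paraphrased sketch of the removed branch (not the verbatim source).
// LOST could only come from the Mesos fine-grained mode, where one executor
// ran exactly one task, so a lost task implied a lost executor.
if (state == TaskState.LOST) {
  // Look up the executor that ran this task (assumed helper map).
  val execId = taskIdToExecutorId.getOrElse(tid,
    throw new IllegalStateException(s"Task $tid has no executor registered"))
  if (executorIdToRunningTaskIds.contains(execId)) {
    // Remember the failure so the DAGScheduler can be told about the lost
    // executor after the scheduler lock is released.
    reason = Some(ExecutorProcessLost(
      s"Task $tid was lost, so marking the executor as lost as well."))
    removeExecutor(execId, reason.get)
    failedExecutor = Some(execId)
  }
}
```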

Why are the changes needed?

Since Apache Spark 4.0.0, the Apache Mesos code has been removed completely. Because TaskState.LOST is no longer used, this unused code should be cleaned up to simplify the logic.

Please note that this PR does not aim to remove the LOST enum value from TaskState itself, because that value is still exposed as shown in the screenshot and the sketch below, even though the object is private[spark].

[Screenshot 2025-02-12 at 13:11:03: where TaskState.LOST is still exposed]
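For reference, TaskState follows the standard scala.Enumeration pattern. The sketch below is a minimal approximation, not the verbatim org.apache.spark.TaskState source; it is only meant to show why deleting LOST would change the enumeration that other code matches on, whereas this PR only deletes the scheduler branch that produced it.

```scala
package org.apache.spark

// Minimal sketch of a TaskState-style enumeration (an approximation assuming
// the usual scala.Enumeration pattern; the real object may differ in detail).
private[spark] object TaskState extends Enumeration {

  // LOST stays in the value list: removing it would change the set of states
  // visible to everything that pattern-matches on TaskState values.
  val LAUNCHED, RUNNING, FINISHED, FAILED, KILLED, LOST = Value

  private val FINISHED_STATES = Set(FINISHED, FAILED, KILLED, LOST)

  type TaskState = Value

  def isFailed(state: TaskState): Boolean = (LOST == state) || (FAILED == state)

  def isFinished(state: TaskState): Boolean = FINISHED_STATES.contains(state)
}
```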

Does this PR introduce any user-facing change?

No behavior change.

How was this patch tested?

Pass the CIs.

Was this patch authored or co-authored using generative AI tooling?

No.

github-actions bot added the CORE label on Feb 12, 2025
dongjoon-hyun (Member, Author) commented:

Could you review this PR when you have some time, @viirya?

viirya (Member) commented on Feb 12, 2025

Looks good to me. Thanks @dongjoon-hyun

dongjoon-hyun (Member, Author) commented:

Thank you so much, @viirya!

dongjoon-hyun (Member, Author) commented:

Merged to master for Apache Spark 4.1.0.

dongjoon-hyun deleted the SPARK-51184 branch on February 12, 2025 at 23:25
dongjoon-hyun pushed a commit that referenced this pull request on Mar 9, 2025:
…dulerImpl#statusUpdate`

### What changes were proposed in this pull request?
After SPARK-51184 (#49913), the variables `failedExecutor` and `reason` defined in `TaskSchedulerImpl#statusUpdate` are never reassigned after being initialized to `None`.

https://github.com/apache/spark/blob/f40bf4dd7166e86ea5cc9962e8c5b68b88f8dcb5/core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala#L783-L811

Therefore, `failedExecutor.isDefined` will always be false, and the following `if` branch will never be entered:

https://github.com/apache/spark/blob/f40bf4dd7166e86ea5cc9962e8c5b68b88f8dcb5/core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala#L812-L817.

So this PR cleans it up.
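Stripped of the scheduler details, the pattern being removed reduces to the self-contained illustration below. The names are hypothetical and the body is simplified; it is not the actual Spark source.

```scala
// Both Options start as None; with the TaskState.LOST branch gone, no code
// path reassigns them, so the isDefined check below can never succeed.
object DeadBranchIllustration {
  def statusUpdate(tid: Long): Unit = {
    var failedExecutor: Option[String] = None
    var reason: Option[String] = None

    // ... per-task state handling; after SPARK-51184 nothing assigns
    // failedExecutor or reason anymore ...

    if (failedExecutor.isDefined) {
      // Unreachable: failedExecutor is always None here, so this block and
      // both vars can simply be deleted.
      println(s"Executor ${failedExecutor.get} lost: ${reason.getOrElse("unknown")}")
    }
  }

  def main(args: Array[String]): Unit = statusUpdate(1L)
}
```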

### Why are the changes needed?
Code cleanup.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
Pass GitHub Actions

### Was this patch authored or co-authored using generative AI tooling?
No

Closes #50218 from LuciferYang/ts-statusUpdate.

Authored-by: yangjie01 <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>