Conversation


@AngersZhuuuu AngersZhuuuu commented Feb 21, 2022

What changes were proposed in this pull request?

Currently, the Spark SQL CLI always uses a shutdown hook to stop SparkSQLEnv:

```
// Clean up after we exit
ShutdownHookManager.addShutdownHook { () => SparkSQLEnv.stop() }
```

but uses the process return code (`ret`) to exit the client-side JVM:

```
while (line != null) {
  if (!line.startsWith("--")) {
    if (prefix.nonEmpty) {
      prefix += '\n'
    }
    if (line.trim().endsWith(";") && !line.trim().endsWith("\\;")) {
      line = prefix + line
      ret = cli.processLine(line, true)
      prefix = ""
      currentPrompt = promptWithCurrentDB
    } else {
      prefix = prefix + line
      currentPrompt = continuedPromptWithDBSpaces
    }
  }
  line = reader.readLine(currentPrompt + "> ")
}

sessionState.close()

System.exit(ret)
}
```
```
    if (sessionState.execString != null) {
      exitCode = cli.processLine(sessionState.execString)
      System.exit(exitCode)
    }

    try {
      if (sessionState.fileName != null) {
        exitCode = cli.processFile(sessionState.fileName)
        System.exit(exitCode)
      }
    } catch {
      case e: FileNotFoundException =>
        logError(s"Could not open input file for reading. (${e.getMessage})")
        exitCode = 3
        System.exit(exitCode)
    }
```

As a result, the client-side exit code is not always consistent with the AM's.

In this PR, I pass the exit code to the `SparkContext.stop()` method so that a clear client-side status is reported to the AM in client mode.

This PR adds a new `stop` method to `SchedulerBackend`:

```
def stop(exitCode: Int): Unit = stop()
```

Since it has a default implementation, we don't need to implement it for every kind of scheduler backend.
This PR only handles the case we met with `YarnClientSchedulerBackend`: by overriding `stop(exitCode: Int)` in that class alone, YARN client mode now works correctly.

I think this can benefit many similar cases in the future.
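The default-overload pattern described above can be sketched as follows. This is a minimal, self-contained illustration, not the actual Spark code: `LocalBackend` and `YarnClientLikeBackend` are hypothetical names standing in for backends that do and don't care about the exit code.

```scala
// Base trait: the new overload delegates to the existing stop(),
// so existing backends compile unchanged.
trait SchedulerBackend {
  def stop(): Unit
  // New overload with a default implementation.
  def stop(exitCode: Int): Unit = stop()
}

// A backend that ignores the exit code gets the default for free.
class LocalBackend extends SchedulerBackend {
  var stopped = false
  override def stop(): Unit = { stopped = true }
}

// A YARN-client-style backend overrides the overload to record
// (in the real PR: forward to the AM) the client-side exit code.
class YarnClientLikeBackend extends SchedulerBackend {
  var reportedExitCode: Option[Int] = None
  override def stop(): Unit = stop(0)
  override def stop(exitCode: Int): Unit = { reportedExitCode = Some(exitCode) }
}
```

With this shape, only backends that can propagate a status need to override `stop(exitCode: Int)`; everyone else silently falls back to the old `stop()` behavior.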

Why are the changes needed?

Keeps the client-side status consistent with the AM side.

Does this PR introduce any user-facing change?

With this PR, the client-side exit status will match the AM side.

How was this patch tested?

MT (manual test)

@LuciferYang
Contributor

IN this pr I prefer to pass the exit code to SparkContext.stop() method to pass a clear cliet side status to AM side in client mode. And I think this can benefit many similar case in future.

cliet -> client

@github-actions github-actions bot added the BUILD label Feb 22, 2022
@AngersZhuuuu
Contributor Author

IN this pr I prefer to pass the exit code to SparkContext.stop() method to pass a clear cliet side status to AM side in client mode. And I think this can benefit many similar case in future.

cliet -> client

updated

@AngersZhuuuu
Contributor Author

gentle ping @cloud-fan @tgravescs @mridulm

@cloud-fan
Contributor

cloud-fan commented Feb 22, 2022

cc @bogdanghit @yaooqinn


@LuciferYang LuciferYang left a comment


LGTM +1

@AngersZhuuuu
Contributor Author

gentle ping @cloud-fan @bogdanghit

@bogdanghit
Contributor

@AngersZhuuuu does this address the CliSuite flakiness I reported a while back?

@AngersZhuuuu
Contributor Author

@AngersZhuuuu does this address the CliSuite flakiness I reported a while back?

Can you provide the link? Literally, it should be irrelevant

@AngersZhuuuu
Contributor Author

gentle ping @bogdanghit

@github-actions

We're closing this PR because it hasn't been updated in a while. This isn't a judgement on the merit of the PR in any way. It's just a way of keeping the PR queue manageable.
If you'd like to revive this PR, please reopen it and ask a committer to remove the Stale tag!

@github-actions github-actions bot added the Stale label Sep 27, 2022
@github-actions github-actions bot closed this Sep 28, 2022
@cloud-fan cloud-fan removed the Stale label Sep 28, 2022
@cloud-fan cloud-fan reopened this Sep 28, 2022
@cloud-fan
Contributor

@AngersZhuuuu can you rebase? We should merge this PR.

@yaooqinn
Member

+1, and sorry for not merging it after my approval

@AngersZhuuuu
Contributor Author

@AngersZhuuuu can you rebase? We should merge this PR.

Done

@AngersZhuuuu
Contributor Author

ping @cloud-fan @yaooqinn @LuciferYang

@cloud-fan
Contributor

thanks, merging to master!

@cloud-fan cloud-fan closed this in 026b3d4 Nov 1, 2022
HyukjinKwon added a commit that referenced this pull request Nov 3, 2022
…tElementNames in Mima for Scala 2.13

### What changes were proposed in this pull request?
This PR is a followup of #35594 that recovers Mima compatibility test for Scala 2.13.

### Why are the changes needed?

To fix the broken Mima build (https://github.com/apache/spark/actions/runs/3380379538/jobs/5613108397)

```
[error] spark-core: Failed binary compatibility check against org.apache.spark:spark-core_2.13:3.3.0! Found 2 potential problems (filtered 945)
[error]  * method productElementName(Int)java.lang.String in object org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages#Shutdown does not have a correspondent in current version
[error]    filter with: ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages#Shutdown.productElementName")
[error]  * method productElementNames()scala.collection.Iterator in object org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages#Shutdown does not have a correspondent in current version
[error]    filter with: ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages#Shutdown.productElementNames")
```
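For context on the suggested fix: in Spark, such binary-compatibility exclusions are typically registered in `project/MimaExcludes.scala`. The fragment below is a hypothetical sketch of what a followup registration could look like (the surrounding sequence name and placement are assumed, not taken from the actual followup commit); it is a build-config fragment, not standalone code.

```scala
// Sketch: entries added to the current release's exclusion list
// in project/MimaExcludes.scala (placement assumed).
ProblemFilters.exclude[DirectMissingMethodProblem](
  "org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages#Shutdown.productElementName"),
ProblemFilters.exclude[DirectMissingMethodProblem](
  "org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages#Shutdown.productElementNames")
```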

### Does this PR introduce _any_ user-facing change?
No, dev-only.

### How was this patch tested?
CI in this PR should test it out. After that, scheduled jobs for Scala 2.13 will test this out

Closes #38492 from HyukjinKwon/SPARK-38270-followup.

Authored-by: Hyukjin Kwon <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
SandishKumarHN pushed a commit to SandishKumarHN/spark that referenced this pull request Dec 12, 2022
… client side

Closes apache#35594 from AngersZhuuuu/SPARK-38270.

Lead-authored-by: Angerszhuuuu <[email protected]>
Co-authored-by: AngersZhuuuu <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
SandishKumarHN pushed a commit to SandishKumarHN/spark that referenced this pull request Dec 12, 2022
…tElementNames in Mima for Scala 2.13
