[SPARK-43648][CONNECT][TESTS] Make interrupt all related test can be tested using maven
#41483
Conversation
An overly complex way that doesn't look good:
```scala
val session = spark
import session.implicits._
implicit val ec = ExecutionContext.global
assume(transferredClientTestJarIfNeed && transferredClientJarIfNeed)
```
Because ClientE2ETestSuite imports classes such as SparkResult from the client module, these two test cases also need to add spark-connect-client-jvm.jar as an artifact.
So I think it would be better to move them from ClientE2ETestSuite to a new test class, as in #41487.
Perhaps we can try using SparkResult.class as an artifact, but this may make the testing code increasingly complex.
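For context, a minimal sketch of what "using SparkResult.class as an artifact" might involve: resolving the on-disk location of a class's `.class` file, which could then be shipped to the server (e.g. via an artifact-transfer API). This is an illustrative assumption, not the PR's actual code; the object and method names are invented.

```scala
// Hypothetical helper (not the PR's code): resolve the location of a class's
// .class file so it could be registered as an artifact for the Connect server.
object ClassFileLocator {
  def classFilePath(clazz: Class[_]): Option[String] = {
    // Translate the binary class name into a classpath resource name.
    val resourceName = clazz.getName.replace('.', '/') + ".class"
    // Bootstrap-loaded classes report a null classloader; fall back to the
    // system classloader in that case.
    val loader = Option(clazz.getClassLoader)
      .getOrElse(ClassLoader.getSystemClassLoader)
    Option(loader.getResource(resourceName)).map(_.toString)
  }
}
```

The drawback the comment alludes to: every class the test touches transitively would need the same treatment, which is why shipping whole jars (as the PR does) is simpler.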
```scala
Seq("--conf", s"spark.sql.catalogImplementation=$catalogImplementation")
}

// For UDF maven E2E tests, the server needs the client code to find the UDFs defined in tests.
```
To avoid classloader visibility issues, we need to transfer both spark-connect-client-jvm.jar and spark-connect-client-jvm-tests.jar as artifacts and remove this config.
```sh
echo "Using SPARK_LOCAL_IP=$SPARK_LOCAL_IP" 1>&2
fi

export USE_MAVEN=1
```
Only Maven triggers this issue, so an environment variable is added to identify Maven test runs.
```scala
protected lazy val serverPort: Int = port
protected lazy val transferredClientJarIfNeed: Boolean = {
  if (sys.env.contains("USE_MAVEN")) {
    if (RemoteSparkSession.transferredClientJar.get()) {
```
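A minimal, hedged sketch of the Maven-run detection in the snippet above (the helper name and the injectable `env` parameter are assumptions added for testability, not the PR's code):

```scala
// Sketch: decide whether to run the Maven-only artifact-transfer path.
// USE_MAVEN is exported by the launch script in this PR; taking the
// environment as a parameter makes the predicate easy to unit-test.
def isMavenTest(env: Map[String, String] = sys.env): Boolean =
  env.contains("USE_MAVEN")
```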
It seems that the same artifact cannot be transferred repeatedly, so a status flag is added to ensure it is only transferred once.
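The once-only guard can be sketched with an `AtomicBoolean` and `compareAndSet`, which makes the first caller perform the transfer and turns later calls into no-ops. The object and method names below are illustrative, not the PR's exact code:

```scala
import java.util.concurrent.atomic.AtomicBoolean

// Sketch of a once-only transfer guard shared across test suites.
object ArtifactTransferGuard {
  private val transferred = new AtomicBoolean(false)

  // Runs `transfer` at most once, even if several suites race to call it;
  // returns true once the transfer has happened.
  def transferOnce(transfer: () => Unit): Boolean = {
    if (transferred.compareAndSet(false, true)) {
      transfer()
    }
    transferred.get()
  }
}
```

`compareAndSet(false, true)` succeeds for exactly one caller, so a repeated-transfer error on the server side cannot occur.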
```scala
// SBT passes the client & test jars to the server process automatically.
// So we skip building or finding this jar for SBT.
"sbt-tests-do-not-need-this-jar",
"spark-connect-client-jvm")
```
spark-connect-client-jvm is not guaranteed to exist during Maven testing, so the test may also be skipped in that case.
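The "jar may not exist" check can be sketched as a lookup by name prefix under a build output directory; when it returns `None`, the suite can skip itself (e.g. via ScalaTest's `assume`). The directory layout and helper name below are assumptions, not the PR's code:

```scala
import java.io.File

// Hedged sketch: find a client jar by prefix under a target directory.
// Returns None when the directory or a matching jar is missing, which the
// caller can treat as "skip this test".
def findClientJar(targetDir: String, prefix: String): Option[File] = {
  val dir = new File(targetDir)
  // listFiles returns null for a missing or non-directory path.
  Option(dir.listFiles()).getOrElse(Array.empty[File])
    .find(f => f.getName.startsWith(prefix) && f.getName.endsWith(".jar"))
}
```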
What changes were proposed in this pull request?
Why are the changes needed?
Does this PR introduce any user-facing change?
How was this patch tested?