Conversation

@gengliangwang gengliangwang commented Jun 11, 2020

What changes were proposed in this pull request?

This PR updates the test case to accept Hadoop 2/3 error message correctly.

Why are the changes needed?

SPARK-31935 (#28760) broke the Hadoop 3.2 unit tests because Hadoop 2 and Hadoop 3 throw different exception messages. #28791 fixed most cases, but two test suites were missed.

Does this PR introduce any user-facing change?

No

How was this patch tested?

Unit test

  spark.readStream.option("fs.defaultFS", defaultFs).text(path)
}.getMessage
-assert(message == expectMessage)
+assert(message.filterNot(Set(':', '"').contains) == expectMessage)
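The fix normalizes the exception message by dropping the characters that differ between Hadoop versions (Hadoop 2 follows the scheme with a colon, Hadoop 3 quotes it) before comparing. A minimal standalone sketch of the idea, with illustrative message strings rather than the actual suite code:

```scala
object MessageNormalization {
  // Illustrative messages: Hadoop 2 uses "scheme: x", Hadoop 3 uses scheme "x".
  val hadoop2Message = "No FileSystem for scheme: nonexistFs"
  val hadoop3Message = "No FileSystem for scheme \"nonexistFs\""

  // Dropping ':' and '"' yields the same normalized text on both versions.
  def normalize(message: String): String =
    message.filterNot(Set(':', '"').contains)

  def main(args: Array[String]): Unit = {
    val expected = "No FileSystem for scheme nonexistFs"
    assert(normalize(hadoop2Message) == expected)
    assert(normalize(hadoop3Message) == expected)
    println(s"normalized: $expected")
  }
}
```

This way a single expected string works against either Hadoop profile, instead of branching the assertion on the Hadoop version.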
Member

We need to remove : at line 539, too.

dataSource invokePrivate checkAndGlobPathIfNecessary(false, false)
}.getMessage
-assert(message.equals("No FileSystem for scheme: nonexistsFs"))
+val expectMessage = "No FileSystem for scheme nonexistFS"
Member
nonexistFS -> nonexistsFs?

Member Author

@gengliangwang gengliangwang Jun 11, 2020

Well, then I would prefer nonExistingFS... I was trying to keep the naming simple.
Let me change them all since you are asking.

Member

@dongjoon-hyun dongjoon-hyun Jun 11, 2020

Ur, I asked this because this test case fails still.

[info] - Data source options should be propagated in method checkAndGlobPathIfNecessary *** FAILED *** (599 milliseconds)
[info]   "... for scheme nonexist[sFs]" did not equal "... for scheme nonexist[FS]" (DataSourceSuite.scala:146)

Member

Actually, I don't care about the naming here if it passes with -Phadoop-3.2.

Member Author

I see :)

Member

@dongjoon-hyun dongjoon-hyun left a comment

+1, LGTM. Thank you so much for recovering Hadoop 3.2, @gengliangwang .

I tested with the following commands.

build/sbt "sql/testOnly *.FileStreamSourceSuite -- -z SPARK-31935"
build/sbt "sql/testOnly *.FileStreamSourceSuite -- -z SPARK-31935" -Phadoop-3.2
build/sbt "sql/testOnly *.DataSourceSuite -- -z checkAndGlobPathIfNecessary"
build/sbt "sql/testOnly *.DataSourceSuite -- -z checkAndGlobPathIfNecessary" -Phadoop-3.2

Merged to master!

@dongjoon-hyun dongjoon-hyun changed the title [SPARK-31935][SQL][TESTS][FOLLOWUP][test-hadoop3.2] Fix the test case for Hadoop2/3 [SPARK-31935][SQL][TESTS][FOLLOWUP] Fix the test case for Hadoop2/3 Jun 11, 2020
SparkQA commented Jun 11, 2020

Test build #123814 has finished for PR 28796 at commit 58b5159.

  • This patch fails to generate documentation.
  • This patch merges cleanly.
  • This patch adds no public classes.

@gengliangwang
Member Author

^^^ the documentation generation error seems unrelated.

@dongjoon-hyun
Member

Oh..

@dongjoon-hyun
Member

Is it the same at the last commit, e3bb417?

@dongjoon-hyun
Member

dongjoon-hyun commented Jun 11, 2020

[error] /home/jenkins/workspace/SparkPullRequestBuilder@6/core/src/main/scala/org/apache/spark/SSLOptions.scala:71: type Server is not a member of object org.eclipse.jetty.util.ssl.SslContextFactory
[error]       val sslContextFactory = new SslContextFactory.Server()
[error]                                                     ^
[info] No documentation generated with unsuccessful compiler run
[error] one error found

It seems to be a different commit. I'll take a look~

@gengliangwang
Member Author

The latest run is ongoing, and I don't think the error is related: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/123818/

@dongjoon-hyun
Member

Got it. Thanks, @gengliangwang .

gengliangwang added a commit to gengliangwang/spark that referenced this pull request Jun 11, 2020
Closes apache#28796 from gengliangwang/SPARK-31926-followup.

Authored-by: Gengliang Wang <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
SparkQA commented Jun 11, 2020

Test build #123818 has finished for PR 28796 at commit e3bb417.

  • This patch fails due to an unknown error code, -9.
  • This patch merges cleanly.
  • This patch adds no public classes.

SparkQA commented Jun 11, 2020

Test build #123813 has finished for PR 28796 at commit 036864b.

  • This patch fails due to an unknown error code, -9.
  • This patch merges cleanly.
  • This patch adds no public classes.
