[SPARK-19387][SPARKR] Tests do not run with SparkR source package in CRAN check #16720
Changes from 2 commits
````diff
@@ -27,6 +27,9 @@ library(SparkR)
 We use default settings in which it runs in local mode. It auto downloads Spark package in the background if no previous installation is found. For more details about setup, see [Spark Session](#SetupSparkSession).

+```{r, include=FALSE}
+SparkR:::sparkCheckInstall()
+```
 ```{r, message=FALSE, results="hide"}
 sparkR.session()
 ```
````
What I had in mind was to combine the `sparkR.session` and this `sparkCheckInstall` into one function, so it's easy to remember for a new test file. Any thoughts on this?
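As a sketch of the suggestion above, the two calls could be wrapped in a single helper. The function name `sparkR.sessionForTests` is hypothetical, not part of SparkR:

```r
# Hypothetical helper (name is illustrative only, not part of SparkR):
# run the install check, then start the session, so a new test file
# only needs to remember one call.
sparkR.sessionForTests <- function(...) {
  SparkR:::sparkCheckInstall()  # downloads Spark if no installation is found
  sparkR.session(...)
}
```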
I understand that, but as pointed out in #16720 (comment), some tests don't need a SparkSession, and some tests create/stop one as needed; wouldn't having one function that does all of that just mean more complexity?
Sure - that sounds fine. I was looking to see if `testthat` had any support for writing a setup that gets called before each test - doesn't look like it has that.
hmm, we could put it in https://github.com/apache/spark/blob/master/R/pkg/tests/run-all.R?
Ah, that's a great idea - can you see if that works? (Unfortunately it needs manual verification.)
Any luck testing this out?
Sorry, I'm really swamped; haven't had the chance to test that out yet.
I just tested this by putting `SparkR:::checkInstall` in `run-all.R` (before calling `test_package`), and that seems to do the trick on a custom 2.1.0 build!
@felixcheung when you get a chance, can you update the PR with that? The only thing I'm concerned about is calling a private function from `run-all.R` - we could either export this function or move some of this functionality into `install.spark`.
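For reference, the `run-all.R` change discussed above might look like the following sketch. It uses the private `sparkCheckInstall` helper from the vignette diff; exporting the function or folding it into `install.spark` would remove the `:::` access:

```r
# Sketch of R/pkg/tests/run-all.R with the install check run once,
# before the whole suite, instead of repeating it in every test file.
library(testthat)
library(SparkR)

# Download/locate a Spark distribution before any test needs it.
# Note: this calls a private function; see the concern raised above.
SparkR:::sparkCheckInstall()

test_package("SparkR")
```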