[SPARK-20197][SPARKR] CRAN check fail with package installation#17516
felixcheung wants to merge 2 commits into apache:master
Conversation
Test build #75469 has finished for PR 17516 at commit
Unlike in Jenkins, R CMD check appears to run each test R file with the current directory set to that file's own directory (e.g. spark/R/pkg/inst/tests/testthat/ for test_sparkSQL.R). When running as R CMD check, SPARK_HOME = ~/.cache/spark/spark*, whereas spark-warehouse and metastore_db are actually created under R/inst/tests/testthat (where the actual test .R files are). Unless there is a good way to run all tests the exact same way as R CMD check, we will need to revisit what we are checking here.
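The mismatch described above can be sketched in shell form. This is a hedged illustration only: the directory layout is a stand-in for a real Spark checkout, and the SPARK_HOME path is an illustrative placeholder, not the actual installed version.

```shell
# Hedged sketch of the issue: under R CMD check each test file runs with
# its own directory as the working directory, so artifacts created via
# relative paths (spark-warehouse, metastore_db) land next to the test
# sources rather than under SPARK_HOME. All paths here are illustrative.
demo="$(mktemp -d)"
mkdir -p "$demo/R/pkg/inst/tests/testthat"
cd "$demo/R/pkg/inst/tests/testthat"

# A test that writes "spark-warehouse" with a relative path puts it here:
mkdir -p spark-warehouse
echo "created: $PWD/spark-warehouse"

# ...which is not under the SPARK_HOME that the Spark install set up:
SPARK_HOME="$HOME/.cache/spark/spark-x.y.z"   # illustrative placeholder
echo "SPARK_HOME: $SPARK_HOME"
```

This is why a check that looks for spark-warehouse under SPARK_HOME passes in Jenkins but not under R CMD check: the file exists, just in a different place.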
Looking at this now.
So does the current patch pass
Test build #75589 has finished for PR 17516 at commit
Don't we also need the skip-if-CRAN statement?
The test actually passes; it is just not looking in the right place. Skip-if-CRAN is not applicable in master, since we are running the tests as CRAN; if I add skip-if-CRAN it will just disable this test completely.
This is why I originally opted to update the test instead of using skip-if-CRAN in master.
But since we might be cutting an RC at any time, I want to fix the blocker first and get back to fixing the test (which doesn't fail) when I get the chance.
I will merge this today unless you have a concern?
The test passes even when we run the
There are two parts to the branch-2.1 fix. First, the reason why the test failed was because … Second, even after the change, while testing it, I found that … And so the attempt here in this PR to fix this for real in master. Since we are rolling our RC any time, I don't want to delay the first fix (install.spark) only to sort out the 2nd part, which could come a bit later. If you feel that's safer, we could also add
Got it. LGTM. Thanks for the explanation. I'm fine with merging this to master!
Thanks. I find it rather odd, but probably by design, that the current directory is different when running
merged to master |
What changes were proposed in this pull request?
The test failed because SPARK_HOME was not set before Spark was installed.
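The guard implied by the fix can be sketched in shell form. This is a hedged illustration under stated assumptions: in SparkR itself the fix goes through install.spark() and SPARK_HOME is derived from the install location; the path below is an illustrative placeholder, not a real version.

```shell
# Hedged sketch: ensure SPARK_HOME is set (and points at an installed
# Spark) before the tests run, rather than assuming it already exists.
if [ -z "${SPARK_HOME:-}" ]; then
  # In the real fix, install.spark() downloads Spark first and the
  # install location becomes SPARK_HOME; here we only simulate that.
  export SPARK_HOME="$HOME/.cache/spark/spark-x.y.z"   # illustrative
fi
echo "SPARK_HOME is: $SPARK_HOME"
```

With this ordering, any test that resolves paths relative to SPARK_HOME sees a consistent value whether it runs under Jenkins or under R CMD check.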