Conversation

@felixcheung (Member) commented Jan 27, 2017

What changes were proposed in this pull request?

  • this is caused by the changes in SPARK-18444 and SPARK-18643: we no longer install Spark when `master = ""` (the default). It is also related to SPARK-18449, since the real `master` value is not known at the time the R code in `sparkR.session` is run (`master` cannot default to "local" because it could be overridden by the spark-submit command line or Spark config)
  • as a result, running SparkR as a package in an IDE works fine, but the CRAN check does not, because it launches SparkR via a non-interactive script
  • the fix is to add a check to the beginning of each test file and of the vignettes (see the sketch after this list); the same would also work by changing `sparkR.session()` to `sparkR.session(master = "local")` in tests, but I think being more explicit is better
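
For concreteness, the per-file check looks roughly like the following at the top of a test file. This is a minimal sketch mirroring the diff reviewed below; the context name and the trailing session call are illustrative, not the exact contents of every test file.

```r
context("functions in utils.R")

# Ensure Spark is installed, downloading it first if SPARK_HOME is not set
# (as is the case when R CMD check --as-cran launches the tests)
sparkCheckInstall()

sparkR.session()
```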

How was this patch tested?

Tested this by reverting the version to 2.1, since the check needs to download a release jar with a matching version. But since there are changes in 2.2 (specifically around SparkR ML) that are incompatible with 2.1, some tests fail in this configuration. This will need to be ported to branch-2.1 and retested with the 2.1 release jar.

manually as:

```
# modify DESCRIPTION to revert version to 2.1.0
SPARK_HOME=/usr/spark R CMD build pkg
# run cran check without SPARK_HOME
R CMD check --as-cran SparkR_2.1.0.tar.gz
```

@felixcheung (Member Author)

@shivaram

@shivaram (Contributor)

Hmm - another fix could be that in the test cases, whenever we create a Spark session, we always pass in `master = "local"`?

@felixcheung (Member Author) commented Jan 27, 2017

yes, that is described in the PR description:
"
fix is to add check to the beginning of each test and vignettes; the same would also work by changing sparkR.session() to sparkR.session(master = "local") in tests, but I think being more explicit is better.
"

And besides, it is conceivable that these tests could be set up to run on a cluster, in which case we could expect SPARK_HOME to be set but master not to be local (with this PR that would still work properly).

@felixcheung (Member Author) commented Jan 27, 2017

I think a better approach would be the one being taken in PR 16330 - it has a first-run test that prepares and runs this kind of global setup: https://github.com/apache/spark/pull/16330/files#diff-5ff1ba5d1751f3b1cc96a567e9ab25ff
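
For illustration only, a "first run" test of that kind could look like the sketch below. The file name, context, and test name here are made up (not taken from PR 16330), and, as noted further down in this thread, the whole idea relies on the order in which test files are discovered.

```r
# Hypothetical first test file, e.g. test_000_setup.R, intended to run
# before all other test files by virtue of its name.
context("global setup")

test_that("Spark is installed before any other test file runs", {
  # Download the Spark distribution if SPARK_HOME is not set
  # (e.g. under R CMD check --as-cran)
  sparkCheckInstall()
  expect_true(TRUE)
})
```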

@SparkQA commented Jan 27, 2017

Test build #72082 has finished for PR 16720 at commit 318ecc8.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@shivaram (Contributor)

I am not sure tests are ever meant to run on a cluster (see the number of uses of LocalSparkContext in core/src/test/scala) -- the main reason I don't want to introduce the 'first test' approach is that we are then relying too much on test names not clashing / getting in front of each other, which seems fragile.

The other thing that might be good is to create a test util function like `initializeTestSparkContext`, and inside that we put both the session start and the install check.
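
A minimal sketch of such a helper, using the name suggested above. This is not an actual implementation from the PR; it simply chains the two calls discussed in this thread and assumes the internal `sparkCheckInstall` can be called with no arguments, as in the diff below.

```r
initializeTestSparkContext <- function(...) {
  # Make sure the Spark distribution is available before starting a session
  sparkCheckInstall()
  # Start (or reuse) the SparkSession for this test file
  sparkR.session(...)
}

# Usage at the top of a test file:
# sparkSession <- initializeTestSparkContext(enableHiveSupport = FALSE)
```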

@felixcheung (Member Author) commented Jan 28, 2017

Sure, I've simplified it.

Good point on the ordering - digging into it, it looks like it's just file-system search order, which really is not reliable.

We could certainly add a test util - though some tests are different; for example, test_context.R doesn't need a SparkSession.

@SparkQA commented Jan 28, 2017

Test build #72103 has finished for PR 16720 at commit f51f504.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

We use default settings in which it runs in local mode. It auto downloads the Spark package in the background if no previous installation is found. For more details about setup, see [Spark Session](#SetupSparkSession).

```{r, include=FALSE}
SparkR:::sparkCheckInstall()
```
Contributor

Is it ok to include a `:::` function in the vignette?

Member Author

this has `include=FALSE`, so it will run but the code and output will not be included in the vignette text

Contributor

Is the Rmd file a part of the install that users see? I just don't want to put in any code that people might copy-paste, etc. Is it not good enough to pass in `master = "local[*]"` here?

Contributor

FWIW, these vignette changes are still needed even if we update run-all.R.

context("functions in utils.R")

# Ensure Spark is installed
sparkCheckInstall()
Contributor

What I had in mind was to combine the `sparkR.session` and this `sparkCheckInstall` into one function so it's easy to remember for a new test file. Any thoughts on this?

@felixcheung (Member Author) commented Feb 1, 2017

I understand that, but as pointed out in #16720 (comment), some tests don't need a SparkSession, and some tests will create/stop one as needed; having a function that does all of that would just mean more complexity?

Contributor

Sure - that sounds fine. I was looking to see if testthat had any support for writing a setup that gets called before each test - doesn't look like it has that.

Member Author

Contributor

Ah, that's a great idea - can you see if that works (unfortunately it needs manual verification)?

Contributor

Any luck testing this out?

Member Author

sorry, I"m really swamped, haven't had the chance to test that out yet

Contributor

I just tested this by putting `SparkR:::sparkCheckInstall` in run-all.R (before calling `test_package`) and that seems to do the trick on a custom 2.1.0 build!

@felixcheung when you get a chance, can you update the PR with that? The only thing that I'm concerned about is calling a private function from run-all.R - we could either export this function or move some of this functionality into `install.spark`.
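
Roughly, that run-all.R change amounts to something like the sketch below; this is not the exact file contents, and SparkR's actual run-all.R may set additional options.

```r
# R/pkg/tests/run-all.R (sketch)
library(testthat)
library(SparkR)

# Install the Spark distribution if SPARK_HOME is not set, so that
# R CMD check --as-cran works before any test file runs
SparkR:::sparkCheckInstall()

test_package("SparkR")
```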

@felixcheung (Member Author) commented Feb 8, 2017 via email

@felixcheung (Member Author)

found another issue, opened SPARK-19568

@SparkQA commented Feb 13, 2017

Test build #72793 has finished for PR 16720 at commit cd1394a.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@felixcheung felixcheung reopened this Feb 14, 2017
@shivaram (Contributor)

LGTM. I patched this again on top of 2.1.0 and R CMD check --as-cran passes now. Merging this to master and branch-2.1

@asfgit asfgit closed this in a3626ca Feb 14, 2017
asfgit pushed a commit that referenced this pull request Feb 14, 2017
…CRAN check


Author: Felix Cheung <[email protected]>

Closes #16720 from felixcheung/rcranchecktest.

(cherry picked from commit a3626ca)
Signed-off-by: Shivaram Venkataraman <[email protected]>
cmonkey pushed a commit to cmonkey/spark that referenced this pull request Feb 15, 2017
…CRAN check


Author: Felix Cheung <[email protected]>

Closes apache#16720 from felixcheung/rcranchecktest.