5 changes: 3 additions & 2 deletions R/pkg/DESCRIPTION
@@ -2,7 +2,7 @@ Package: SparkR
Type: Package
Title: R Frontend for Apache Spark
Version: 2.0.0
Date: 2016-07-07
Date: 2016-08-27
Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "cre"),
email = "[email protected]"),
person("Xiangrui", "Meng", role = "aut",
@@ -11,7 +11,7 @@ Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "cre"),
email = "[email protected]"),
person(family = "The Apache Software Foundation", role = c("aut", "cph")))
URL: http://www.apache.org/ http://spark.apache.org/
BugReports: https://issues.apache.org/jira/secure/CreateIssueDetails!init.jspa?pid=12315420&components=12325400&issuetype=4
BugReports: http://issues.apache.org/jira/browse/SPARK
Member

Perhaps that's too long, but it opens directly to a form with the component field preset to SparkR - otherwise it seems easy to get lost?
Also, BugReports is supposed to be for bug.report(package = "SparkR").

Contributor Author

Other than being long, that link also doesn't seem to work if you are not logged in (it would be good if you could check this too). The other thing we could do is just link to the wiki page at https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-ContributingBugReports -- do you think that is better?

Member

Ah, I thought I had tested it logged out. You are right, let's scratch that then.

I like the wiki idea - though should it mention when to check with user@spark, when to email dev@spark, and when to open a JIRA?

Contributor Author

I updated the wiki - let me know if it looks better or if you have other suggestions.
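
For reference, the BugReports field discussed above is the URL that utils::bug.report() opens on behalf of users of the installed package; a minimal check from an R session (assuming SparkR is installed):

# Opens the URL listed in the BugReports field of SparkR's DESCRIPTION in a browser.
utils::bug.report(package = "SparkR")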

Depends:
R (>= 3.0),
methods
@@ -39,6 +39,7 @@ Collate:
'deserialize.R'
'functions.R'
'install.R'
'jvm.R'
'mllib.R'
'serialize.R'
'sparkR.R'
4 changes: 4 additions & 0 deletions R/pkg/NAMESPACE
@@ -363,4 +363,8 @@ S3method(structField, jobj)
S3method(structType, jobj)
S3method(structType, structField)

export("sparkR.newJObject")
export("sparkR.callJMethod")
export("sparkR.callJStatic")

export("install.spark")
93 changes: 93 additions & 0 deletions R/pkg/R/jvm.R
@@ -0,0 +1,93 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# Methods to directly access the JVM running the SparkR backend.

#' Call Java Methods
#'
#' Call a Java method in the JVM running the Spark driver. The return
#' values are automatically converted to R objects for simple objects. Other
#' values are returned as "jobj", which are references to objects on the JVM.
#'
#' @param x object to invoke the method on. Should be a "jobj" created by newJObject.
#' @param methodName method name to call.
#' @param ... parameters to pass to the Java method.
#' @return the return value of the Java method. Either returned as an R object
#' if it can be deserialized or returned as a "jobj".
#' @export
#' @seealso \link{sparkR.callJStatic}, \link{sparkR.newJObject}
#' @examples
#' \dontrun{
#' sparkR.session() # Need to have a Spark JVM running before calling newJObject
#' # Create a Java ArrayList and populate it
#' jarray <- sparkR.newJObject("java.util.ArrayList")
#' sparkR.callJMethod(jarray, "add", 42L)
#' sparkR.callJMethod(jarray, "get", 0L) # Will print 42
#' }
#' @note sparkR.callJMethod since 2.0.1
sparkR.callJMethod <- function(x, methodName, ...) {
callJMethod(x, methodName, ...)
}

#' Call Static Java Methods
#'
#' Call a static method in the JVM running the Spark driver. The return
#' value is automatically converted to an R object for simple objects. Other
#' values are returned as "jobj", which are references to objects on the JVM.
#'
#' @param x fully qualified Java class name that contains the static method to invoke.
#' @param methodName name of static method to invoke.
#' @param ... parameters to pass to the Java method.
#' @return the return value of the Java method. Either returned as an R object
#' if it can be deserialized or returned as a "jobj".
#' @export
#' @seealso \link{sparkR.callJMethod}, \link{sparkR.newJObject}
#' @examples
#' \dontrun{
#' sparkR.session() # Need to have a Spark JVM running before calling callJStatic
#' sparkR.callJStatic("java.lang.System", "currentTimeMillis")
#' sparkR.callJStatic("java.lang.System", "getProperty", "java.home")
#' }
#' @note sparkR.callJStatic since 2.0.1
sparkR.callJStatic <- function(x, methodName, ...) {
callJStatic(x, methodName, ...)
}

#' Create Java Objects
#'
#' Create a new Java object in the JVM running the Spark driver. The return
#' value is automatically converted to an R object for simple objects. Other
#' values are returned as a "jobj" which is a reference to an object on JVM.
#'
#' @param x fully qualified Java class name.
#' @param ... arguments to be passed to the constructor.
#' @return the object created. Either returned as an R object
#' if it can be deserialized or returned as a "jobj".
#' @export
#' @seealso \link{sparkR.callJMethod}, \link{sparkR.callJStatic}
#' @examples
#' \dontrun{
#' sparkR.session() # Need to have a Spark JVM running before calling newJObject
#' # Create a Java ArrayList and populate it
#' jarray <- sparkR.newJObject("java.util.ArrayList")
#' sparkR.callJMethod(jarray, "add", 42L)
#' sparkR.callJMethod(jarray, "get", 0L) # Will print 42
#' }
#' @note sparkR.newJObject since 2.0.1
sparkR.newJObject <- function(x, ...) {
newJObject(x, ...)
}
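
Taken together, these three functions are thin public wrappers around the internal newJObject, callJMethod, and callJStatic helpers. As a rough illustration of how they compose from a user script (a sketch only - the HashMap example and the session setup are illustrative, not part of this patch):

library(SparkR)

# Start a SparkR session; the wrappers talk to the JVM backend it launches.
sparkR.session()

# Static method call: a simple return value is deserialized straight to an R object.
now <- sparkR.callJStatic("java.lang.System", "currentTimeMillis")

# Constructing a JVM object returns a "jobj" reference to R.
jmap <- sparkR.newJObject("java.util.HashMap")

# Instance method calls on the jobj; simple return values come back as R objects.
sparkR.callJMethod(jmap, "put", "answer", 42L)
sparkR.callJMethod(jmap, "get", "answer")   # 42
sparkR.callJMethod(jmap, "size")            # 1

sparkR.session.stop()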
43 changes: 43 additions & 0 deletions R/pkg/inst/tests/testthat/test_jvm_api.R
@@ -0,0 +1,43 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

context("JVM API")

sparkSession <- sparkR.session(enableHiveSupport = FALSE)

test_that("Create and call methods on object", {
jarr <- newJObject("java.util.ArrayList")
# Add an element to the array
callJMethod(jarr, "add", 1L)
# Check if get returns the same element
expect_equal(callJMethod(jarr, "get", 0L), 1L)
})

test_that("Call static methods", {
# Convert a boolean to a string
strTrue <- callJStatic("java.lang.String", "valueOf", TRUE)
expect_equal(strTrue, "true")
})

test_that("Manually garbage collect objects", {
jarr <- newJObject("java.util.ArrayList")
cleanup.jobj(jarr)
# Using a jobj after GC should throw an error
expect_error(print(jarr), "Error in invokeJava.*")
})
Member

add sparkR.session.stop()

Contributor Author

Done


sparkR.session.stop()
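
Following up on the review note above, a companion test that exercises the exported wrappers (rather than the internal helpers used in test_jvm_api.R) could look roughly like the sketch below - same testthat style as the file above, not part of this patch:

context("JVM API public wrappers")

sparkSession <- sparkR.session(enableHiveSupport = FALSE)

test_that("Exported wrappers mirror the internal JVM helpers", {
  # Create an ArrayList through the public wrapper and round-trip a value.
  jarr <- sparkR.newJObject("java.util.ArrayList")
  sparkR.callJMethod(jarr, "add", 1L)
  expect_equal(sparkR.callJMethod(jarr, "get", 0L), 1L)
  # Static call through the public wrapper.
  expect_equal(sparkR.callJStatic("java.lang.String", "valueOf", TRUE), "true")
})

sparkR.session.stop()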