[SPARK-31290][R] Add back the deprecated R APIs #28058
Conversation
Test build #120517 has finished for PR 28058 at commit
Do we need to add back
If we have added these back to Scala, adding them to R seems good.
cc @marmbrus
HyukjinKwon left a comment
LGTM except one question https://github.com/apache/spark/pull/28058/files#r399743329
They don’t map 1:1 to Scala. Some R APIs are older and were removed from Scala long before 3.0.
I’m fine with this change if this is what the community has agreed on.
If they have already been removed prior to 3.0 and nobody has said anything, I don't think we should add those back in.
Just to clarify, what the community agreed on is the rubric added at https://spark.apache.org/versioning-policy.html. Several of the APIs are listed there. One last thing: SparkR dropped RDD support, so I don't think it also makes sense to add
@huaxingao, seems like we roughly agreed upon excluding
Test build #120563 has finished for PR 28058 at commit
HyukjinKwon left a comment
@huaxingao, can you also add the reasons why we didn't port:

```r
createExternalTable <- function(x, ...) {
  dispatchFunc("createExternalTable(tableName, path = NULL, source = NULL, ...)", x, ...)
}
```

in the PR description?
Apparently, this was for backward compatibility when SQLContext existed (judging from #9192), but it seems we don't need it anymore since SparkR replaced SQLContext with SparkSession in #13635.
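For context, a re-added deprecated API typically ends up as a thin wrapper that warns and forwards to the current session-based function. A minimal sketch (not the exact code merged in this PR), assuming `createTable` is the current replacement:

```r
# Hedged sketch, not the literal code in this PR.
# Assumes createTable() is the current session-based replacement.
createExternalTable <- function(tableName, path = NULL, source = NULL, schema = NULL, ...) {
  .Deprecated("createTable")  # warn that the old name is deprecated
  createTable(tableName, path = path, source = source, schema = schema, ...)
}
```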
It seems more complicated than I thought. Let me try to clarify the current status here:
Test build #120567 has finished for PR 28058 at commit
I discussed offline with @marmbrus. Looks like we're fine to exclude these four above. Merged to master and branch-3.0.
### What changes were proposed in this pull request?

Add back the deprecated R APIs removed by #22843 and #22815. These APIs are:

- `sparkR.init`
- `sparkRSQL.init`
- `sparkRHive.init`
- `registerTempTable`
- `createExternalTable`
- `dropTempTable`

No need to port the function such as

```r
createExternalTable <- function(x, ...) {
  dispatchFunc("createExternalTable(tableName, path = NULL, source = NULL, ...)", x, ...)
}
```

because this was for backward compatibility when SQLContext existed (judging from #9192), but it seems we don't need it anymore since SparkR replaced SQLContext with SparkSession in #13635.

### Why are the changes needed?

Amend Spark's Semantic Versioning Policy.

### Does this PR introduce any user-facing change?

Yes. The removed R APIs are put back.

### How was this patch tested?

Add back the removed tests.

Closes #28058 from huaxingao/r.

Authored-by: Huaxin Gao <[email protected]>
Signed-off-by: HyukjinKwon <[email protected]>
(cherry picked from commit fd0b228)
Signed-off-by: HyukjinKwon <[email protected]>
Thank you all for the help!
What changes were proposed in this pull request?
Add back the deprecated R APIs removed by #22843 and #22815.
These APIs are:

- `sparkR.init`
- `sparkRSQL.init`
- `sparkRHive.init`
- `registerTempTable`
- `createExternalTable`
- `dropTempTable`

No need to port the function such as
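```r
createExternalTable <- function(x, ...) {
  dispatchFunc("createExternalTable(tableName, path = NULL, source = NULL, ...)", x, ...)
}
```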
because this was for backward compatibility when SQLContext existed (judging from #9192), but it seems we don't need it anymore since SparkR replaced SQLContext with SparkSession in #13635.
Why are the changes needed?
Amend Spark's Semantic Versioning Policy
Does this PR introduce any user-facing change?
Yes
The removed R APIs are put back.
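For illustration, a rough usage sketch showing how the restored APIs might be called (the names and data below are placeholders, not taken from this PR; deprecation warnings are expected):

```r
library(SparkR)

# Start a session; the restored sparkR.init() is kept only for backward
# compatibility and is expected to point users at sparkR.session().
sparkR.session(appName = "deprecated-api-demo")

df <- createDataFrame(faithful)

# Deprecated alias for createOrReplaceTempView()
registerTempTable(df, "faithful_tbl")

head(sql("SELECT waiting, eruptions FROM faithful_tbl LIMIT 5"))

# Deprecated alias for dropTempView()
dropTempTable("faithful_tbl")

sparkR.session.stop()
```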
How was this patch tested?
Add back the removed tests
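As a rough idea of what a restored test could look like (a hedged testthat sketch, not the literal test code in this PR; it assumes an active SparkR session):

```r
# Hedged sketch of a regression test for a re-added deprecated API.
test_that("registerTempTable is back and still works", {
  df <- createDataFrame(data.frame(a = 1:3))

  # The deprecated alias should warn but still register the view.
  expect_warning(registerTempTable(df, "t"))

  result <- collect(sql("SELECT count(*) AS n FROM t"))
  expect_equal(result$n, 3)

  dropTempView("t")
})
```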