
Conversation

@srowen (Member) commented Oct 24, 2018

What changes were proposed in this pull request?

Remove SQLContext methods deprecated in 1.4

How was this patch tested?

Existing tests.

@gatorsmile (Member) left a comment

LGTM

cc @rxin

@rxin (Contributor) commented Oct 24, 2018

LGTM.

On a related note, we should probably deprecate the entire SQLContext.

@SparkQA commented Oct 24, 2018

Test build #97981 has finished for PR 22815 at commit c42212e.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

#' path <- "path/to/file.json"
#' df <- read.json(path)
#' df <- read.json(path, multiLine = TRUE)
#' df <- jsonFile(path)
@srowen (Member, Author):

@felixcheung maybe you can check my work here. Is this the right amount of stuff to delete from SparkR when removing these deprecated SQLContext methods?

Member reply:

Yes, that's part of it; I want to remove other deprecated methods too.
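For context, a minimal SparkR sketch of what the remaining, non-deprecated call path looks like after the removal. The path and app name are placeholders, it assumes a local Spark installation, and it is not part of the patch:

```r
library(SparkR)

# A SparkSession replaces the old SQLContext-based entry points.
sparkR.session(appName = "read-json-example")

# Current API: read JSON directly into a SparkDataFrame.
# The jsonFile() line removed from the example above was the deprecated,
# SQLContext-era spelling of the same operation.
path <- "path/to/file.json"   # placeholder path
df <- read.json(path)
df <- read.json(path, multiLine = TRUE)

head(df)

sparkR.session.stop()
```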

@SparkQA commented Oct 25, 2018

Test build #97988 has finished for PR 22815 at commit 98ef77e.

  • This patch fails SparkR unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@HyukjinKwon (Member):

BTW, should we update the migration guide too?


#' Loads a Parquet file, returning the result as a SparkDataFrame.
#'
#' @param path path of file to read. A vector of multiple paths is allowed.
#' @param ... additional external data source specific named properties.
@srowen (Member, Author):

@felixcheung I got this CRAN doc error on the last run:

* checking for code/documentation mismatches ... OK
* checking Rd \usage sections ... WARNING
Undocumented arguments in documentation object 'read.parquet'
  '...'

I guess this is what it wants, but I'm not sure why it didn't come up before.
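For reference, the `checking Rd \usage sections` step compares every argument in a function's usage to its documented `@param` tags, so `...` needs its own entry, as in the hunk above. A hypothetical, minimal roxygen sketch of the pattern (not Spark code):

```r
# Hypothetical example, not the actual SparkR source: an exported function
# whose signature includes `...` must document it, otherwise R CMD check warns
# "Undocumented arguments in documentation object ... '...'".

#' Read a file with extra reader options.
#'
#' @param path path of the file to read.
#' @param ... additional named options passed on to the underlying reader.
#' @export
read_with_options <- function(path, ...) {
  opts <- list(...)
  message("Reading ", path, " with ", length(opts), " extra option(s)")
  invisible(opts)
}
```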

@HyukjinKwon (Member) commented Oct 26, 2018

Sean, actually I opened a PR yesterday against your branch: srowen#2

@srowen (Member, Author):

Oops, I missed that, sorry. I'll incorporate both changes.

* @groupname Ungrouped Support functions for language integrated queries
* @since 1.0.0
*/
@deprecated("Use SparkSession instead", "3.0.0")
@srowen (Member, Author):

Adding a deprecation notice per @rxin

@srowen (Member, Author):

One more very late question for 2.4: @rxin, is it even better to deprecate this right now in 2.4.0? And in Python and R?

Contributor reply:

Yeah, I wouldn't deprecate it now; the data source API v1 still depends on it.

Actually, now that I think about it, we should not deprecate SQLContext until DSv2 is stable. Otherwise we would have a stable API (DSv1) depending on a deprecated API.

@felixcheung (Member) commented Oct 25, 2018 via email

@SparkQA commented Oct 25, 2018

Test build #98025 has finished for PR 22815 at commit 92dda99.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Oct 26, 2018

Test build #98065 has finished for PR 22815 at commit 8199362.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Oct 26, 2018

Test build #98100 has finished for PR 22815 at commit 10e403a.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@srowen (Member, Author) commented Oct 26, 2018

Merged to master

@asfgit closed this in ca545f7 on Oct 26, 2018
@srowen deleted the SPARK-25821 branch on October 27, 2018
jackylee-ch pushed a commit to jackylee-ch/spark that referenced this pull request Feb 18, 2019
## What changes were proposed in this pull request?

Remove SQLContext methods deprecated in 1.4

## How was this patch tested?

Existing tests.

Closes apache#22815 from srowen/SPARK-25821.

Authored-by: Sean Owen <[email protected]>
Signed-off-by: Sean Owen <[email protected]>
HyukjinKwon pushed a commit that referenced this pull request Apr 1, 2020
### What changes were proposed in this pull request?
Add back the deprecated R APIs removed by #22843 and #22815.

These APIs are

- `sparkR.init`
- `sparkRSQL.init`
- `sparkRHive.init`
- `registerTempTable`
- `createExternalTable`
- `dropTempTable`

There is no need to port back wrappers such as
```r
createExternalTable <- function(x, ...) {
  dispatchFunc("createExternalTable(tableName, path = NULL, source = NULL, ...)", x, ...)
}
```
because this dispatch existed only for backward compatibility with the old SQLContext-first call signature (apparently introduced in #9192), and it seems we no longer need it since SparkR replaced SQLContext with SparkSession in #13635 (see the sketch after this commit message).

### Why are the changes needed?
Amend Spark's Semantic Versioning Policy

### Does this PR introduce any user-facing change?
Yes
The removed R APIs are put back.

### How was this patch tested?
Add back the removed tests

Closes #28058 from huaxingao/r.

Authored-by: Huaxin Gao <[email protected]>
Signed-off-by: HyukjinKwon <[email protected]>
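To make the compatibility note in the commit message concrete, here is a hypothetical, self-contained sketch of the dispatch pattern those wrappers implemented; it does not reproduce SparkR's actual dispatchFunc. Older callers passed a SQLContext handle as the first argument, newer callers start with the table name, and the shim bridged the two signatures:

```r
# Hypothetical sketch only; the function name and the checks are illustrative,
# not SparkR internals.
create_table_compat <- function(x, ...) {
  if (is.environment(x) || inherits(x, "jobj")) {
    # Legacy signature: create_table_compat(sqlContext, tableName, path, ...)
    warning("Passing a SQLContext as the first argument is deprecated; ",
            "call create_table_compat(tableName, ...) after sparkR.session().")
    args <- list(...)
    tableName <- args[[1]]
    rest <- args[-1]
  } else {
    # Current signature: create_table_compat(tableName, path, ...)
    tableName <- x
    rest <- list(...)
  }
  message("Would create table '", tableName, "' with ", length(rest), " extra argument(s)")
  invisible(tableName)
}

# Both call styles resolve to the same table name:
create_table_compat("people", path = "people.parquet")             # current style
create_table_compat(new.env(), "people", path = "people.parquet")  # legacy style (warns)
```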
HyukjinKwon pushed a commit that referenced this pull request Apr 1, 2020
(cherry picked from commit fd0b228)
Signed-off-by: HyukjinKwon <[email protected]>
sjincho pushed a commit to sjincho/spark that referenced this pull request Apr 15, 2020
