[SPARK-9763][SQL] Minimize exposure of internal SQL classes. #8056
Conversation
Test build #40259 has finished for PR 8056 at commit
cc @zsxwing for review.
@rxin could you help remove
Test build #40263 has finished for PR 8056 at commit
Although the SQL UI only displays the execution info now, moving the
@zsxwing removed the VisibleForTesting tag. We can move the UI package around later -- it's also OK if it contains non-execution stuff. Not that big of a deal...
Test build #40275 has finished for PR 8056 at commit
Test build #40280 has finished for PR 8056 at commit
The failure is because ParquetIOSuite has a hard-coded name.
Is it intentional to keep
Test build #40283 has finished for PR 8056 at commit
Test build #1419 has finished for PR 8056 at commit
Not intentional -- but not that big of a deal for them to be there, since I care more about the public API visibility here.
Test build #40306 has finished for PR 8056 at commit
OK, I'm going to merge this since the failure is from a flaky test.
cc @liancheng for the change also. |
There are a few changes in this pull request:

1. Moved all data sources to execution.datasources, except the public JDBC APIs.
2. In order to maintain backward compatibility from 1, added a backward compatibility translation map in data source resolution.
3. Moved ui and metric package into execution.
4. Added more documentation on some internal classes.
5. Renamed DataSourceRegister.format -> shortName.
6. Added "override" modifier on shortName.
7. Removed IntSQLMetric.

Author: Reynold Xin <[email protected]>

Closes #8056 from rxin/SPARK-9763 and squashes the following commits:

9df4801 [Reynold Xin] Removed hardcoded name in test cases.
d9babc6 [Reynold Xin] Shorten.
e484419 [Reynold Xin] Removed VisibleForTesting.
171b812 [Reynold Xin] MimaExcludes.
2041389 [Reynold Xin] Compile ...
79dda42 [Reynold Xin] Compile.
0818ba3 [Reynold Xin] Removed IntSQLMetric.
c46884f [Reynold Xin] Two more fixes.
f9aa88d [Reynold Xin] [SPARK-9763][SQL] Minimize exposure of internal SQL classes.

(cherry picked from commit 40ed2af)
Signed-off-by: Reynold Xin <[email protected]>
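Change 2 above (the backward-compatibility translation map) can be sketched roughly as follows. This is an illustrative assumption of how such a map works, not a copy of the actual Spark source; the object name and the specific entries are hypothetical.

```scala
// Hypothetical sketch: map old (pre-move) data source provider class names
// to their new locations under execution.datasources, so user code that
// still references the old names keeps resolving after the package move.
object DataSourceResolution {
  private val backwardCompatibilityMap: Map[String, String] = Map(
    "org.apache.spark.sql.jdbc.DefaultSource" ->
      "org.apache.spark.sql.execution.datasources.jdbc.DefaultSource",
    "org.apache.spark.sql.json.DefaultSource" ->
      "org.apache.spark.sql.execution.datasources.json.DefaultSource",
    "org.apache.spark.sql.parquet.DefaultSource" ->
      "org.apache.spark.sql.execution.datasources.parquet.DefaultSource"
  )

  // Translate an old provider name to its new one; pass through anything else.
  def lookupDataSource(provider: String): String =
    backwardCompatibilityMap.getOrElse(provider, provider)
}
```

With a map like this, resolution first rewrites legacy names and only then attempts to load the class, so the package move stays invisible to callers.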
The jdbcutils Scala code has a typo in the schemaString function: the decimal type gets extra closing braces `}`, which causes CREATE TABLE with a decimal column to fail. The create statement sent to the database looks like this: CREATE TABLE foo (TKT_GID DECIMAL(10},0}) NOT NULL). Below is the code: def schemaString(df: DataFrame, url: String): String = {
@rama-mullapudi thanks. can you submit a patch to fix those? I think I only moved the stuff around. |
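For reference, the reported bug is consistent with stray `}` characters inside a string interpolation. A minimal standalone sketch of the fix (plain Ints for precision and scale, no Spark dependency; the function name here is hypothetical):

```scala
// Sketch: build the DDL type string for a decimal column.
// The buggy form was s"DECIMAL($precision},$scale})", where the literal '}'
// after each interpolated value yields "DECIMAL(10},0})" -- invalid SQL.
def decimalDdl(precision: Int, scale: Int): String =
  s"DECIMAL($precision,$scale)"
```

Dropping the stray braces makes `decimalDdl(10, 0)` produce `DECIMAL(10,0)`, which databases accept.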
val dataSchema =
  StructType(schema.filterNot(f => partitionColumns.contains(f.name))).asNullable
@rxin Following the discussion on SPARK-9763, I am actually wondering why we convert the StructType with "asNullable", which sets all the contained StructFields to be nullable. This will cause problems when one StructField is not allowed to be nullable, but the HadoopFsRelationProvider automatically sets it to be nullable. Is it because all the fields in HadoopFsRelationProvider have to be nullable? Thanks!
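To illustrate the behavior being questioned above, here is a standalone model of what `asNullable` does to a schema's top-level fields (an assumption-laden sketch, not Spark's actual StructType, which also recurses into nested array, map, and struct types):

```scala
// Minimal model of a schema whose asNullable rewrites every field as
// nullable -- which is exactly why a non-nullable field loses its
// constraint once the relation provider applies it.
case class Field(name: String, dataType: String, nullable: Boolean)

case class Schema(fields: Seq[Field]) {
  def asNullable: Schema = Schema(fields.map(_.copy(nullable = true)))
}
```

Example: a schema built with `Field("id", "int", nullable = false)` comes back from `asNullable` with that field marked nullable, regardless of the original constraint.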