
adding partition info when we try to explain on partitioned table #527

Open
zhexuany opened this issue Dec 17, 2018 · 0 comments
The table schema is:

CREATE TABLE `trb7` (
  `id` int(11) DEFAULT NULL,
  `name` varchar(50) DEFAULT NULL,
  `purchased` date DEFAULT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin
PARTITION BY RANGE ( year(`purchased`) - 1 ) (
  PARTITION p0 VALUES LESS THAN (1990),
  PARTITION p1 VALUES LESS THAN (1995),
  PARTITION p2 VALUES LESS THAN (2000),
  PARTITION p3 VALUES LESS THAN (2005)
)

When we run explain on a SQL statement such as select count(*) from trb7 where purchased > date'1992-10-10' and purchased < date'1995-10-10', the current output only shows the key range. It would be better to also include the partitions that need to be scanned.

The following is the current explain output:

scala> spark.sql("select count(*) from trb7 where purchased > date'1992-10-10' and purchased < date'1995-10-10'").explain
2018-12-17 16:18:48 WARN  ObjectStore:568 - Failed to get database test, returning NoSuchObjectException
== Physical Plan ==
*(2) HashAggregate(keys=[], functions=[specialsum(count(1)#26L, LongType, 0)])
+- Exchange SinglePartition
   +- *(1) HashAggregate(keys=[], functions=[partial_specialsum(count(1)#26L, LongType, 0)])
      +- TiSpark CoprocessorRDD{[table: trb7] , Columns: [purchased], Residual Filter: Not(IsNull([purchased])), [[purchased] LESS_THAN 1995-10-10T00:00:00.000+08:00], [[purchased] GREATER_THAN 1992-10-10T00:00:00.000+08:00], KeyRange: [[116,128,0,0,0,0,0,0,32,95,114,0,0,0,0,0,0,0,0], [116,128,0,0,0,0,0,0,32,95,115,0,0,0,0,0,0,0,0])[[116,128,0,0,0,0,0,0,33,95,114,0,0,0,0,0,0,0,0], [116,128,0,0,0,0,0,0,33,95,115,0,0,0,0,0,0,0,0]), Aggregates: Count(1)}
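To show how the partition list could be derived for the plan above, here is a minimal, self-contained sketch of RANGE partition pruning. The object and method names are illustrative only, not TiSpark's actual API; it assumes the filter has already been reduced to a closed interval over the partition expression (here year(`purchased`) - 1, so the predicate above yields [1991, 1994]).

```scala
// Hypothetical sketch, not TiSpark's implementation: given the exclusive
// upper bounds from "VALUES LESS THAN (...)", keep every partition whose
// value range [lo, hi) overlaps the filter interval [filterLo, filterHi].
object PartitionPruner {
  def prunedPartitions(names: Seq[String],
                       upperBounds: Seq[Int],
                       filterLo: Int,
                       filterHi: Int): Seq[String] =
    names.indices.collect {
      case i if {
        // Partition i covers [previous bound, upperBounds(i));
        // the first partition is unbounded below.
        val lo = if (i == 0) Int.MinValue else upperBounds(i - 1)
        filterLo < upperBounds(i) && filterHi >= lo
      } => names(i)
    }
}

// For the query in this issue: year(purchased) - 1 ranges over [1991, 1994],
// so only p1 ([1990, 1995)) needs to be scanned.
val toScan = PartitionPruner.prunedPartitions(
  Seq("p0", "p1", "p2", "p3"),
  Seq(1990, 1995, 2000, 2005),
  1991, 1994)
println(toScan.mkString(", "))
```

With this, the explain output could append something like `Partitions: [p1]` next to the key range, making it obvious that p0, p2, and p3 are pruned.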
@zhexuany zhexuany self-assigned this Dec 17, 2018