
Conversation

@yaooqinn
Member

What changes were proposed in this pull request?

Fix the outdated and incorrect description of CalendarIntervalType.

Why are the changes needed?

API doc correctness.

Does this PR introduce any user-facing change?

no

How was this patch tested?

N/A, documentation-only change.

@yaooqinn
Member Author

cc @cloud-fan, please take a look at this; I believe it is trivial but important.

@SparkQA

SparkQA commented Nov 25, 2019

Test build #114383 has finished for PR 26659 at commit 5c2f5e3.

  • This patch fails to generate documentation.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Nov 25, 2019

Test build #114397 has finished for PR 26659 at commit ac613bb.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Member

@srowen srowen left a comment


I think ColumnVector has a similar comment that needs updating.
CC @MaxGekk

@cloud-fan
Contributor

ColumnVector.getInterval was already updated when we added the days field. I think this is the only one left.
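
(Not part of this PR, just for context.) A rough Scala sketch of the three components as they appear in `org.apache.spark.unsafe.types.CalendarInterval`; the `(months, days, microseconds)` constructor signature is assumed from current master, and the literal values are illustrative only:

```scala
import org.apache.spark.unsafe.types.CalendarInterval

// The three components are stored separately:
//   months       -> Int
//   days         -> Int
//   microseconds -> Long
// The values below are arbitrary and only for illustration.
val interval = new CalendarInterval(
  14,                   // months: 1 year 2 months
  3,                    // days
  4 * 3600L * 1000000L  // microseconds: 4 hours
)
println(interval)       // toString renders the components in a readable form
```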

Member

@MaxGekk MaxGekk left a comment


Should we say what the range of valid values is? DateType and TimestampType have min and max valid points. Can the INTERVAL type represent an interval greater than max minus min timestamp?
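
(A side note, not an answer from the PR.) The representable range per component follows from the field types, assuming Int for months and days and Long for microseconds on current master. A tiny sketch:

```scala
import org.apache.spark.unsafe.types.CalendarInterval

// The representable range per component is bounded only by the field types;
// this is an illustration, not a proposal for the doc wording.
val extreme = new CalendarInterval(Int.MaxValue, Int.MaxValue, Long.MaxValue)
```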

import org.apache.spark.unsafe.types.CalendarInterval

/**
* The data type representing calendar time intervals. The calendar time interval is stored
Member


representing -> represents? @srowen Does "representing" sound good to you?

Member


representing sounds right. "[This class is] The data type representing ..."

  * The data type representing calendar time intervals. The calendar time interval is stored
- * internally in two components: number of months the number of microseconds.
+ * internally in three components:
+ *   the number of months
Member


I think we should say explicitly that the numbers can be positive as well as negative
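
For illustration only (not part of the diff), each component can independently carry a sign, again assuming the `(months, days, microseconds)` constructor from current master:

```scala
import org.apache.spark.unsafe.types.CalendarInterval

// Mixed signs are allowed, e.g. "1 month minus 10 days";
// the values are illustrative only.
val mixedSign = new CalendarInterval(1, -10, 0L)
```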

import org.apache.spark.unsafe.types.CalendarInterval

/**
* The data type representing calendar time intervals. The calendar time interval is stored
Member


`calendar time intervals`: this is arguable, since it can represent date intervals too.

@SparkQA

SparkQA commented Nov 25, 2019

Test build #114413 has finished for PR 26659 at commit 6a95931.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@cloud-fan cloud-fan closed this in 8b0121b Nov 26, 2019
@cloud-fan
Contributor

thanks, merging to master!

cloud-fan pushed a commit that referenced this pull request Feb 22, 2021
…lity

### What changes were proposed in this pull request?
In this PR, I propose to partially revert #26659 with respect to the comparability of interval values. The comment became incorrect after #27262.

### Why are the changes needed?
The comment is incorrect, and it might confuse Spark's devs/users.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
By checking the Scala coding style with `./dev/scalastyle`.

Closes #31610 from MaxGekk/doc-interval-not-comparable.

Authored-by: Max Gekk <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
cloud-fan pushed a commit that referenced this pull request Feb 22, 2021
…lity
(cherry picked from commit 7df4fed; same commit message as above)

a0x8o added a commit to a0x8o/spark that referenced this pull request Feb 22, 2021
…lity
(same commit message as above)
