Conversation

@MaxGekk (Member) commented Sep 10, 2024

What changes were proposed in this pull request?

In the PR, I propose a new sub-condition `SECONDS_FRACTION` of the error condition `INVALID_DATETIME_PATTERN`, raised when a datetime pattern does not contain a proper seconds-fraction section of variable length.
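Roughly, this is the kind of error the pattern-parsing code can now raise for such patterns. The sketch below is illustrative only: the helper name is made up, and the constructor shown is Spark-internal, so its exact parameters are an assumption rather than part of this PR's diff.

```scala
import org.apache.spark.SparkIllegalArgumentException

// Hypothetical helper (not from the PR): raise the new sub-condition for a
// datetime pattern whose seconds fraction cannot be handled. The "pattern"
// parameter fills the <pattern> placeholder of the error message template.
def invalidSecondsFractionError(pattern: String): SparkIllegalArgumentException = {
  new SparkIllegalArgumentException(
    errorClass = "INVALID_DATETIME_PATTERN.SECONDS_FRACTION",
    messageParameters = Map("pattern" -> pattern))
}
```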

Why are the changes needed?

To fix a failure on an internal assertion and report a proper, user-facing error instead. This change should improve the user experience with Spark SQL.

Before the changes, Spark failed while parsing datetime patterns like `\nSSSS\r`:

```
java.lang.AssertionError: assertion failed
	at scala.Predef$.assert(Predef.scala:264)
	at org.apache.spark.ErrorClassesJsonReader.getMessageTemplate(ErrorClassesJSONReader.scala:91)
	at org.apache.spark.ErrorClassesJsonReader.getErrorMessage(ErrorClassesJSONReader.scala:46)
```
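Such a pattern can reach the failing code path through ordinary datetime functions. A hypothetical reproduction sketch (the entry point and input data are illustrative assumptions; any call that parses the pattern via `DateTimeFormatterHelper` should behave the same way):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.to_timestamp

val spark = SparkSession.builder().master("local[1]").getOrCreate()
import spark.implicits._

// The malformed pattern from the description: control characters around "SSSS".
Seq("2024-09-10 12:34:56.7").toDF("ts")
  .select(to_timestamp($"ts", "\nSSSS\r"))
  .collect()
```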

Does this PR introduce any user-facing change?

It should not, unless user code depends on the exact format of the error message.

After the changes, users get the error:

```
org.apache.spark.SparkIllegalArgumentException: [INVALID_DATETIME_PATTERN.SECONDS_FRACTION] Unrecognized datetime pattern:
. Cannot detect a seconds fraction pattern of variable length. Please make sure the pattern contains 'S', and does not contain illegal characters. SQLSTATE: 22007
```

(The pattern between `pattern:` and the period is the literal `\nSSSS\r`, which is why the control characters make the message wrap.)
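Code that needs to react to this programmatically can match the error by its condition name rather than the message text. A minimal sketch, reusing the session and imports from the reproduction above; the condition string is the only part taken from this PR, and the message-parameter name is assumed:

```scala
import org.apache.spark.SparkThrowable
import org.apache.spark.sql.functions.to_timestamp

try {
  Seq("2024-09-10 12:34:56.7").toDF("ts")
    .select(to_timestamp($"ts", "\nSSSS\r"))
    .collect()
} catch {
  case e: SparkThrowable
      if e.getErrorClass == "INVALID_DATETIME_PATTERN.SECONDS_FRACTION" =>
    // Fall back, log, or rethrow, depending on the application.
    // Assumes the message parameter carrying the bad pattern is named "pattern".
    println(s"Invalid seconds-fraction pattern: ${e.getMessageParameters.get("pattern")}")
}
```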

How was this patch tested?

By running the new test suite:

```
$ build/sbt "test:testOnly *org.apache.spark.sql.catalyst.util.DateTimeFormatterHelperSuite"
```

Was this patch authored or co-authored using generative AI tooling?

No.

@github-actions bot added the SQL label Sep 10, 2024
@MaxGekk MaxGekk changed the title [WIP][SPARK-49583][SQL] Define the error sub-condition SECONDS_FRACTION for invalid seconds fraction pattern [SPARK-49583][SQL] Define the error sub-condition SECONDS_FRACTION for invalid seconds fraction pattern Sep 10, 2024
@MaxGekk MaxGekk marked this pull request as ready for review September 10, 2024 09:50
@MaxGekk (Member, Author) commented Sep 10, 2024

Merging to master. Thank you, @cloud-fan, for the review.

@MaxGekk MaxGekk closed this in 7355864 Sep 10, 2024
MaxGekk added a commit that referenced this pull request Sep 10, 2024
…Class` in `SparkThrowableSuite` and in `DateTimeFormatterHelperSuite`

### What changes were proposed in this pull request?
In the PR, I propose to use `condition` instead of `errorClass` in two test suites:
- SparkThrowableSuite
- DateTimeFormatterHelperSuite

### Why are the changes needed?
Because the changes from PR #48027 conflict with #48058 and #48026, and the tests in #48027 passed before those PRs were merged to master.
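As a rough illustration of the kind of change this follow-up applies in those suites (the test fragment below is made up; only the `errorClass` → `condition` parameter rename of the `checkError` helper is the point):

```scala
// Before this follow-up, assertions in the suites were written as:
//   checkError(
//     exception = e,
//     errorClass = "INVALID_DATETIME_PATTERN.SECONDS_FRACTION",
//     parameters = Map("pattern" -> pattern))
// After it, the same assertion uses the renamed parameter:
checkError(
  exception = e,
  condition = "INVALID_DATETIME_PATTERN.SECONDS_FRACTION",
  parameters = Map("pattern" -> pattern))
```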

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
By compiling and running the following tests locally:
```
$ build/sbt "test:testOnly *org.apache.spark.sql.catalyst.util.DateTimeFormatterHelperSuite"
$ build/sbt "test:testOnly *SparkThrowableSuite"
```

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes #48061 from MaxGekk/fix-missing-errorClass.

Authored-by: Max Gekk <[email protected]>
Signed-off-by: Max Gekk <[email protected]>