Conversation

Contributor

@zhengruifeng zhengruifeng commented Sep 26, 2025

What changes were proposed in this pull request?

Remove some newly added tests that fail in non-ANSI mode.

see https://github.com/apache/spark/actions/runs/18025276597/job/51291154866

```
======================================================================
FAIL [1.563s]: test_make_timestamp_ntz (pyspark.sql.tests.test_functions.FunctionsTests.test_make_timestamp_ntz)
Comprehensive test cases for make_timestamp_ntz with various arguments and edge cases.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/test_functions.py", line 999, in test_make_timestamp_ntz
    with self.assertRaises(Exception):
AssertionError: Exception not raised

----------------------------------------------------------------------
```

Why are the changes needed?

To make the non-ANSI scheduled job pass.
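For context: under non-ANSI mode, Spark evaluates an invalid cast (such as string-to-int) to NULL instead of raising, so a test wrapped in `assertRaises(Exception)` sees no exception and fails. A minimal pure-Python sketch of that behavioral difference (no Spark involved; `cast_string_to_int` is a hypothetical stand-in, not Spark's implementation):

```python
def cast_string_to_int(value, ansi_enabled):
    """Emulate (simplified) Spark string-to-int cast semantics."""
    try:
        return int(value)
    except ValueError:
        if ansi_enabled:
            # ANSI mode: invalid casts raise an error
            raise
        # non-ANSI mode: invalid casts silently evaluate to NULL
        return None

# ANSI mode: the cast raises, so an assertRaises-style check passes
try:
    cast_string_to_int("invalid", ansi_enabled=True)
    raised = False
except ValueError:
    raised = True
print(raised)  # True

# non-ANSI mode: the cast returns None, no exception is raised,
# which is exactly why the removed test failed in the non-ANSI job
print(cast_string_to_int("invalid", ansi_enabled=False))  # None
```

This mirrors the CI failure above: the test asserted an exception that only ANSI mode produces.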

Does this PR introduce any user-facing change?

no

How was this patch tested?

Manually checked.

Was this patch authored or co-authored using generative AI tooling?

no

@zhengruifeng
Contributor Author

cc @Yicong-Huang

```
        with self.assertRaises(Exception):
            F.make_timestamp_ntz(date=df_dt.date)

        # Test 17: Invalid data types - should raise exception for invalid string to int cast
```
Contributor


How about changing it to:

```
with self.sql_conf({"spark.sql.ansi.enabled": True}):
    # Test 17: Invalid data types - should raise exception for invalid string to int cast
    with self.assertRaises(Exception):
        self.spark.range(1).select(
            F.make_timestamp_ntz(
                F.lit("invalid"), F.lit(5), F.lit(22), F.lit(10), F.lit(30), F.lit(0)
            )
        ).collect()
....
```

Contributor Author


I feel we don't need to check this from PySpark's side; it is guaranteed by Spark SQL.

@zhengruifeng
Contributor Author

Merged to master.

@zhengruifeng zhengruifeng deleted the remove_ansi_test branch September 26, 2025 12:00
Contributor

@Yicong-Huang Yicong-Huang left a comment


Makes sense!

huangxiaopingRD pushed a commit to huangxiaopingRD/spark that referenced this pull request Nov 25, 2025
Closes apache#52466 from zhengruifeng/remove_ansi_test.

Authored-by: Ruifeng Zheng <[email protected]>
Signed-off-by: Ruifeng Zheng <[email protected]>