Disable test_read_hive_fixed_length_char on Spark 3.4+. #8325
Conversation
Fixes NVIDIA#8321. This commit disables `test_read_hive_fixed_length_char` for Spark 3.4 until NVIDIA#8324 is resolved (i.e. the change in behaviour of `CHAR` columns is addressed).

Signed-off-by: MithunR <[email protected]>
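As a rough sketch of what version-gating a pytest test like this can look like (the version helper and environment variable below are placeholders for illustration, not the project's actual utilities):

```python
import os
import pytest

# Placeholder version check: the real integration-test harness has its own
# Spark-version helpers; this only illustrates the idea of a version-gated skip.
def spark_version():
    ver = os.environ.get("SPARK_VER", "3.3.0")
    return tuple(int(p) for p in ver.split(".")[:2])

@pytest.mark.skipif(spark_version() >= (3, 4),
                    reason="CHAR read-side padding changed on Spark 3.4+ (see #8324)")
def test_read_hive_fixed_length_char():
    ...  # body elided; only the skip condition matters here
```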
So, to be clear: by disabling the test, do we fall back to the CPU here? I want to make sure, since #8324 is marked low priority. If that is the case, should we have a test to verify it falls back? Maybe not required, just asking.
It's not a fallback, exactly. On 3.4, Spark adds a code-gen step to modify the results (i.e. pad the result column out to the required width). That step causes the …

I don't know if it's worth adding a test verifying that …
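For reference, a small PySpark sketch of the read-side padding being described here, an illustration based on Spark 3.4's `spark.sql.readSideCharPadding` default rather than code from this PR:

```python
import os
import tempfile
from pyspark.sql import SparkSession

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

# Put an unpadded value directly into the table's storage, so any padding we
# see on read must come from Spark's read-side step rather than the writer.
path = tempfile.mkdtemp()
with open(os.path.join(path, "part-00000"), "w") as f:
    f.write("ab\n")

spark.sql("CREATE EXTERNAL TABLE char_demo (c CHAR(5)) "
          "ROW FORMAT DELIMITED STORED AS TEXTFILE "
          f"LOCATION '{path}'")

# On Spark 3.4+ (spark.sql.readSideCharPadding=true by default) the plan adds a
# projection that pads c out to the declared width, so this is expected to show
# 'ab   ' with length 5.
spark.sql("SELECT c, length(c) AS len FROM char_demo").show()
```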
OK, the main thing is that it falls back and doesn't fail. I'm fine with skipping the test since we have the issue to track it.
Actually, you've convinced me. I've added an equivalent test on 3.4, allowing for … You're right: it would be best to codify that the GPU results match the CPU results, even when …
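Something along these lines is what such an equivalence test could look like; the helper names (`assert_gpu_and_cpu_are_equal_collect`, `with_cpu_session`, `spark_tmp_table_factory`) reflect my understanding of the project's integration-test harness, so treat this as an assumption-laden sketch rather than the actual diff:

```python
from asserts import assert_gpu_and_cpu_are_equal_collect
from spark_session import with_cpu_session

def test_read_hive_fixed_length_char_padded(spark_tmp_table_factory):
    # (A version gate restricting this to Spark 3.4+ would go here.)
    table = spark_tmp_table_factory.get()

    def setup(spark):
        spark.sql(f"CREATE TABLE {table} (c CHAR(5)) STORED AS TEXTFILE")
        spark.sql(f"INSERT INTO {table} VALUES ('ab')")

    # Build the table once on the CPU, then assert that reading it back on the
    # GPU produces exactly what the CPU produces, padding and all.
    with_cpu_session(setup)
    assert_gpu_and_cpu_are_equal_collect(
        lambda spark: spark.sql(f"SELECT c FROM {table}"))
```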