13 changes: 8 additions & 5 deletions regtests/t_pyspark/src/test_spark_sql_s3_with_privileges.py
@@ -1203,23 +1203,26 @@ def test_spark_credentials_s3_exception_on_metadata_file_deletion(root_client, s
    assert metadata_contents['ContentLength'] > 0

    # Delete metadata files
+   objects_to_delete = [{'Key': obj['Key']} for obj in objects['Contents']]
    s3.delete_objects(Bucket=test_bucket,
-                     Delete={'Objects': objects})
+                     Delete={'Objects': objects_to_delete})

    try:
      response = snowman_catalog_client.load_table(snowflake_catalog.name, unquote('db1%1Fschema'),
                                                   "iceberg_table",
                                                   "vended-credentials")
    except Exception as e:
-     assert '404' in str(e)
+     # 400 error(BadRequest) is thrown when metadata file is missing
+     assert '400' in str(e)
Contributor:

I see, 400 sounds more reasonable than 404, considering the IRC spec says:

400:

Indicates a bad request error. It could be caused by an unexpected request body format or other forms of request validation failure, such as invalid json. Usually serves application/json content, although in some cases simple text/plain content might be returned by the server's middleware.

404:

Not Found - NoSuchTableException, table to load does not exist

Though in this case the table does exist but its metadata path was deleted, which to the best of my understanding qualifies as a 400.

Contributor:

I think we should just make the test pass; we can always change the response code later and update the test again.

Contributor (Author):

Oh, I think @singhpk234 is actually justified here: 400 makes more sense than 404.



  with IcebergSparkSession(credentials=f'{snowman.principal.client_id}:{snowman.credentials.client_secret.get_secret_value()}',
                           catalog_name=snowflake_catalog.name,
                           polaris_url=polaris_catalog_url) as spark:
-   spark.sql(f'USE {snowflake_catalog.name}')
-   spark.sql('USE db1.schema')
-   spark.sql('DROP TABLE iceberg_table PURGE')
+   # Spark drop table triggers load table underneath, which fails due to missing metadata file.
+   # Directly call drop_table api to drop the table entity.
+   snowman_catalog_client.drop_table(snowflake_catalog.name, unquote('db1%1Fschema'),
+                                     "iceberg_table")
    spark.sql(f'USE {snowflake_catalog.name}')
    spark.sql('DROP NAMESPACE db1.schema')
    spark.sql('DROP NAMESPACE db1')
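
For reference, a self-contained sketch of the boto3 pattern the first hunk fixes: delete_objects expects Delete={'Objects': [{'Key': ...}, ...]}, not the raw list_objects response dict, so the listing has to be reshaped into a list of key dicts first. The bucket and prefix names below are hypothetical placeholders, not from the test.

import boto3

s3 = boto3.client('s3')

# List the metadata files; 'example-bucket' and 'metadata/' are placeholders.
objects = s3.list_objects_v2(Bucket='example-bucket', Prefix='metadata/')

# Reshape the listing into the [{'Key': ...}] form delete_objects requires;
# passing the raw response dict (as the old test did) fails parameter
# validation. delete_objects also accepts at most 1000 keys per call.
if 'Contents' in objects:
    objects_to_delete = [{'Key': obj['Key']} for obj in objects['Contents']]
    s3.delete_objects(Bucket='example-bucket',
                      Delete={'Objects': objects_to_delete})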