@@ -244,6 +244,10 @@ private boolean requiresRewrite(Filter filter, Schema schema, Set<Integer> ident

  @Override
  public void deleteWhere(Filter[] filters) {
    Preconditions.checkArgument(
        snapshotId == null,
        "Cannot delete from table at a specific snapshot: %s", snapshotId);

Comment on lines +247 to +250
Contributor:
Can you please remove this same check from canDeleteWhere above? That way, the check is present in only one place, the place where it is actually called.
IIUC, Spark 3.0 doesn't actually call SparkTable#canDeleteWhere (Spark 3.1 does); we have canDeleteWhere in the code here because it was added when we had a single code base for both Spark 3.0 and 3.1.

Contributor:
I don't think the implementation of canDeleteWhere matters much for Spark 3.0. We could simply remove the function since it is unused, but it doesn't seem worth letting the code drift, either by removing it or by modifying it to keep this check in just one place. So I guess I disagree with this suggestion. Let's just leave it as is.

    Expression deleteExpr = SparkFilters.convert(filters);

    if (deleteExpr == Expressions.alwaysFalse()) {
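
For context on the guard itself: Guava's Preconditions.checkArgument throws an IllegalArgumentException with the formatted message when the condition is false, so a DELETE issued against a table that was loaded at a specific snapshot fails fast. Below is a minimal standalone sketch of that behavior; the class and field names are invented for illustration and are not part of the PR.

```java
import com.google.common.base.Preconditions;

class SnapshotGuardDemo {
  // Stand-in for the snapshotId field: non-null when the table was loaded at a snapshot.
  private final Long snapshotId;

  SnapshotGuardDemo(Long snapshotId) {
    this.snapshotId = snapshotId;
  }

  void deleteWhere() {
    // Same guard as in the diff above: reject deletes on a snapshot-pinned table.
    Preconditions.checkArgument(
        snapshotId == null,
        "Cannot delete from table at a specific snapshot: %s", snapshotId);
    System.out.println("delete proceeds against the current table state");
  }

  public static void main(String[] args) {
    new SnapshotGuardDemo(null).deleteWhere();        // prints and returns normally
    new SnapshotGuardDemo(1234567890L).deleteWhere(); // throws IllegalArgumentException
  }
}
```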
@@ -72,6 +72,7 @@ public void testDeleteFromUnpartitionedTable() throws NoSuchTableException {
        0L, scalarSql("SELECT count(1) FROM %s", tableName));
  }

  @Test
  public void testDeleteFromTableAtSnapshot() throws NoSuchTableException {
    Assume.assumeFalse(
        "Spark session catalog does not support extended table names",
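
The body of the new test is collapsed in the diff above. Purely as a hypothetical sketch of how such a test might assert the new behavior, assuming the surrounding test base class provides sql(), tableName, validationCatalog, and tableIdent, and assuming Iceberg's snapshot_id_<id> extended table name syntax; the actual test in the PR may look different.

```java
// Hypothetical sketch only; the real test body is collapsed above and may differ.
@Test
public void testDeleteFromTableAtSnapshotSketch() throws NoSuchTableException {
  sql("CREATE TABLE %s (id bigint, data string) USING iceberg", tableName);
  sql("INSERT INTO %s VALUES (1, 'hr'), (2, 'hardware')", tableName);

  long snapshotId = validationCatalog.loadTable(tableIdent).currentSnapshot().snapshotId();

  try {
    // Pin the table to a snapshot via the extended table name and attempt a delete.
    sql("DELETE FROM %s.snapshot_id_%d WHERE id = 1", tableName, snapshotId);
    Assert.fail("DELETE against a snapshot-pinned table should have been rejected");
  } catch (Exception e) {
    // The guard added in this PR raises IllegalArgumentException with this message;
    // Spark may surface it wrapped in another exception type.
    Assert.assertTrue(
        e.toString().contains("Cannot delete from table at a specific snapshot"));
  }
}
```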