
Conversation

@singhpk234
Contributor

At present, the AS OF grammar is not supported in Spark SQL releases prior to 3.3 (it was added in Spark 3.3 via SPARK-37219).

On digging more into the code, I found a workaround adopted by the community until this grammar is supported in Spark itself: appending an at_timestamp_ or snapshot_id_ suffix to the table identifier.

```java
// Time travel via table-identifier suffixes (from Iceberg's Spark catalog code):
// an at_timestamp_<millis> suffix resolves to the snapshot current as of that time.
Matcher at = AT_TIMESTAMP.matcher(ident.name());
if (at.matches()) {
  long asOfTimestamp = Long.parseLong(at.group(1));
  return Pair.of(table, SnapshotUtil.snapshotIdAsOfTime(table, asOfTimestamp));
}

// A snapshot_id_<id> suffix selects that snapshot directly.
Matcher id = SNAPSHOT_ID.matcher(ident.name());
if (id.matches()) {
  long snapshotId = Long.parseLong(id.group(1));
  return Pair.of(table, snapshotId);
}
```

Presently the docs state "Time travel is not yet supported by Spark's SQL syntax," but this can be worked around with the functionality shown above. This PR attempts to document that.
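For illustration, here is a minimal, self-contained sketch of the suffix-parsing approach. The pattern definitions and the `parseSuffix` helper are assumptions made for this example; the real patterns and return types live inside Iceberg's Spark catalog code.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SnapshotSuffixDemo {
  // Hypothetical patterns mirroring the at_timestamp_ / snapshot_id_ convention;
  // the actual definitions are internal to Iceberg's Spark integration.
  private static final Pattern AT_TIMESTAMP = Pattern.compile("at_timestamp_(\\d+)");
  private static final Pattern SNAPSHOT_ID = Pattern.compile("snapshot_id_(\\d+)");

  // Returns the parsed timestamp (millis) or snapshot id,
  // or -1 if the identifier carries no time-travel suffix.
  static long parseSuffix(String identName) {
    Matcher at = AT_TIMESTAMP.matcher(identName);
    if (at.matches()) {
      return Long.parseLong(at.group(1)); // timestamp in milliseconds
    }
    Matcher id = SNAPSHOT_ID.matcher(identName);
    if (id.matches()) {
      return Long.parseLong(id.group(1)); // snapshot id
    }
    return -1L; // plain table name, no time travel requested
  }

  public static void main(String[] args) {
    System.out.println(parseSuffix("at_timestamp_1651708800000"));
    System.out.println(parseSuffix("snapshot_id_10963874102873"));
    System.out.println(parseSuffix("plain_table"));
  }
}
```

Because `matches()` requires the whole identifier to match the pattern, a regular table name falls through both checks and is looked up normally.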


cc @rdblue @jackye1995 @RussellSpitzer

github-actions bot added the docs label May 5, 2022
@singhpk234
Contributor Author

Looks like this was intended for internal use and we don't want the users to actually use it.

Referring to: #3269 (comment)

singhpk234 closed this May 5, 2022
@puchengy
Contributor

puchengy commented Jun 16, 2022

@singhpk234 Hi, I found this issue helpful, as I want to use time travel in SQL as well.

Can you explain this a little bit?

Looks like this was intended for internal use and we don't want the users to actually use it. refering : #3269 (comment)

I don't see the connection between your comment ("don't want users to actually use it") and the comment you referred to.

Do you mean this workaround is not guaranteed to be supported and might be dropped in the future?

@singhpk234
Contributor Author

singhpk234 commented Jun 16, 2022

@puchengy Hello, I was referring to this part of #3269 (comment), where @rdblue suggested:

We don't have to document that we support time travel through table identifiers so that we can remove it later and replace it with the AS OF syntax

Hence I closed this PR, considering that leaving it undocumented is intentional.

Do you mean this workaround is not guaranteed to be supported and might be dropped in the future?

As per my understanding of the comment, yes; hence, not documenting this avoids users taking a dependency on it.

The AS OF syntax is now present as of Spark 3.3, though I am not sure whether the Spark community plans to backport it to lower Spark versions such as 3.0, 3.1, and 3.2. I am also not sure whether the workaround will be dropped from Iceberg once AS OF is supported in those lower versions, or in 3.3. Maybe the authors/reviewers of the PRs have deeper insight into this.
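For reference, the Spark 3.3+ time-travel syntax added by SPARK-37219 looks roughly like this (a sketch with a hypothetical table name; consult the Spark SQL documentation for the exact forms):

```sql
-- Query the table as it was at a point in time (Spark 3.3+)
SELECT * FROM prod.db.sample TIMESTAMP AS OF '2022-05-05 10:00:00';

-- Query a specific version/snapshot of the table (Spark 3.3+)
SELECT * FROM prod.db.sample VERSION AS OF 10963874102873;
```

With this syntax available, the identifier-suffix workaround above is no longer needed on Spark 3.3+.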

@puchengy
Contributor

puchengy commented Jun 16, 2022

@singhpk234 got it, thanks for your response.

