
feat: add internal stream columns #13960

Merged · 2 commits into databendlabs:main on Dec 10, 2023

Conversation

@zhyass (Member) commented Dec 8, 2023

I hereby agree to the terms of the CLA available at: https://databend.rs/dev/policies/cla/

Summary

This PR adds the following internal stream columns:

  • change$action
  • change$is_update
  • change$row_id, computed as:

if(
    is_not_null(_origin_block_id),
    concat(to_uuid(_origin_block_id), lpad(hex(_origin_block_row_num), 6, '0')),
    _base_row_id
)
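
A minimal sketch of how the two pieces of the first branch compose, using a literal in place of to_uuid(_origin_block_id) (the block id value is taken from the query output below; concat, hex, and lpad are assumed to behave as in MySQL):

-- row 0 of a block: lpad(hex(0), 6, '0') yields '000000', so the result is
-- the 38-character id shown for a = 2 in the example below
select concat('a6073e4c752749e39b7061da478e8442', lpad(hex(0), 6, '0'));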
mysql> create table t(a int);
Query OK, 0 rows affected (0.05 sec)

mysql> insert into t values(1);
Query OK, 1 row affected (0.06 sec)

mysql> create stream s on table t;
Query OK, 0 rows affected (0.06 sec)

mysql> insert into t values(2);
Query OK, 1 row affected (0.05 sec)

mysql> select a, change$action, change$is_update, change$row_id from s order by a;
+------+---------------+------------------+----------------------------------------+
| a    | change$action | change$is_update | change$row_id                          |
+------+---------------+------------------+----------------------------------------+
|    2 | INSERT        |                0 | a6073e4c752749e39b7061da478e8442000000 |
+------+---------------+------------------+----------------------------------------+
1 row in set (0.12 sec)
Read 1 rows, 197.00 B in 0.030 sec., 33.52 rows/sec., 6.45 KiB/sec.

mysql> optimize table t compact;
Query OK, 2 rows affected (0.09 sec)

mysql> select a, change$action, change$is_update, change$row_id from s order by a;
+------+---------------+------------------+----------------------------------------+
| a    | change$action | change$is_update | change$row_id                          |
+------+---------------+------------------+----------------------------------------+
|    2 | INSERT        |                0 | a6073e4c752749e39b7061da478e8442000000 |
+------+---------------+------------------+----------------------------------------+
1 row in set (0.10 sec)
Read 2 rows, 76.00 B in 0.029 sec., 69.63 rows/sec., 2.58 KiB/sec.
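
Note that change$row_id is unchanged after the compaction: the id is derived from the origin block rather than the block's current physical location. As a hedged sketch of how the new columns might be consumed downstream (t_sink is a hypothetical target table, not part of this PR):

-- t_sink is a hypothetical table created beforehand with the same schema as t
insert into t_sink
select a
from s
where change$action = 'INSERT' and change$is_update = 0;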
  • Closes #issue


@github-actions github-actions bot added the pr-feature this PR introduces a new feature to the codebase label Dec 8, 2023
@zhyass zhyass marked this pull request as draft December 8, 2023 06:09
@zhyass zhyass force-pushed the feat_stream branch 3 times, most recently from 24300ea to c748dd1 Compare December 8, 2023 15:13
@zhyass zhyass marked this pull request as ready for review December 8, 2023 15:30
@zhyass zhyass force-pushed the feat_stream branch 2 times, most recently from 9a580a3 to c972a34 Compare December 9, 2023 16:32
@BohuTANG BohuTANG added the ci-cloud Build docker image for cloud test label Dec 10, 2023


@zhyass zhyass added ci-cloud Build docker image for cloud test and removed ci-cloud Build docker image for cloud test labels Dec 10, 2023
A contributor commented:

Docker Image for PR

  • tag: pr-13960-c3be3f8

note: this image tag is only available for internal use,
please check the internal doc for more details.

@BohuTANG (Member) commented:

stream wizard passed: https://github.com/datafuselabs/wizard/tree/main/checksb/sql/streams

@BohuTANG BohuTANG merged commit 993f03c into databendlabs:main Dec 10, 2023
71 checks passed
@BohuTANG (Member) commented:

cc @soyeric128 to document more columns of the stream, as Snowflake does.
