ci: add lychee links checker with cache (#14100)
* ci: add lychee links checker with cache

Signed-off-by: Chojan Shang <[email protected]>

* chore: fix links

Signed-off-by: Chojan Shang <[email protected]>

* chore: fix fmt

Signed-off-by: Chojan Shang <[email protected]>

---------

Signed-off-by: Chojan Shang <[email protected]>
PsiACE authored Dec 21, 2023
1 parent 9a58e79 commit 6b67bf5
Showing 7 changed files with 84 additions and 45 deletions.
42 changes: 42 additions & 0 deletions .github/workflows/links.yml
@@ -0,0 +1,42 @@
```yaml
name: Links

on:
  repository_dispatch:
  workflow_dispatch:
  schedule:
    - cron: "00 18 * * *"

jobs:
  linkChecker:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Restore lychee cache
        id: restore-cache
        uses: actions/cache/restore@v3
        with:
          path: .lycheecache
          key: cache-lychee-${{ github.sha }}
          restore-keys: cache-lychee-

      - name: Link Checker
        id: lychee
        uses: lycheeverse/[email protected]
        with:
          args: "--base . --cache --max-cache-age 1d . --exclude 'https?://twitter\\.com(?:/.*$)?$'"

      - name: Save lychee cache
        uses: actions/cache/save@v3
        if: always()
        with:
          path: .lycheecache
          key: ${{ steps.restore-cache.outputs.cache-primary-key }}

      - name: Create Issue From File
        if: env.lychee_exit_code != 0
        uses: peter-evans/create-issue-from-file@v4
        with:
          title: Link Checker Report
          content-filepath: ./lychee/out.md
          labels: report, automated issue
```
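The `--exclude` argument the workflow passes to lychee is a regular expression that skips `twitter.com` links, which commonly fail automated checks behind a login wall. A quick sanity check of that pattern (a sketch using Python's `re`; lychee itself uses Rust's regex engine, which agrees with Python for this pattern):

```python
import re

# The exclusion pattern from the workflow's `args`, unescaped from YAML.
pattern = re.compile(r"https?://twitter\.com(?:/.*$)?$")

urls = [
    "https://twitter.com/DatabendLabs/",   # excluded from checking
    "http://twitter.com",                  # excluded from checking
    "https://docs.databend.com/guides/",   # still checked
]

# lychee skips any URL the exclude pattern matches.
excluded = [u for u in urls if pattern.search(u)]
print(excluded)
```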
2 changes: 1 addition & 1 deletion .github/workflows/pr.yml
@@ -143,7 +143,7 @@ jobs:
token: ${{ github.token }}
identifier: 'pr-assistant-cla'
body: |
-Pull request description must contain [CLA](https://docs.databend.com/doc/contributing/good-pr) like the following:
+Pull request description must contain [CLA](https://docs.databend.com/dev/policies/cla/) like the following:
```
I hereby agree to the terms of the CLA available at: https://docs.databend.com/dev/policies/cla/
3 changes: 3 additions & 0 deletions .gitignore
@@ -75,3 +75,6 @@ benchmark/clickbench/results

# z3
**/.z3-trace
+
+# lychee
+.lycheecache
72 changes: 36 additions & 36 deletions README.md
@@ -3,8 +3,8 @@
<div align="center">

<h4 align="center">
-<a href="https://docs.databend.com/doc/cloud">Databend Serverless Cloud (beta)</a> |
-<a href="https://docs.databend.com/doc">Documentation</a> |
+<a href="https://docs.databend.com/guides/cloud">Databend Serverless Cloud (beta)</a> |
+<a href="https://docs.databend.com/">Documentation</a> |
<a href="https://benchmark.clickhouse.com/">Benchmarking</a> |
<a href="https://github.com/datafuselabs/databend/issues/11868">Roadmap (v1.3)</a>

@@ -50,7 +50,7 @@

- **Rich Data Support**: Handles diverse data formats and types, including JSON, CSV, Parquet, ARRAY, TUPLE, and MAP.

-- **AI-Enhanced Analytics**: Offers advanced analytics capabilities with integrated [AI Functions](https://docs.databend.com/doc/sql-functions/ai-functions/).
+- **AI-Enhanced Analytics**: Offers advanced analytics capabilities with integrated [AI Functions](https://docs.databend.com/sql/sql-functions/ai-functions/).

- **Community-Driven**: Benefit from a friendly, growing community that offers an easy-to-use platform for all your cloud analytics.

@@ -82,58 +82,58 @@ docker run --net=host datafuselabs/databend
<details>
<summary>Deploying Databend</summary>

-- [Understanding Deployment Modes](https://docs.databend.com/doc/deploy/understanding-deployment-modes)
-- [Deploying a Standalone Databend](https://docs.databend.com/doc/deploy/deploying-databend)
-- [Expanding a Standalone Databend](https://docs.databend.com/doc/deploy/expanding-to-a-databend-cluster)
-- [Databend Cloud (Beta)](https://docs.databend.com/cloud)
+- [Understanding Deployment Modes](https://docs.databend.com/guides/deploy/understanding-deployment-modes)
+- [Deploying a Standalone Databend](https://docs.databend.com/guides/deploy/deploying-databend)
+- [Expanding a Standalone Databend](https://docs.databend.com/guides/deploy/expanding-to-a-databend-cluster)
+- [Databend Cloud (Beta)](https://docs.databend.com/guides/cloud)
</details>

<details>
<summary>Connecting to Databend</summary>

-- [Connecting to Databend with BendSQL](https://docs.databend.com/doc/sql-clients/bendsql)
-- [Connecting to Databend with JDBC](https://docs.databend.com/doc/sql-clients/jdbc)
-- [Connecting to Databend with MySQL-Compatible Clients](https://docs.databend.com/doc/sql-clients/mysql)
+- [Connecting to Databend with BendSQL](https://docs.databend.com/guides/sql-clients/bendsql)
+- [Connecting to Databend with JDBC](https://docs.databend.com/guides/sql-clients/jdbc)
+- [Connecting to Databend with MySQL-Compatible Clients](https://docs.databend.com/guides/sql-clients/mysql)

</details>

<details>
<summary>Loading Data into Databend</summary>

-- [How to Load Data from Local File](https://docs.databend.com/doc/load-data/load/local)
-- [How to Load Data from Bucket](https://docs.databend.com/doc/load-data/load/s3)
-- [How to Load Data from Stage](https://docs.databend.com/doc/load-data/load/stage)
-- [How to Load Data from Remote Files](https://docs.databend.com/doc/load-data/load/http)
-- [Querying Data in Staged Files](https://docs.databend.com/doc/load-data/transform/querying-stage)
-- [Transforming Data During a Load](https://docs.databend.com/doc/load-data/transform/data-load-transform)
-- [How to Unload Data from Databend](https://docs.databend.com/doc/load-data/unload)
+- [How to Load Data from Local File](https://docs.databend.com/guides/load-data/load/local)
+- [How to Load Data from Bucket](https://docs.databend.com/guides/load-data/load/s3)
+- [How to Load Data from Stage](https://docs.databend.com/guides/load-data/load/stage)
+- [How to Load Data from Remote Files](https://docs.databend.com/guides/load-data/load/http)
+- [Querying Data in Staged Files](https://docs.databend.com/guides/load-data/transform/querying-stage)
+- [Transforming Data During a Load](https://docs.databend.com/guides/load-data/transform/data-load-transform)
+- [How to Unload Data from Databend](https://docs.databend.com/guides/unload-data/)

</details>

<details>
<summary>Loading Data Tools with Databend</summary>

-- [Apache Kafka](https://docs.databend.com/doc/load-data/load-db/kafka)
-- [Airbyte](https://docs.databend.com/doc/load-data/load-db/airbyte)
-- [dbt](https://docs.databend.com/doc/load-data/load-db/dbt)
-- [Debezium](https://docs.databend.com/doc/load-data/load-db/debezium)
-- [Apache Flink CDC](https://docs.databend.com/doc/load-data/load-db/flink-cdc)
-- [DataDog Vector](https://docs.databend.com/doc/load-data/load-db/vector)
-- [Addax](https://docs.databend.com/doc/load-data/load-db/addax)
-- [DataX](https://docs.databend.com/doc/load-data/load-db/datax)
+- [Apache Kafka](https://docs.databend.com/guides/load-data/load-db/kafka)
+- [Airbyte](https://docs.databend.com/guides/load-data/load-db/airbyte)
+- [dbt](https://docs.databend.com/guides/load-data/load-db/dbt)
+- [Debezium](https://docs.databend.com/guides/load-data/load-db/debezium)
+- [Apache Flink CDC](https://docs.databend.com/guides/load-data/load-db/flink-cdc)
+- [DataDog Vector](https://docs.databend.com/guides/load-data/load-db/vector)
+- [Addax](https://docs.databend.com/guides/load-data/load-db/addax)
+- [DataX](https://docs.databend.com/guides/load-data/load-db/datax)

</details>

<details>
<summary>Visualize Tools with Databend</summary>

-- [Metabase](https://docs.databend.com/doc/visualize/metabase)
-- [Tableau](https://docs.databend.com/doc/visualize/tableau)
-- [Grafana](https://docs.databend.com/doc/visualize/grafana)
-- [Jupyter Notebook](https://docs.databend.com/doc/visualize/jupyter)
-- [Deepnote](https://docs.databend.com/doc/visualize/deepnote)
-- [MindsDB](https://docs.databend.com/doc/visualize/mindsdb)
-- [Redash](https://docs.databend.com/doc/visualize/redash)
+- [Metabase](https://docs.databend.com/guides/visualize/metabase)
+- [Tableau](https://docs.databend.com/guides/visualize/tableau)
+- [Grafana](https://docs.databend.com/guides/visualize/grafana)
+- [Jupyter Notebook](https://docs.databend.com/guides/visualize/jupyter)
+- [Deepnote](https://docs.databend.com/guides/visualize/deepnote)
+- [MindsDB](https://docs.databend.com/guides/visualize/mindsdb)
+- [Redash](https://docs.databend.com/guides/visualize/redash)

</details>

@@ -226,8 +226,8 @@ Databend thrives on community contributions! Whether it's through ideas, code, o

Here are some resources to help you get started:

-- [Building Databend From Source](https://docs.databend.com/doc/overview/community/contributor/building-from-source)
-- [The First Good Pull Request](https://docs.databend.com/doc/overview/community/contributor/good-pr)
+- [Building Databend From Source](https://docs.databend.com/guides/overview/community/contributor/building-from-source)
+- [The First Good Pull Request](https://docs.databend.com/guides/overview/community/contributor/good-pr)


## 👥 Community
@@ -236,7 +236,7 @@ For guidance on using Databend, we recommend starting with the official document

- [Slack](https://link.databend.rs/join-slack) (For live discussion with the Community)
- [GitHub](https://github.com/datafuselabs/databend) (Feature/Bug reports, Contributions)
-- [Twitter](https://twitter.com/DatabendLabs) (Get the news fast)
+- [Twitter](https://twitter.com/DatabendLabs/) (Get the news fast)
- [I'm feeling lucky](https://link.databend.rs/i-m-feeling-lucky) (Pick up a good first issue now!)


@@ -258,7 +258,7 @@ Databend is released under a combination of two licenses: the [Apache License 2.

When contributing to Databend, you can find the relevant license header in each file.

-For more information, see the [LICENSE](LICENSE) file and [Licensing FAQs](https://docs.databend.com/doc/enterprise/license).
+For more information, see the [LICENSE](LICENSE) file and [Licensing FAQs](https://docs.databend.com/guides/overview/editions/dee/license).


## 🙏 Acknowledgement
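Most of the README link changes above follow a small set of prefix rewrites from the retired `/doc/` tree into the new docs layout. A hypothetical migration helper sketching that mapping (the prefix table is inferred from this diff, not an official list; a few links, such as the top-level Documentation link, were remapped by hand):

```python
# Old-prefix -> new-prefix rewrites observed in this commit's README diff.
# Order matters: longer, more specific prefixes must be tried first.
REWRITES = [
    ("https://docs.databend.com/doc/sql-functions/", "https://docs.databend.com/sql/sql-functions/"),
    ("https://docs.databend.com/doc/cloud", "https://docs.databend.com/guides/cloud"),
    ("https://docs.databend.com/doc/", "https://docs.databend.com/guides/"),
]

def rewrite(url: str) -> str:
    """Return the migrated URL, or the input unchanged if no prefix matches."""
    for old, new in REWRITES:
        if url.startswith(old):
            return new + url[len(old):]
    return url
```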
2 changes: 1 addition & 1 deletion benchmark/tpch/README.md
@@ -33,4 +33,4 @@ databend-sqllogictests --handlers mysql --database tpch --run_dir tpch --bench

## More

-[Benchmarking Databend using TPC-H](https://databend.rs/blog/2022/08/08/benchmark-tpc-h)
+[Benchmarking Databend using TPC-H](https://www.databend.com/blog/2022/08/08/benchmark-tpc-h)
2 changes: 0 additions & 2 deletions src/common/README.md
@@ -5,13 +5,11 @@
- [`base`](./base/) contains runtime, pool, allocator and rangemap.
- [`building`](./building/) sets up the environment for building components and internal use.
- [`cache`](./cache/) contains cache traits designed for memory and disk, and provides a basic LRU implementation.
-- [`contexts`](./contexts/) is the context of the data access layer.
- [`exception`](./exception/), error handling and backtracking.
- [`grpc`](./grpc/) wraps some of the utility code snippets for grpc.
- [`hashtable`](./hashtable/), a linear probe hashtable, mainly used in scenarios such as `group by` aggregation functions and `join`.
- [`http`](./http/) is a common http handler that includes health check, cpu/memory profile and graceful shutdown.
- [`io`](./io/) focuses on binary serialisation and deserialisation.
-- [`macros`](./macros/) are some of the procedural macros used with `common_base::base::Runtime`
- [`metrics`](./metrics/) takes over the initialization of the `PrometheusRecorder` and owns the `PrometheusHandle`.
- [`storage`](./storage/) provides storage related types and functions.
- [`tracing`](./tracing/) handles logging and tracing.
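The `hashtable` crate described above is a linear-probe table tuned for `GROUP BY` aggregation and `JOIN`. The core probing idea, as a minimal Python sketch (illustrative only; the real crate is Rust and considerably more involved):

```python
class LinearProbeMap:
    """Toy open-addressing hash map using linear probing."""

    def __init__(self, capacity: int = 8):
        self._slots = [None] * capacity  # each slot: None or a (key, value) pair

    def _probe(self, key):
        # Walk slots from the hash position until we hit the key or an empty slot.
        start = hash(key) % len(self._slots)
        for i in range(len(self._slots)):
            idx = (start + i) % len(self._slots)
            if self._slots[idx] is None or self._slots[idx][0] == key:
                return idx
        raise RuntimeError("table full; a real implementation would resize")

    def insert(self, key, value):
        self._slots[self._probe(key)] = (key, value)

    def get(self, key, default=None):
        slot = self._slots[self._probe(key)]
        return slot[1] if slot is not None else default
```

In an aggregation pipeline, each input row becomes an "insert or update" against such a table keyed by the group-by columns.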
6 changes: 1 addition & 5 deletions src/query/README.md
@@ -6,19 +6,15 @@ Databend Query is a Distributed Query Engine at scale.
- [`catalog`](./catalog/) contains structures and traits for catalogs management, `Catalog`, `Database`, `Table` and `TableContext`.
- [`codegen`](./codegen/) is used to generate the arithmetic result type.
- [`config`](./config/) provides config support for databend query.
-- [`datablocks`](./datablocks/), `Vec` collection, which encapsulates some common methods, and will be gradually migrated to `expressions`.
-- [`datavalues`](./datavalues/), the definition of each type of Column, which represents the layout of data in memory, will be gradually migrated to `expressions`.
- [`expression`](./expression/), the new scalar expression framework with expression definition (AST), type checking, and evaluation runtime.
- [`formats`](./formats/), the serialization and deserialization of data in various formats to the outside.
-- [`functions`](./functions/), scalar functions and aggregate functions, etc., will be gradually migrated to `functions-v2`.
-- [`functions-v2`](./functions-v2/), scalar functions and aggregate functions, etc., based on `expression`.
+- [`functions`](./functions/), scalar functions and aggregate functions, etc.
- [`management`](./management/) for clusters, quotas, etc.
- [`pipeline`](./pipeline/) implements the scheduling framework for physical operators.
- [`planners`](./planners/) builds an execution plan from the user's SQL statement and represents the query with different types of relational operators.
- [`service`](./service/) -> `databend-query`, the query service library of Databend.
- [`settings`](./settings/), global and session level settings.
- [`storages`](./storages/) relates to table engines, including the commonly used fuse engine and indexes etc.
-- [`streams`](./streams/) contains data sources and streams.
- [`users`](./users/), role-based access and control.
- [`ee`](ee/) contains enterprise functionalities.
- [`ee-features`](ee_features/) contains enterprise features.
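The `expression` crate listed above bundles AST definition, type checking, and an evaluation runtime into one framework. A toy sketch of that three-stage split (hypothetical names; not Databend's actual API):

```python
from dataclasses import dataclass

# Stage 1: AST definition.
@dataclass
class Lit:
    value: object

@dataclass
class Add:
    left: object
    right: object

# Stage 2: type checking, done once before any row is evaluated.
def type_check(expr) -> type:
    if isinstance(expr, Lit):
        return type(expr.value)
    if isinstance(expr, Add):
        lt, rt = type_check(expr.left), type_check(expr.right)
        if lt is not rt:
            raise TypeError(f"cannot add {lt.__name__} and {rt.__name__}")
        return lt
    raise TypeError("unknown expression node")

# Stage 3: evaluation runtime, run on an already type-checked tree.
def evaluate(expr):
    if isinstance(expr, Lit):
        return expr.value
    if isinstance(expr, Add):
        return evaluate(expr.left) + evaluate(expr.right)
```

Separating the check from the evaluation means per-row execution never revisits typing decisions, which is the usual motivation for this structure in query engines.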
