Merged
16 changes: 6 additions & 10 deletions plugin/trino-bigquery/README.md
@@ -1,7 +1,7 @@
 # BigQuery Connector Developer Notes
 
 The BigQuery connector module has both unit tests and integration tests.
-The integration tests require access to a BigQuery instance in Google Cloud seeded with TPCH data.
+The integration tests require access to a BigQuery instance in Google Cloud.
 You can follow the steps below to be able to run the integration tests locally.
 
 ## Requirements
@@ -15,17 +15,13 @@ You can follow the steps below to be able to run the integration tests locally.
 
 * [Enable BigQuery in your Google Cloud account](https://console.cloud.google.com/flows/enableapi?apiid=bigquery).
 * Build the project by following the instructions [here](../../README.md).
-* Run Trino with the TPCH connector installed using Docker as `docker run --rm --name trino -it -p 8080:8080
-  trinodb/trino:latest`.
-* Run the script `plugin/trino-bigquery/bin/import-tpch-to-bigquery.sh` and pass your Google Cloud project id to it as
-  an argument, e.g. `plugin/trino-bigquery/bin/import-tpch-to-bigquery.sh trino-bigquery-07` where `trino-bigquery-07`
-  is your Google Cloud project id.
-* Run `gsutil cp src/test/resources/region.csv gs://DESTINATION_BUCKET_NAME/tpch/tiny/region.csv`
+* Create a Google Cloud Storage bucket using `gsutil mb gs://DESTINATION_BUCKET_NAME`
+* Run `gsutil cp plugin/trino-bigquery/src/test/resources/region.csv gs://DESTINATION_BUCKET_NAME/tpch/tiny/region.csv`
   (replace `DESTINATION_BUCKET_NAME` with the target bucket name).
-* [Create a service account](https://cloud.google.com/docs/authentication/getting-started) in Google Cloud with the
+* [Create a service account](https://cloud.google.com/iam/docs/creating-managing-service-accounts#iam-service-accounts-create-console) in Google Cloud with the
   **BigQuery Admin** role assigned.
 * Get the base64 encoded text of the service account credentials file using `base64
   /path/to/service_account_credentials.json`.
-* Set the VM option `bigquery.credentials-key` in the IntelliJ "Run Configuration" (or on the CLI if using Maven
-  directly). It should look something like `-Dbigquery.credentials-key=base64-text`.
+* Set the VM options `bigquery.credentials-key` and `testing.gcp-storage-bucket` in the IntelliJ "Run Configuration" (or on the CLI if using Maven
+  directly). It should look something like `-Dbigquery.credentials-key=base64-text -Dtesting.gcp-storage-bucket=DESTINATION_BUCKET_NAME`.
 * Run any test of your choice.
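The credential-encoding and VM-option steps from the updated README can be sketched end to end in a short shell snippet. This is a minimal illustration, not part of the PR: the key-file contents and path are placeholders, and `DESTINATION_BUCKET_NAME` stands in for a real bucket name.

```shell
# Stand-in for a real service account key file (placeholder contents).
CREDS_FILE=service_account_credentials.json
printf '{"type": "service_account"}' > "$CREDS_FILE"

# Encode the key file as a single base64 line; `tr` strips the line
# wrapping that GNU base64 inserts after 76 characters.
CREDENTIALS_KEY=$(base64 < "$CREDS_FILE" | tr -d '\n')

# The VM options to paste into the IntelliJ "Run Configuration"
# or append to a Maven command line.
echo "-Dbigquery.credentials-key=$CREDENTIALS_KEY -Dtesting.gcp-storage-bucket=DESTINATION_BUCKET_NAME"

# Sanity check: decoding the key reproduces the original file contents.
echo "$CREDENTIALS_KEY" | base64 -d
```

On GNU systems `base64 -w 0` produces the same single-line output directly; the `tr` form above also works with the BSD/macOS `base64`, which does not wrap by default.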
43 changes: 0 additions & 43 deletions plugin/trino-bigquery/bin/import-tpch-to-bigquery.sh

This file was deleted.

42 changes: 0 additions & 42 deletions plugin/trino-bigquery/bin/schema/customer.json

This file was deleted.

82 changes: 0 additions & 82 deletions plugin/trino-bigquery/bin/schema/lineitem.json

This file was deleted.

22 changes: 0 additions & 22 deletions plugin/trino-bigquery/bin/schema/nation.json

This file was deleted.

47 changes: 0 additions & 47 deletions plugin/trino-bigquery/bin/schema/orders.json

This file was deleted.

47 changes: 0 additions & 47 deletions plugin/trino-bigquery/bin/schema/part.json

This file was deleted.

27 changes: 0 additions & 27 deletions plugin/trino-bigquery/bin/schema/partsupp.json

This file was deleted.

17 changes: 0 additions & 17 deletions plugin/trino-bigquery/bin/schema/region.json

This file was deleted.
