diff --git a/mmv1/third_party/terraform/website/docs/r/bigquery_table.html.markdown b/mmv1/third_party/terraform/website/docs/r/bigquery_table.html.markdown
index 2b4fec268537..0e277c0f1d40 100644
--- a/mmv1/third_party/terraform/website/docs/r/bigquery_table.html.markdown
+++ b/mmv1/third_party/terraform/website/docs/r/bigquery_table.html.markdown
@@ -14,7 +14,6 @@ Creates a table resource in a dataset for Google BigQuery. For more information
 (and run `terraform apply` to write the field to state) in order to destroy an instance.
 It is recommended to not set this field (or set it to true) until you're ready to destroy.
 
-
 ## Example Usage
 
 ```hcl
@@ -115,7 +114,7 @@ The following arguments are supported:
 
 * `labels` - (Optional) A mapping of labels to assign to the resource.
 
-* `schema` - (Optional) A JSON schema for the table. 
+* `schema` - (Optional) A JSON schema for the table.
 
   ~>**NOTE:** Because this field expects a JSON string, any changes to the
   string will create a diff, even if the JSON itself hasn't changed.
@@ -124,8 +123,11 @@ The following arguments are supported:
   field type, we currently cannot suppress the recurring diff this causes. As a
   workaround, we recommend using the schema as returned by the API.
 
-  ~>**NOTE:** When setting `schema` for `external_data_configuration`, please use
-  `external_data_configuration.schema` [documented below](#nested_external_data_configuration).
+  ~>**NOTE:** If you use `external_data_configuration`
+  [documented below](#nested_external_data_configuration) and do **not** set
+  `external_data_configuration.connection_id`, schemas must be specified
+  with `external_data_configuration.schema`. Otherwise, schemas must be
+  specified with this top-level field.
 
 * `time_partitioning` - (Optional) If specified, configures time-based
   partitioning for this table. Structure is [documented below](#nested_time_partitioning).
@@ -159,6 +161,10 @@ in Terraform state, a `terraform destroy` or `terraform apply` that would delete
   the form `{{project}}.{{location}}.{{connection_id}}`
   or `projects/{{project}}/locations/{{location}}/connections/{{connection_id}}`.
 
+  ~>**NOTE:** If you set `external_data_configuration.connection_id`, the
+  table schema must be specified using the top-level `schema` field
+  [documented above](#schema).
+
 * `csv_options` (Optional) - Additional properties to set if
   `source_format` is set to "CSV". Structure is [documented below](#nested_csv_options).
 
@@ -177,7 +183,7 @@ in Terraform state, a `terraform destroy` or `terraform apply` that would delete
   partitioning on an unsupported format will lead to an error, as will providing
   an invalid specification. Structure is [documented below](#nested_hive_partitioning_options).
 
-* `avro_options` (Optional) - Additional options if `source_format` is set to 
+* `avro_options` (Optional) - Additional options if `source_format` is set to
   "AVRO". Structure is [documented below](#nested_avro_options).
 
 * `ignore_unknown_values` (Optional) - Indicates if BigQuery should
@@ -202,6 +208,10 @@ in Terraform state, a `terraform destroy` or `terraform apply` that would delete
   datasource, after creation the computed schema will be stored in
   `google_bigquery_table.schema`
 
+  ~>**NOTE:** If you set `external_data_configuration.connection_id`, the
+  table schema must be specified using the top-level `schema` field
+  [documented above](#schema).
+
 * `source_format` (Optional) - The data format.
   Please see sourceFormat under [ExternalDataConfiguration](https://cloud.google.com/bigquery/docs/reference/rest/v2/tables#externaldataconfiguration)
   in Bigquery's public API documentation for supported formats. To use "GOOGLE_SHEETS"
@@ -286,8 +296,8 @@ in Terraform state, a `terraform destroy` or `terraform apply` that would delete
 
 The `avro_options` block supports:
 
-* `use_avro_logical_types` (Optional) - If is set to true, indicates whether 
-  to interpret logical types as the corresponding BigQuery data type 
+* `use_avro_logical_types` (Optional) - If set to true, indicates whether
+  to interpret logical types as the corresponding BigQuery data type
   (for example, TIMESTAMP), instead of using the raw type (for example, INTEGER).
 
 The `parquet_options` block supports:
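
For reference, the rule described by the notes added above can be expressed as two variants of `google_bigquery_table`. The following is a minimal, hypothetical sketch: the project, dataset, bucket, and connection names are invented, and `google_bigquery_dataset.example` is assumed to be defined elsewhere. When `external_data_configuration.connection_id` is set, the schema lives in the top-level `schema` field; when it is not, the schema goes in `external_data_configuration.schema`.

```hcl
# Hypothetical sketch only: names, project, bucket, and connection are invented,
# and google_bigquery_dataset.example is assumed to be defined elsewhere.

# Case 1: connection_id is set, so the schema must be the top-level field.
resource "google_bigquery_table" "external_with_connection" {
  dataset_id = google_bigquery_dataset.example.dataset_id
  table_id   = "external_with_connection"

  # Top-level schema, required because connection_id is present below.
  schema = jsonencode([
    { name = "id", type = "INTEGER", mode = "NULLABLE" },
    { name = "name", type = "STRING", mode = "NULLABLE" },
  ])

  external_data_configuration {
    autodetect    = false
    source_format = "CSV"
    source_uris   = ["gs://my-bucket/data/*.csv"]

    # {{project}}.{{location}}.{{connection_id}}
    connection_id = "my-project.US.my-connection"
  }
}

# Case 2: no connection_id, so the schema must be set inside
# external_data_configuration instead of at the top level.
resource "google_bigquery_table" "external_without_connection" {
  dataset_id = google_bigquery_dataset.example.dataset_id
  table_id   = "external_without_connection"

  external_data_configuration {
    autodetect    = false
    source_format = "CSV"
    source_uris   = ["gs://my-bucket/data/*.csv"]

    schema = jsonencode([
      { name = "id", type = "INTEGER", mode = "NULLABLE" },
      { name = "name", type = "STRING", mode = "NULLABLE" },
    ])
  }
}
```

Mixing the two placements, such as setting `connection_id` while declaring the schema in `external_data_configuration.schema`, is exactly the combination the new notes warn against.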