google_bigquery_data_transfer_config, change destination_dataset_id to **Optional** #9450
Comments
@MarcinKowalski101 help me to understand: how does it work if you don't specify the destination_dataset_id? Which dataset will the temporary table reside in?
Hi
@MarcinKowalski101 did you run the
Hi
@MarcinKowalski101 I'd like to see where the
Hi
@MarcinKowalski101 I am trying to understand what you asked below. I have never seen a
OK, we can continue this academic discussion, or we can get back to the main subject and answer the main question: can you fix the bug or not?
@MarcinKowalski101 to me this is not a bug, and destination_dataset_id cannot be optional. Without the dataset specified, where does the table reside? I could be wrong. If that is the case, please provide the docs and/or steps that show creating a table without a dataset. Making sense?
I provided all the necessary documentation, and I also consulted this case with GCP support to be 100% sure that this field is not mandatory, meaning it is optional. All the steps to recreate this bug are in the example. So if you don't know how to handle it, please just note in the documentation that Terraform has a limitation in this use case, meaning that if anyone wants to call a procedure in scheduled queries, Terraform can't be used; instead you need to implement it directly from the GCP console.
@megan07 what do you think about this issue?
@edwardmedia - thanks for taking a look into this! @MarcinKowalski101, sorry you're experiencing this issue. We work on numerous different resources each day, and it's sometimes hard to find the exact documentation we need, so the sooner you can provide us with that documentation, the quicker we can help you. After some digging, this looks specific to data transfers of datasource type
I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.
Community Note
If an issue is assigned to the modular-magician user, it is either in the process of being autogenerated, or is planned to be autogenerated soon. If an issue is assigned to a user, that user is claiming responsibility for the issue. If an issue is assigned to hashibot, a community member has claimed the issue already.
Terraform Version
Terraform v0.14.6
provider registry.terraform.io/hashicorp/google v3.73.0
Affected Resource(s)
google_bigquery_data_transfer_config
Expected Behavior
destination_dataset_id - (Optional) The BigQuery target dataset id.
query_configs = [
{
data_source_id = "scheduled_query"
location = "europe-west3"
display_name = "DEV call stored procedure without destination"
schedule = "every 20 minutes"
destination_dataset_id = null
params = {
query = "call DEV_BQ.merge()",
}
}
]
This is because the flag --target_dataset requires the query to be either DDL or DML (the CALL statement is neither DDL nor DML).
If we don't specify a destination table, the query result will be saved to a temporary table, so the flag --target_dataset isn't required.
bq query \
--use_legacy_sql=false \
--location=europe-west3 \
--display_name='DEV call stored procedure without destination' \
--schedule='every 20 minutes' \
'call DEV_BQ.merge()'
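For illustration, here is a minimal sketch of how this could look as a full google_bigquery_data_transfer_config resource once destination_dataset_id is optional; the resource label is an assumption, the values are taken from the examples above:

resource "google_bigquery_data_transfer_config" "dev_call_merge" {
  display_name   = "DEV call stored procedure without destination"
  location       = "europe-west3"
  data_source_id = "scheduled_query"
  schedule       = "every 20 minutes"

  # destination_dataset_id is deliberately omitted: the CALL statement is
  # neither DDL nor DML, so no target dataset applies and the result of
  # each run goes to a temporary table.

  params = {
    query = "call DEV_BQ.merge()"
  }
}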
Actual Behavior
destination_dataset_id - (Required) The BigQuery target dataset id.
query_configs = [
{
data_source_id = "scheduled_query"
location = "europe-west3"
display_name = "DEV call stored procedure without destination"
schedule = "every 20 minutes"
destination_dataset_id = "DEV_BQ"
params = {
query = "call DEV_BQ.merge()",
}
}
]
We got the error message:
Dataset specified in the query ('') is not consistent with Destination dataset 'DEV_BQ'.
bq query \
--use_legacy_sql=false \
--location=europe-west3 \
--display_name='DEV call stored procedure without destination' \
--schedule='every 20 minutes' \
--target_dataset='DEV_BQ' \
'call DEV_BQ.merge()'
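For comparison, a sketch of the same configuration as a resource block with destination_dataset_id set, which is what the provider currently requires and what produced the error shown above; the resource label is an assumption:

resource "google_bigquery_data_transfer_config" "dev_call_merge" {
  display_name           = "DEV call stored procedure without destination"
  location               = "europe-west3"
  data_source_id         = "scheduled_query"
  schedule               = "every 20 minutes"
  destination_dataset_id = "DEV_BQ" # currently a required argument in the provider

  params = {
    query = "call DEV_BQ.merge()"
  }
}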
References