
Support BigQuery authorized routines #6680

Merged (11 commits from add-bq-authorized-routine, Nov 8, 2022)
Conversation

@Tei1988 (Contributor) commented Oct 11, 2022

Add support for authorized routines on BigQuery.

Fixes hashicorp/terraform-provider-google#11501.

I tested assigning routines as authorized routines to datasets with a locally built provider, but I was not able to run the acceptance tests.

If this PR is for Terraform, I acknowledge that I have:

  • Searched through the issue tracker for an open issue that this either resolves or contributes to, commented on it to claim it, and written "fixes {url}" or "part of {url}" in this PR description. If there were no relevant open issues, I opened one and commented that I would like to work on it (not necessary for very small changes).
  • Generated Terraform, and ran make test and make lint to ensure it passes unit and linter tests.
  • Ensured that all new fields I added that can be set by a user appear in at least one example (for generated resources) or third_party test (for handwritten resources or update tests).
  • Ran relevant acceptance tests (If the acceptance tests do not yet pass or you are unable to run them, please let your reviewer know).
  • Read the Release Notes Guide before writing my release note below.

Release Note Template for Downstream PRs (will be copied)

bigquery: supported authorized routines on resource `bigquery_dataset` and `bigquery_dataset_access`
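
As a reference, a minimal sketch of the configuration surface this PR adds, in its two alternative forms (use one or the other). The routine block fields match the plan output shown later in this thread; the resource and dataset names are illustrative:

# Authoritative form: routine access declared on the dataset itself.
resource "google_bigquery_dataset" "private" {
  dataset_id = "private_dataset" # illustrative

  access {
    routine {
      project_id = google_bigquery_routine.public.project
      dataset_id = google_bigquery_routine.public.dataset_id
      routine_id = google_bigquery_routine.public.routine_id
    }
  }
}

# Fine-grained form: the same authorization as a standalone access resource.
resource "google_bigquery_dataset_access" "authorized_routine" {
  dataset_id = google_bigquery_dataset.private.dataset_id

  routine {
    project_id = google_bigquery_routine.public.project
    dataset_id = google_bigquery_routine.public.dataset_id
    routine_id = google_bigquery_routine.public.routine_id
  }
}

Note that, as the review below discusses, declaring only a routine entry on google_bigquery_dataset can surface a permadiff, because the API also returns the dataset's default access entries.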

@modular-magician (Collaborator)

Hello! I am a robot who works on Magic Modules PRs.

I've detected that you're a community contributor. @melinath, a repository maintainer, has been assigned to assist you and help review your changes.

❓ First time contributing? More details below.

Your assigned reviewer will help review your code by:

  • Ensuring it's backwards compatible, covers common error cases, etc.
  • Summarizing the change into a user-facing changelog note.
  • Ensuring it passes tests, either via our "VCR" suite, a set of presubmit tests, or manual test runs.

You can help make sure that review is quick by running local tests and ensuring they're passing in between each push you make to your PR's branch. Also, try to leave a comment with each push you make, as pushes generally don't generate emails.

If your reviewer doesn't get back to you within a week after your most recent change, please feel free to leave a comment on the issue asking them to take a look! In the absence of a dedicated review dashboard, most maintainers manage their pending reviews through email, and those can sometimes get lost in their inbox.


@modular-magician (Collaborator)

Hi! I'm the modular magician. Your PR generated some diffs in downstreams - here they are.

Diff report:

Terraform GA: Diff (5 files changed, 450 insertions(+), 7 deletions(-))
Terraform Beta: Diff (5 files changed, 450 insertions(+), 7 deletions(-))
TF Validator: Diff (3 files changed, 55 insertions(+), 3 deletions(-))
TF OiCS: Diff (4 files changed, 136 insertions(+))

@modular-magician (Collaborator)

Tests analytics

Total tests: 2216
Passed tests: 1968
Skipped tests: 240
Failed tests: 8

Action taken

Triggering VCR tests in RECORDING mode for the tests that failed during VCR. Failed tests:
TestAccFirebaserulesRelease_BasicRelease|TestAccComputeInstance_soleTenantNodeAffinities|TestAccCGCSnippet_eventarcWorkflowsExample|TestAccBigQueryDataset_bigqueryDatasetAuthorizedRoutineExample|TestAccBigQueryDataset_bigqueryDatasetAuthorizedDatasetExample|TestAccBigQueryDataset_bigqueryDatasetBasicExample|TestAccBigQueryDataset_access|TestAccBigQueryDataset_basic

@modular-magician (Collaborator)

Tests passed during RECORDING mode:
TestAccFirebaserulesRelease_BasicRelease[Debug log]
TestAccBigQueryDataset_bigqueryDatasetAuthorizedDatasetExample[Debug log]
TestAccBigQueryDataset_bigqueryDatasetBasicExample[Debug log]
TestAccBigQueryDataset_access[Debug log]
TestAccBigQueryDataset_basic[Debug log]

Tests failed during RECORDING mode:
TestAccComputeInstance_soleTenantNodeAffinities[Error message] [Debug log]
TestAccCGCSnippet_eventarcWorkflowsExample[Error message] [Debug log]
TestAccBigQueryDataset_bigqueryDatasetAuthorizedRoutineExample[Error message] [Debug log]

Please fix these to complete your PR
View the build log or the debug log for each test

@Tei1988 (Contributor, Author) commented Oct 20, 2022

@melinath Hi, would you mind telling me what I should do next?

@melinath (Member) left a comment:

Thanks for checking in & apologies for the delayed review. It looks like one of the tests is currently failing with the following error:

    provider_test.go:315: Step 1/2 error: After applying this test step and performing a `terraform refresh`, the plan was not empty.
        stdout
        
        
        Terraform used the selected providers to generate the following execution
        plan. Resource actions are indicated with the following symbols:
          ~ update in-place
        
        Terraform will perform the following actions:
        
          # google_bigquery_dataset.private will be updated in-place
          ~ resource "google_bigquery_dataset" "private" {
                id                              = "projects/ci-project-id/datasets/privatefcsorz706l"
                # (12 unchanged attributes hidden)
        
              - access {
        
                  - routine {
                      - dataset_id = "publicfcsorz706l" -> null
                      - project_id = "ci-project-id" -> null
                      - routine_id = "sample_routine" -> null
                    }
                }
              - access {
                  - role          = "OWNER" -> null
                  - user_by_email = "[email protected]" -> null
                }
              - access {
                  - role          = "OWNER" -> null
                  - special_group = "projectOwners" -> null
                }
              - access {
                  - role          = "READER" -> null
                  - special_group = "projectReaders" -> null
                }
              - access {
                  - role          = "WRITER" -> null
                  - special_group = "projectWriters" -> null
                }
              + access {
        
                  + routine {
                      + dataset_id = "publicfcsorz706l"
                      + project_id = "ci-project-id"
                      + routine_id = "sample_routine"
                    }
                }
            }
        
        Plan: 0 to add, 1 to change, 0 to destroy.

This means there's a permadiff - the change isn't being persisted properly in the API. Looking into the debug logs, I see the following request / response cycle:

---[ REQUEST ]---------------------------------------
POST /bigquery/v2/projects/ci-project-id/datasets?alt=json HTTP/1.1
Host: bigquery.googleapis.com
User-Agent: Terraform/1.1.8 (+https://www.terraform.io) Terraform-Plugin-SDK/2.10.1 terraform-provider-google-beta/acc
Content-Length: 233
Content-Type: application/json
Accept-Encoding: gzip

{
 "access": [
  {
   "routine": {
    "datasetId": "publicfcsorz706l",
    "projectId": "ci-project-id",
    "routineId": "sample_routine"
   }
  }
 ],
 "datasetReference": {
  "datasetId": "privatefcsorz706l"
 },
 "description": "This dataset is private",
 "location": "US"
}

-----------------------------------------------------
2022/10/11 11:51:08 [DEBUG] Google API Response Details:
---[ RESPONSE ]--------------------------------------
HTTP/2.0 200 OK
Cache-Control: private
Content-Type: application/json; charset=UTF-8
Date: Tue, 11 Oct 2022 11:51:08 GMT
Etag: pn0YMy4Q9HkSyI6NrE2O0g==
Server: ESF
Vary: Origin
Vary: X-Origin
Vary: Referer
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-Xss-Protection: 0

{
  "kind": "bigquery#dataset",
  "etag": "pn0YMy4Q9HkSyI6NrE2O0g==",
  "id": "ci-project-id:privatefcsorz706l",
  "selfLink": "https://bigquery.googleapis.com/bigquery/v2/projects/ci-project-id/datasets/privatefcsorz706l",
  "datasetReference": {
    "datasetId": "privatefcsorz706l",
    "projectId": "ci-project-id"
  },
  "description": "This dataset is private",
  "access": [
    {
      "role": "WRITER",
      "specialGroup": "projectWriters"
    },
    {
      "role": "OWNER",
      "specialGroup": "projectOwners"
    },
    {
      "role": "OWNER",
      "userByEmail": "[email protected]"
    },
    {
      "role": "READER",
      "specialGroup": "projectReaders"
    },
    {
      "routine": {
        "projectId": "ci-project-id",
        "datasetId": "publicfcsorz706l",
        "routineId": "sample_routine"
      }
    }
  ],
  "creationTime": "1665489068549",
  "lastModifiedTime": "1665489068549",
  "location": "US",
  "type": "DEFAULT"
}

It looks like the API is adding a bunch of access entries that are not present in the request object.

This seems to correspond to hashicorp/terraform-provider-google#8370

I think that, as a workaround, you could specify a legacy role on the resource - I've added annotations on how I think you could do this below.

If the workaround doesn't work - one option would be to try to resolve the permadiff ticket in a separate PR first. Otherwise I'll have to think about what you could do.
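
As a rough, hypothetical sketch of that kind of workaround (the actual annotation is in the resolved thread below): declare a legacy-role access entry alongside the routine entry, so the config does not consist of the routine alone. The service account and names here are illustrative, mirroring the bqowner pattern used by the existing BigQuery examples:

resource "google_service_account" "bqowner" {
  account_id = "bqowner"
}

resource "google_bigquery_dataset" "private" {
  dataset_id = "private_dataset" # illustrative

  # Explicit legacy-role entry in addition to the authorized routine.
  access {
    role          = "OWNER"
    user_by_email = google_service_account.bqowner.email
  }

  access {
    routine {
      project_id = google_bigquery_routine.public.project
      dataset_id = google_bigquery_routine.public.dataset_id
      routine_id = google_bigquery_routine.public.routine_id
    }
  }
}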

(Review thread on mmv1/products/bigquery/terraform.yaml: outdated, resolved)
Tei1988 and others added 2 commits October 21, 2022 12:43
@modular-magician (Collaborator)

Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report

Your PR generated some diffs in downstreams - here they are.

Terraform GA: Diff (5 files changed, 452 insertions(+), 7 deletions(-))
Terraform Beta: Diff (5 files changed, 452 insertions(+), 7 deletions(-))
TF Validator: Diff (3 files changed, 55 insertions(+), 3 deletions(-))

@modular-magician (Collaborator)

Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report

Your PR generated some diffs in downstreams - here they are.

Terraform GA: Diff (5 files changed, 476 insertions(+), 7 deletions(-))
Terraform Beta: Diff (5 files changed, 476 insertions(+), 7 deletions(-))
TF Validator: Diff (3 files changed, 55 insertions(+), 3 deletions(-))

@Tei1988 (Contributor, Author) commented Oct 21, 2022

@melinath Thank you for your review and suggestions; they were very helpful.

I added legacy roles to the test case. I hope it runs well now.

@modular-magician (Collaborator)

Tests analytics

Total tests: 2201
Passed tests: 1960
Skipped tests: 239
Failed tests: 2

Action taken

Triggering VCR tests in RECORDING mode for the tests that failed during VCR. Failed tests:
TestAccBigQueryDataset_bigqueryDatasetAuthorizedRoutineExample|TestAccBigQueryDataset_bigqueryDatasetWithMaxTimeTravelHoursExample

@modular-magician (Collaborator)

Tests passed during RECORDING mode:
TestAccBigQueryDataset_bigqueryDatasetWithMaxTimeTravelHoursExample[Debug log]

Tests failed during RECORDING mode:
TestAccBigQueryDataset_bigqueryDatasetAuthorizedRoutineExample[Error message] [Debug log]

Please fix these to complete your PR
View the build log or the debug log for each test

description = "This dataset is private"
access {
  role = "WRITER"
  specialGroup = "projectWriters"
Review comment (Member):

This would need to be special_group to match the Terraform field name - but I'd prefer not including these special_group blocks at all. If we just match what the API returns, we won't be able to tell whether it's acting on what we send or ignoring it and merely happening to match what's in the config right now.

@Tei1988 (Contributor, Author) replied Oct 24, 2022:

@melinath Thank you for your comment, and sorry for misunderstanding your advice. I agree with you, so I removed those blocks.

On the other hand, I'm still worried about the failing test.

@modular-magician (Collaborator)

Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report

Your PR generated some diffs in downstreams - here they are.

Terraform GA: Diff (5 files changed, 452 insertions(+), 7 deletions(-))
Terraform Beta: Diff (5 files changed, 452 insertions(+), 7 deletions(-))
TF Validator: Diff (3 files changed, 55 insertions(+), 3 deletions(-))

@modular-magician (Collaborator)

Tests analytics

Total tests: 2204
Passed tests: 1963
Skipped tests: 239
Failed tests: 2

Action taken

Triggering VCR tests in RECORDING mode for the tests that failed during VCR. Failed tests:
TestAccFirebaserulesRelease_BasicRelease|TestAccBigQueryDataset_bigqueryDatasetAuthorizedRoutineExample

@modular-magician (Collaborator)

Tests passed during RECORDING mode:
TestAccFirebaserulesRelease_BasicRelease[Debug log]

Tests failed during RECORDING mode:
TestAccBigQueryDataset_bigqueryDatasetAuthorizedRoutineExample[Error message] [Debug log]

Please fix these to complete your PR
View the build log or the debug log for each test

name: "bigquery_dataset_authorized_routine"
primary_resource_id: "dataset"
vars:
  private: "private"
Review comment (Member):

These need a hyphen so that they automatically get prefixed with tf-test (so that they can be easily cleaned up after the tests run, if something goes wrong).

Suggested change:
-  private: "private"
+  private: "private-dataset"
+  public: "public-dataset"


resource "google_bigquery_routine" "public" {
  dataset_id = google_bigquery_dataset.public.dataset_id
  routine_id = "sample_routine"
Review comment (Member):

Is this automatically deleted if the dataset is deleted? And/or is there any issue with quotas for this resource, or name conflicts if parallel test runs happen at the same time? Otherwise it might be worth making this use a var as well.
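
For illustration, a sketch of the routine with its name pulled from a test var rather than hard-coded. The %{public_routine} placeholder borrows the context-key name used in the handwritten test later in this thread (a generated example template would use its own var substitution), and the routine type and body values are placeholders rather than the PR's actual example:

resource "google_bigquery_routine" "public" {
  dataset_id      = google_bigquery_dataset.public.dataset_id
  routine_id      = "%{public_routine}" # var, so parallel runs don't collide on the name
  routine_type    = "SCALAR_FUNCTION"   # placeholder
  language        = "SQL"
  definition_body = "1"                 # placeholder body
}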

skip_test: true # not importable
primary_resource_id: "access"
vars:
  private: "private"
Review comment (Member):

same here

@@ -41,6 +41,14 @@ overrides: !ruby/object:Overrides::ResourceOverrides
        private: "private"
        public: "public"
        account_name: "bqowner"
    - !ruby/object:Provider::Terraform::Examples
      name: "bigquery_dataset_authorized_routine"
      primary_resource_id: "dataset"
Review comment (Member):

This needs to match the second part of a dataset resource's terraform id. For example, for:

resource "google_bigquery_dataset" "public" {
    // args
}

the resource address is google_bigquery_dataset.public, so primary_resource_id would need to be public.

The "primary resource" will be imported as part of the test. In this case, the important thing is to make sure that imports work correctly for the routine field (which is on the private dataset)

Suggested change:
-  primary_resource_id: "dataset"
+  primary_resource_id: "private"

- !ruby/object:Provider::Terraform::Examples
  name: "bigquery_dataset_access_authorized_routine"
  skip_test: true # not importable
  primary_resource_id: "access"
Review comment (Member):

In this example, the routine is actually set on google_bigquery_dataset_access.authorized_routine, which is not the expected resource type, so you'll need to set that explicitly as well.

Suggested change:
-  primary_resource_id: "access"
+  primary_resource_type: "google_bigquery_dataset_access"
+  primary_resource_id: "authorized_routine"

@modular-magician (Collaborator)

Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report

Your PR generated some diffs in downstreams - here they are.

Terraform GA: Diff (5 files changed, 452 insertions(+), 7 deletions(-))
Terraform Beta: Diff (5 files changed, 452 insertions(+), 7 deletions(-))
TF Validator: Diff (3 files changed, 55 insertions(+), 3 deletions(-))

@modular-magician (Collaborator)

Tests analytics

Total tests: 2208
Passed tests: 1968
Skipped tests: 239
Failed tests: 1

Action taken

Triggering VCR tests in RECORDING mode for the tests that failed during VCR. Failed tests:
TestAccBigQueryDataset_bigqueryDatasetAuthorizedRoutineExample

@modular-magician (Collaborator)

Tests failed during RECORDING mode:
TestAccBigQueryDataset_bigqueryDatasetAuthorizedRoutineExample[Error message] [Debug log]

Please fix these to complete your PR
View the build log or the debug log for each test

@modular-magician (Collaborator)

Tests passed during RECORDING mode:
TestAccBigQueryDataset_bigqueryDatasetAuthorizedRoutineExample[Debug log]

All tests passed
View the build log or the debug log for each test

@melinath (Member) left a comment:

Congrats on getting the test passing! This is looking really great. I'm sorry for just noticing this, but it looks like the dataset access tests are not being generated because dataset access is not importable. This means that you'll need to add some handwritten tests for the routine field on the dataset_access resource.

Existing tests for that resource are at https://github.com/GoogleCloudPlatform/magic-modules/blob/main/mmv1/third_party/terraform/tests/resource_bigquery_dataset_access_test.go

The easiest way to do this might be to uncomment the "skip test" line, generate the test, then copy it over to the other file and remove the import lines (since it's not importable).

@modular-magician (Collaborator)

Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report

Your PR generated some diffs in downstreams - here they are.

Terraform GA: Diff (7 files changed, 534 insertions(+), 7 deletions(-))
Terraform Beta: Diff (7 files changed, 534 insertions(+), 7 deletions(-))
TF Validator: Diff (4 files changed, 59 insertions(+), 3 deletions(-))

@modular-magician (Collaborator)

Tests analytics

Total tests: 0
Passed tests: 0
Skipped tests: 0
Failed tests: 0

Errors occurred during REPLAYING mode. Please fix them to complete your PR
View the build log

@melinath (Member) left a comment:

This doesn't quite fully fix it, but it should be a start.

vcrTest(t, resource.TestCase{
	PreCheck: func() { testAccPreCheck(t) },
	Providers: testAccProviders,
	CheckDestroy: testAccCheckBigQueryDatasetAccessDestroyProducer(t),
Review comment (Member):

It looks like this is causing the tests to fail to compile:

google/resource_bigquery_dataset_access_test.go:126:17: undefined: testAccCheckBigQueryDatasetAccessDestroyProducer

It looks like the other tests skip having a CheckDestroy and handle this in another way.

Suggested change:
-  CheckDestroy: testAccCheckBigQueryDatasetAccessDestroyProducer(t),

			Check: testAccCheckBigQueryDatasetAccessPresent(t, "google_bigquery_dataset.private_dataset", expected),
		},
		{
			Config: testAccBigQueryDatasetAccess_destroy(context["private_dataset"], "private_dataset"),
Review comment (Member):

It looks like this is causing the tests to fail to compile:

google/resource_bigquery_dataset_access_test.go:133:50: cannot use context["private_dataset"] (map index expression of type interface{}) as type string in argument to testAccBigQueryDatasetAccess_destroy:

It looks like the simplest fix is to add a type assertion here.

Suggested change:
-  Config: testAccBigQueryDatasetAccess_destroy(context["private_dataset"], "private_dataset"),
+  // Destroy step instead of CheckDestroy so we can check the access is removed without deleting the dataset
+  Config: testAccBigQueryDatasetAccess_destroy(context["private_dataset"].(string), "private"),

@@ -100,6 +100,43 @@ func TestAccBigQueryDatasetAccess_authorizedDataset(t *testing.T) {
	})
}

func TestAccBigQueryDatasetAccess_authorizedRoutine(t *testing.T) {
	t.Parallel()
Review comment (Member):

It looks like these can't run in VCR based on the surrounding tests.

Suggested change:
-  t.Parallel()
+  // Multiple fine-grained resources
+  skipIfVcr(t)
+  t.Parallel()

	Steps: []resource.TestStep{
		{
			Config: testAccBigQueryDatasetAccess_authorizedRoutine(context),
			Check: testAccCheckBigQueryDatasetAccessPresent(t, "google_bigquery_dataset.private_dataset", expected),
Review comment (Member):

Suggested change:
-  Check: testAccCheckBigQueryDatasetAccessPresent(t, "google_bigquery_dataset.private_dataset", expected),
+  Check: testAccCheckBigQueryDatasetAccessPresent(t, "google_bigquery_dataset.private", expected),

Review comment (Member):

needs to match what's in the config

		},
		{
			Config: testAccBigQueryDatasetAccess_destroy(context["private_dataset"], "private_dataset"),
			Check: testAccCheckBigQueryDatasetAccessAbsent(t, "google_bigquery_dataset.private_dataset", expected),
Review comment (Member):

Suggested change:
-  Check: testAccCheckBigQueryDatasetAccessAbsent(t, "google_bigquery_dataset.private_dataset", expected),
+  Check: testAccCheckBigQueryDatasetAccessAbsent(t, "google_bigquery_dataset.private", expected),

}

	expected := map[string]interface{}{
		"dataset": map[string]interface{}{
Review comment (Member):

Okay, I think this plus the above changes should do it. Basically, the nesting inside dataset here and the targetTypes field below seem to be left over from a different test. For this test, the expected structure is:


	expected := map[string]interface{}{
		"routine": map[string]interface{}{
			"projectId": getTestProjectFromEnv(),
			"datasetId": context["public_dataset"],
			"routineId": context["public_routine"],
		},
	}

@modular-magician (Collaborator)

Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report

Your PR generated some diffs in downstreams - here they are.

Terraform GA: Diff (7 files changed, 533 insertions(+), 7 deletions(-))
Terraform Beta: Diff (7 files changed, 533 insertions(+), 7 deletions(-))
TF Validator: Diff (4 files changed, 59 insertions(+), 3 deletions(-))

@modular-magician (Collaborator)

Tests analytics

Total tests: 2218
Passed tests: 1977
Skipped tests: 240
Failed tests: 1

Action taken

Triggering VCR tests in RECORDING mode for the tests that failed during VCR. Failed tests:
TestAccFirebaserulesRelease_BasicRelease

@modular-magician (Collaborator)

Tests passed during RECORDING mode:
TestAccFirebaserulesRelease_BasicRelease[Debug log]

All tests passed
View the build log or the debug log for each test

@melinath (Member) commented Nov 7, 2022

Manually running tests out of caution / because one is skipIfVcr:

TestAccBigQueryDatasetAccess_authorizedRoutine|TestAccBigQueryDataset_bigqueryDatasetAuthorizedRoutineExample

https://ci-oss.hashicorp.engineering/buildConfiguration/GoogleCloudBeta_ProviderGoogleCloudBetaMmUpstream/351167

@modular-magician (Collaborator)

Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report

Your PR generated some diffs in downstreams - here they are.

Terraform GA: Diff (7 files changed, 533 insertions(+), 7 deletions(-))
Terraform Beta: Diff (7 files changed, 533 insertions(+), 7 deletions(-))
TF Validator: Diff (4 files changed, 59 insertions(+), 3 deletions(-))

@modular-magician (Collaborator)

Tests analytics

Total tests: 2224
Passed tests: 1983
Skipped tests: 241
Failed tests: 0

All tests passed in REPLAYING mode
View the build log

@melinath (Member) left a comment on Nov 8, 2022:

LGTM - thanks for your work on this!

@Tei1988 (Contributor, Author) commented Nov 9, 2022

@melinath Thank you for your long-term support. 🥇

@Tei1988 Tei1988 deleted the add-bq-authorized-routine branch November 9, 2022 00:13
hao-nan-li pushed a commit to hao-nan-li/magic-modules that referenced this pull request Dec 6, 2022
* BigQuery: Added support for authorized routine

* BigQuery: Added examples for authorized routine

* Update mmv1/products/bigquery/terraform.yaml

Co-authored-by: Stephen Lewis (Burrows) <[email protected]>

* Update mmv1/templates/terraform/examples/bigquery_dataset_authorized_routine.tf.erb

Co-authored-by: Stephen Lewis (Burrows) <[email protected]>

* add legacy iam access to test case

* remove legacy roles

* correct variable names and values

* replace - to _

* [wip]added hand written test

* fix corrupted test

* Update mmv1/third_party/terraform/tests/resource_bigquery_dataset_access_test.go

Co-authored-by: Stephen Lewis (Burrows) <[email protected]>

Co-authored-by: Stephen Lewis (Burrows) <[email protected]>
@ScottSuarez (Contributor) commented Aug 13, 2024

closes #6680
