
log_analytics_workspace: support for the daily_quota_gb property #8861

Merged
merged 7 commits into from
Oct 22, 2020

Conversation

Lucretius
Contributor

Resolves #3288

The daily volume cap is off by default as per the docs, so I've set the default value to -1, which means unlimited (no cap). Validation accepts either -1 or a non-negative value.

One new test per resource/data source added, and one old test updated to confirm default value.
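The validation rule described above (either -1 for unlimited, or a non-negative number) can be sketched as a plain Go function mirroring the shape of the SDK's `ValidateFunc`. This is an illustrative sketch, not the PR's actual code; the function name is hypothetical:

```go
package main

import "fmt"

// validateDailyQuotaGb is an illustrative stand-in for the schema's
// ValidateFunc: the value must be -1 (unlimited) or non-negative.
func validateDailyQuotaGb(i interface{}, k string) ([]string, []error) {
	v, ok := i.(float64)
	if !ok {
		return nil, []error{fmt.Errorf("expected type of %q to be float64", k)}
	}
	if v != -1 && v < 0 {
		return nil, []error{fmt.Errorf("%q must be -1 (unlimited) or a non-negative number, got %v", k, v)}
	}
	return nil, nil
}

func main() {
	for _, v := range []float64{-1, 0, 4.5, -2} {
		_, errs := validateDailyQuotaGb(v, "daily_quota_gb")
		fmt.Printf("%v valid: %t\n", v, len(errs) == 0)
	}
}
```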

@Lucretius Lucretius changed the title added property, docs, tests log_analytics_workspace: enable daily volume cap setting Oct 13, 2020
Collaborator

@magodo magodo left a comment


Thanks for this PR!
I have left some additional, mostly minor comments; once those are addressed this should be good to merge 👍

@@ -146,6 +146,27 @@ func TestAccAzureRMLogAnalyticsWorkspace_withDefaultSku(t *testing.T) {
Config: testAccAzureRMLogAnalyticsWorkspace_withDefaultSku(data),
Check: resource.ComposeTestCheckFunc(
testCheckAzureRMLogAnalyticsWorkspaceExists(data.ResourceName),
resource.TestCheckResourceAttr(data.ResourceName, "daily_quota_gb", "-1"),
Collaborator

This check is redundant here as it will be checked in the data.ImportStep().

Config: testAccAzureRMLogAnalyticsWorkspace_withVolumeCap(data, 4.5),
Check: resource.ComposeTestCheckFunc(
testCheckAzureRMLogAnalyticsWorkspaceExists(data.ResourceName),
resource.TestCheckResourceAttr(data.ResourceName, "daily_quota_gb", "4.5"),
Collaborator

This check is redundant here as it will be checked in the data.ImportStep().

@@ -43,6 +43,8 @@ The following arguments are supported:

* `retention_in_days` - (Optional) The workspace data retention in days. Possible values are either 7 (Free Tier only) or range between 30 and 730.

* `daily_quota_gb` - (Optional) The workspace of the volume cap in gb. Defaults to -1 (unlimited).
Collaborator

Can we adjust the statement a bit?

Suggested change
* `daily_quota_gb` - (Optional) The workspace of the volume cap in gb. Defaults to -1 (unlimited).
* `daily_quota_gb` - The workspace daily quota for ingestion in GB. Defaults to -1 (unlimited).

Contributor Author

Do we not want the (Optional) in the resource?

Collaborator

we do

@@ -46,6 +46,8 @@ The following attributes are exported:

* `retention_in_days` - The workspace data retention in days.

* `daily_quota_gb` - The workspace of the volume cap in gb. Defaults to -1 (unlimited).
Collaborator

Can we adjust the statement a bit? And there's no need to mention the default value for the data source.

Suggested change
* `daily_quota_gb` - The workspace of the volume cap in gb. Defaults to -1 (unlimited).
* `daily_quota_gb` - The workspace daily quota for ingestion in GB.

@Lucretius
Contributor Author

Thanks for the response @magodo. If I understand correctly then, the data.ImportStep() checks that every attribute for a resource is set to the values defined in the test terraform snippets which should be written to state? Is there any reason to ever use TestCheckResourceAttr on individual attributes instead of just using data.ImportStep()? I have a few other PRs I will want to go back and alter if a single data.ImportStep() is the desired way to verify attributes match terraform test snippets.

Member

@jackofallops jackofallops left a comment


Hi @Lucretius - I was running through a review, but I see you're updating while I'm part done, so I'll leave this partial review here and come back later :)

resource_group_name = azurerm_resource_group.test.name
sku = "PerGB2018"
retention_in_days = 30
daily_quota_gb = %d
Member

Incorrect type here for float:

Suggested change
daily_quota_gb = %d
daily_quota_gb = %f

@@ -130,6 +137,7 @@ func resourceArmLogAnalyticsWorkspaceCreateUpdate(d *schema.ResourceData, meta i
}

retentionInDays := int32(d.Get("retention_in_days").(int))
dailyQuotaGb := float64(d.Get("daily_quota_gb").(float64))
Member

I don't think you need the cast here as you're asserting a float64, rather than converting between types?
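For context, a type assertion on an `interface{}` (which is what `d.Get` returns) already yields a `float64`, so wrapping it in a `float64(...)` conversion is a no-op. A minimal sketch, with illustrative variable names:

```go
package main

import "fmt"

func main() {
	// d.Get returns interface{}; modeled here with a plain interface value.
	var raw interface{} = 4.5

	// The assertion alone already produces a float64 -- no conversion needed.
	quota := raw.(float64)

	// This compiles too, but the outer conversion does nothing extra.
	same := float64(raw.(float64))

	fmt.Println(quota == same)
}
```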

@magodo
Collaborator

magodo commented Oct 14, 2020

@Lucretius iirc the only time you will need to use TestCheckResourceAttr/TestCheckResourceAttrSet is when the attributes you are checking are computed, e.g. when checking against a data source.

@Lucretius
Contributor Author

I think I just need to kick the tests off for this one again, as it appears one of the linter tools errored while downloading rather than failing on an individual step. @jackofallops you are free to continue your review, I fixed the two things you pointed out. Thanks for bearing with me as I refresh my Go :)

@ghost ghost removed the waiting-response label Oct 20, 2020
Collaborator

@katbyte katbyte left a comment


Thanks @Lucretius - this LGTM 👍

@katbyte katbyte added this to the v2.34.0 milestone Oct 22, 2020
@katbyte katbyte changed the title log_analytics_workspace: enable daily volume cap setting log_analytics_workspace: support for the daily_quota_gb property Oct 22, 2020
@katbyte katbyte merged commit 539b9e3 into hashicorp:master Oct 22, 2020
katbyte added a commit that referenced this pull request Oct 22, 2020
@ghost

ghost commented Oct 29, 2020

This has been released in version 2.34.0 of the provider. Please see the Terraform documentation on provider versioning or reach out if you need any assistance upgrading. As an example:

provider "azurerm" {
    version = "~> 2.34.0"
}
# ... other configuration ...

@ghost

ghost commented Nov 22, 2020

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.

If you feel this issue should be reopened, we encourage creating a new issue linking back to this one for added context. If you feel I made an error 🤖 🙉 , please reach out to my human friends 👉 [email protected]. Thanks!

@ghost ghost locked as resolved and limited conversation to collaborators Nov 22, 2020