cli: #21513 breaks custom bootstrap use case #21965

Closed
rabbittsoup opened this issue Sep 8, 2022 · 18 comments · Fixed by #22930
Labels: bug (This issue is a bug.) · needs-reproduction (This issue needs reproduction.) · p1 · package/tools (Related to AWS CDK Tools or CLI)

Comments

@rabbittsoup (Contributor) commented Sep 8, 2022

Describe the bug

PR #21513 introduced asset prebuilding, which presumes that the bootstrap resources were deployed before, and outside the scope of, the currently deploying CDK project. However, restrictions on our accounts prevent us from deploying the account-wide default `cdk bootstrap` resources. Instead, we deploy project-specific custom bootstrap resources within the CDK project itself: the custom bootstrap stack is deployed immediately before the project's main stacks, which lets us bootstrap an individual project while meeting our account requirements. In this scenario the bootstrap resources are not available before the initial project deployment, so any project that includes assets now fails with "Error: Building Assets Failed", even though the bootstrap resources would be available by the time the assets are actually needed.
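
Roughly, the pattern looks like the pared-down sketch below. The stack and construct names are illustrative only; the attached app.py.txt is the actual reproduction.

```python
# Pared-down sketch of the pattern (illustrative names only; see app.py.txt for
# the actual reproduction). The project-specific bootstrap stack is part of the
# same app and is deployed immediately before the stacks that need its resources.
import aws_cdk as cdk
from aws_cdk import aws_s3 as s3
from constructs import Construct


class ProjectBootstrapStack(cdk.Stack):
    """Project-scoped replacement for the account-wide `cdk bootstrap` resources."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # e.g. a project-scoped asset bucket instead of the shared staging bucket
        self.asset_bucket = s3.Bucket(self, "ProjectAssets")


class MainStack(cdk.Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # ...constructs that reference file or Docker assets...


app = cdk.App()
bootstrap = ProjectBootstrapStack(app, "ProjectBootstrap")
main = MainStack(app, "Main")
# The dependency guarantees the bootstrap resources exist before MainStack
# (and its assets) deploy; the CLI's asset prebuild, however, now runs before either stack.
main.add_dependency(bootstrap)
app.synth()
```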

Expected Behavior

Bootstrap resources get used just-in-time.

Current Behavior

Bootstrap resources get used before project deployment.

Reproduction Steps

app.py.txt

Possible Solution

Roll back PR #21513?

Additional Information/Context

No response

CDK CLI Version

2.39.0

Framework Version

2.39.0

Node.js Version

14.16.0

OS

Amazon Linux 2

Language

Python

Language Version

3.8.13

Other information

No response

rabbittsoup added the bug and needs-triage labels on Sep 8, 2022
The github-actions bot added the package/tools label on Sep 8, 2022
@mrgrain (Contributor) commented Sep 8, 2022

Hey @rabbittsoup, thanks for reporting this. Interestingly enough, your exact use case came up in the discussion of the PR you mentioned (#21513 (comment)), so I'm mildly surprised that we may have accidentally made it harder for you.

To help you out, we'd really need a (minimal) reproduction example. Would you be able to provide this?

@rix0rrr (Contributor) commented Sep 8, 2022

Can you explain how this change breaks you? Only building has been pulled to the front, which shouldn't depend on the bootstrap resources being present.

Alternatively, you could start fresh deployments with:

cdk deploy -e MyCustomBootstrapStack

@rabbittsoup (Contributor, Author) commented Sep 8, 2022

@rix0rrr Looking at my deployment logs, I see that publishing occurs just before the individual stack. However, the failure I encountered appears to be because the prebuild is still looking for the bootstrap version parameter.

Building assets failed: Error: Building Assets Failed: Error: Stack: SSM parameter /Stack-cdk/version not found. Has the environment been bootstrapped? Please run 'cdk bootstrap' (see https://docs.aws.amazon.com/cdk/latest/guide/bootstrapping.html)
    at buildAllStackAssets (/app/cdk/node_modules/aws-cdk/lib/build.ts:21:11)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
    at CdkToolkit.deploy (/app/cdk/node_modules/aws-cdk/lib/cdk-toolkit.ts:174:7)
    at initCommandLine (/app/cdk/node_modules/aws-cdk/lib/cli.ts:349:12)

@mrgrain I'll try to put together a smallish example showcasing the project-specific custom bootstrap pattern we use. Overall, the pattern has several advantages over the account-wide shared bootstrap resources, which can create unmanageable resource use in large accounts.

@peterwoodworth (Contributor)

Let us know when you can put together that repro, @rabbittsoup.

peterwoodworth added the p1, response-requested, and needs-reproduction labels and removed the needs-triage label on Sep 14, 2022
@rabbittsoup (Contributor, Author)

Sorry, I’m not going to get to this for a bit.

Very roughly: I'm using a custom synthesizer that extends the default synthesizer and is tightly coupled to a couple of separate "bootstrapless" stacks. Those stacks are implicitly instantiated by the extended synthesizer as stage-level siblings of the user's stack, and they implement the bootstrap resources as needed.
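
In shape it's something like the sketch below. All names are hypothetical, and the real library also wires up roles, ECR repositories, and asset locations; this only shows the coupling.

```python
# Shape of the pattern only; names are hypothetical and the real library does more
# (roles, ECR repositories, qualifiers, asset locations, ...).
import aws_cdk as cdk
from constructs import Construct


class ProjectBootstrapResources(cdk.Stack):
    """Bootstrapless stack that implements the bootstrap resources as needed."""


class ProjectSynthesizer(cdk.DefaultStackSynthesizer):
    """Extends the default synthesizer and points it at project-owned resources."""

    def __init__(self, asset_bucket_name: str, **kwargs) -> None:
        super().__init__(file_assets_bucket_name=asset_bucket_name, **kwargs)


def project_stack(scope: Construct, construct_id: str) -> cdk.Stack:
    """Implicitly instantiate the bootstrap stack as a stage-level sibling of the user's stack."""
    bootstrap = ProjectBootstrapResources(scope, "ProjectBootstrapResources")
    stack = cdk.Stack(
        scope,
        construct_id,
        synthesizer=ProjectSynthesizer(asset_bucket_name="project-assets-example"),
    )
    # Bootstrap resources deploy first, then the user's stack.
    stack.add_dependency(bootstrap)
    return stack
```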

The github-actions bot removed the response-requested label on Sep 15, 2022
@rix0rrr (Contributor) commented Sep 19, 2022

Can your custom synthesizer disable version checking on that SSM parameter?

@rabbittsoup (Contributor, Author) commented Sep 19, 2022

I could, I suppose. However, the pattern is library-provided and the end user generally doesn't need to know much about it, so the versioning could still be beneficial. I will look into that further, though. For now I'm able to temporarily work around the issue by deploying the specific bootstrap stacks first and then deploying everything, as you also suggested.

@rabbittsoup (Contributor, Author) commented Sep 21, 2022

@rix0rrr It looks like setting `generateBootstrapVersionRule` in the synthesizer does not work around the problem. Although the setting does prevent the rule from being generated in the templates, the use of assets in later stacks still generates version requirements in the asset manifest, which still causes the CLI deploy to fail with the version error.
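
For reference, what I tried was along these lines (sketch only):

```python
# Sketch of the attempted workaround: suppressing the CheckBootstrapVersion Rule.
# The Rule disappears from the synthesized template, but the asset manifest still
# records a required bootstrap stack version, so the prebuild fails with the same
# SSM-parameter error.
import aws_cdk as cdk

app = cdk.App()
stack = cdk.Stack(
    app,
    "Main",
    synthesizer=cdk.DefaultStackSynthesizer(generate_bootstrap_version_rule=False),
)
app.synth()
```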

@rabbittsoup (Contributor, Author)

I updated the original post with an example app.py.

@rix0rrr (Contributor) commented Oct 6, 2022

I gotcha. I think the pre-build doesn't need to check the parameter at all; that's only necessary for the publish.

@rix0rrr (Contributor) commented Oct 7, 2022

Oh no, of course it does, because Docker builds are crazy: we need to assume roles and check ECR repositories so that Docker can load its cache from the existing repo.

@rabbittsoup (Contributor, Author) commented Nov 4, 2022

@rix0rrr Any movement on accommodating this pattern? The issue still causes us grief.

@rix0rrr (Contributor) commented Nov 8, 2022

Sorry, dropped off my radar.

I think the simplest thing we can do is add a CLI flag to disable the prebuild. Would that suffice for you?

@rabbittsoup (Contributor, Author)

I'm not familiar enough with the prebuild changes to understand what trade-offs that would introduce. What might not work when using such a flag?

@rix0rrr (Contributor) commented Nov 9, 2022

The PR that introduced the problem moves all builds to the front of the deployment process, so that if a build fails, potentially slow deploys are avoided entirely.

It would be a flag to revert to the old behavior: do just-in-time building, interleaved in between deploys.

@rabbittsoup (Contributor, Author) commented Nov 10, 2022

Do you think it would be possible to make this a flag-style decision at runtime, maybe passed into the App instance? For instance, the top-level CDK app code (app.py or similar) could check something (in my scenario, whether the bootstrap version parameter exists) and then choose between the pre-build and interleave-build options.

I suppose similar logic could also live in a bash script, if there were a way to check for the bootstrap version parameter's existence using the CDK's credentials. I suppose that may amount to the same thing as AWS CLI commands and credentials 🤔
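
As a sketch of the kind of check I mean (the parameter name is just whatever our custom bootstrap writes, e.g. the one from the error above, and there is currently no App-level switch for the result to feed into):

```python
# Sketch of the check described above, using boto3 with the same kind of
# credentials the CLI resolves. The parameter name is whatever the custom
# bootstrap writes; nothing in the App currently consumes the result.
import boto3
from botocore.exceptions import ClientError


def bootstrap_version_parameter_exists(name: str) -> bool:
    ssm = boto3.client("ssm")
    try:
        ssm.get_parameter(Name=name)
        return True
    except ClientError as err:
        if err.response["Error"]["Code"] == "ParameterNotFound":
            return False
        raise


# e.g. choose pre-build vs. interleaved build based on the result
prebuild_ok = bootstrap_version_parameter_exists("/Stack-cdk/version")
```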

rix0rrr added a commit that referenced this issue Nov 16, 2022
If people have managed to adjust the bootstrapping process to have
bootstrap resources be created as part of normal stack deploys, the
eager asset prebuild breaks the very first deploy (since it requires
bootstrap resources to already be present).

Allow disabling the prebuild by passing `--no-asset-prebuild`.

Fixes #21965.
The mergify bot closed this as completed in #22930 on Nov 16, 2022
The mergify bot pushed a commit that referenced this issue on Nov 16, 2022

@rabbittsoup (Contributor, Author) commented Nov 22, 2022

@rix0rrr The behavior of command-line argument defaults overriding cdk.json and ~/.cdk.json appears to be getting in the way of using the newly added `assetPrebuild` option from #22930 to solve our prebuild pains. The problem introduced by #21513 is not fixed without the ability to set the option in cdk.json or ~/.cdk.json. The cdk.json option problem is detailed in issue #3573.
