Description
Problem (one or two sentences)
Support the Global inference feature to automatically route requests to the optimal AWS Region for supported models, helping optimize resource utilization and increase throughput.
Context (who is affected and when)
Some Bedrock models support Global inference. These models frequently have higher quotas and provide additional capacity. With global inference profiles, Amazon Bedrock automatically selects the optimal commercial AWS Region to process each request, which optimizes available resources and increases model throughput. Supported models include Claude 4, Claude 4.5, and Haiku 4.5.
Desired behavior (conceptual, not technical)
Add a checkbox under the Bedrock API provider for models that support Global inference. The experience should be similar to the existing "use cross-region inference" option.
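To make the request concrete, here is a minimal sketch of how the toggle could resolve a model ID, assuming global inference profiles follow the same pattern as cross-region profiles but with a `global.` prefix instead of a geo prefix (e.g. `us.`, `eu.`). The function and set names here are illustrative, not Roo Code's actual implementation, and the model IDs listed are assumed examples.

```typescript
// Hypothetical sketch: mapping a base Bedrock model ID to its global
// inference profile ID when a "use global inference" checkbox is enabled.
// Mirrors how cross-region inference prepends a geo prefix to the model ID.

// Assumed set of models with global inference profiles (illustrative IDs).
const GLOBAL_INFERENCE_MODELS = new Set<string>([
  "anthropic.claude-sonnet-4-20250514-v1:0",
  "anthropic.claude-sonnet-4-5-20250929-v1:0",
  "anthropic.claude-haiku-4-5-20251001-v1:0",
]);

function resolveInferenceProfileId(
  modelId: string,
  useGlobalInference: boolean,
): string {
  // Use the "global." inference profile when the toggle is on and the
  // model supports it; otherwise fall back to the plain model ID.
  if (useGlobalInference && GLOBAL_INFERENCE_MODELS.has(modelId)) {
    return `global.${modelId}`;
  }
  return modelId;
}
```

The UI would only render the checkbox when the selected model is in the supported set, exactly as the cross-region inference checkbox is gated today.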
Constraints / preferences (optional)
No response
Request checklist
- I've searched existing Issues and Discussions for duplicates
- This describes a specific problem with clear context and impact
Roo Code Task Links (optional)
No response
Acceptance criteria (optional)
No response
Proposed approach (optional)
No response
Trade-offs / risks (optional)
No response