[ENHANCEMENT] Global Inference for Bedrock models supported #8750

@ronyblum

Description

Problem (one or two sentences)

Support Bedrock's Global inference feature, which automatically routes requests to the optimal AWS Region for supported models, helping optimize resource utilization and increase throughput.

Context (who is affected and when)

Some Bedrock models support Global inference. Those models frequently have higher quotas and provide additional capacity. With global inference profiles, Amazon Bedrock automatically selects the optimal commercial AWS Region to process the request, which optimizes available resources and increases model throughput. Supported models include Claude 4, Claude 4.5, and Haiku 4.5.
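Mechanically, a global inference profile is selected by prepending `global.` to the base model ID, analogous to the geo prefixes (`us.`, `eu.`) used by cross-region inference. A minimal sketch of that mapping follows; the helper names and the model IDs in the set are illustrative assumptions, not Roo Code's actual implementation:

```typescript
// Sketch: derive a global inference profile ID from a Bedrock model ID,
// mirroring how cross-region inference prepends a geo prefix ("us.", "eu.").
// The model list below is illustrative, not exhaustive.
const GLOBAL_INFERENCE_MODELS: ReadonlySet<string> = new Set([
  "anthropic.claude-sonnet-4-5-20250929-v1:0", // Claude Sonnet 4.5 (assumed ID)
  "anthropic.claude-haiku-4-5-20251001-v1:0", // Claude Haiku 4.5 (assumed ID)
]);

// Known inference-profile prefixes, including the global one.
const INFERENCE_PREFIXES = ["us.", "eu.", "apac.", "global."];

/** Strip any existing inference-profile prefix to get the base model ID. */
function baseModelId(modelId: string): string {
  const prefix = INFERENCE_PREFIXES.find((p) => modelId.startsWith(p));
  return prefix ? modelId.slice(prefix.length) : modelId;
}

/**
 * If the model supports global inference, return the "global."-prefixed
 * inference profile ID; otherwise return the model ID unchanged.
 */
function toGlobalInferenceProfileId(modelId: string): string {
  const base = baseModelId(modelId);
  return GLOBAL_INFERENCE_MODELS.has(base) ? `global.${base}` : modelId;
}
```

Because unsupported models pass through unchanged, the same code path can serve all Bedrock models without a separate branch.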

Desired behavior (conceptual, not technical)

A checkbox under the Bedrock API provider settings, shown for models that support Global inference. The experience should be similar to the existing "use cross-region inference" option.
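One way the new checkbox could relate to the existing cross-region option is a simple precedence rule when choosing the model-ID prefix. The sketch below assumes hypothetical field names (`awsUseGlobalInference`, `awsUseCrossRegionInference`) modeled on the existing setting; none of these identifiers are confirmed from the codebase:

```typescript
// Hypothetical extension of the Bedrock provider settings; field names
// are assumptions modeled on the existing cross-region inference option.
interface BedrockProviderSettings {
  awsRegion: string;
  awsUseCrossRegionInference?: boolean;
  awsUseGlobalInference?: boolean; // the new checkbox proposed here
}

/** Map an AWS Region to its cross-region inference-profile prefix. */
function regionToPrefix(region: string): string {
  if (region.startsWith("us-")) return "us.";
  if (region.startsWith("eu-")) return "eu.";
  if (region.startsWith("ap-")) return "apac.";
  return "";
}

/**
 * Precedence sketch: global inference, when checked and supported by the
 * model, takes priority over cross-region inference for the prefix.
 */
function resolveModelIdPrefix(
  settings: BedrockProviderSettings,
  modelSupportsGlobal: boolean,
): string {
  if (settings.awsUseGlobalInference && modelSupportsGlobal) return "global.";
  if (settings.awsUseCrossRegionInference) return regionToPrefix(settings.awsRegion);
  return "";
}
```

Keeping both checkboxes independent, with global inference winning only when the model supports it, lets users leave cross-region inference enabled as a fallback.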

Constraints / preferences (optional)

No response

Request checklist

  • I've searched existing Issues and Discussions for duplicates
  • This describes a specific problem with clear context and impact

Roo Code Task Links (optional)

No response

Acceptance criteria (optional)

No response

Proposed approach (optional)

No response

Trade-offs / risks (optional)

No response

Metadata

Labels: Issue/PR - Triage (new issue; needs quick review to confirm validity and assign labels), enhancement (new feature or request)
Status: Triage
Assignees: none
Milestone: none