feat(ovhcloud): Add support for responses API #22902
Conversation
Greptile Summary

This PR adds support for the `/v1/responses` API for OVHcloud AI Endpoints.

Key changes:

Minor issue found:

Confidence Score: 4/5
Sequence Diagram

```mermaid
sequenceDiagram
    participant Caller
    participant ProviderConfigManager
    participant OVHCloudResponsesAPIConfig
    participant OpenAIResponsesAPIConfig
    participant get_model_info
    Caller->>ProviderConfigManager: get_provider_responses_api_config(model, OVHCLOUD)
    ProviderConfigManager-->>Caller: OVHCloudResponsesAPIConfig()
    Caller->>OVHCloudResponsesAPIConfig: validate_environment(headers, model, litellm_params)
    OVHCloudResponsesAPIConfig->>OVHCloudResponsesAPIConfig: resolve api_key (params → litellm.api_key → litellm.ovhcloud_key → OVHCLOUD_API_KEY env)
    OVHCloudResponsesAPIConfig-->>Caller: headers with Authorization: Bearer <key>
    Caller->>OVHCloudResponsesAPIConfig: get_complete_url(api_base, litellm_params)
    OVHCloudResponsesAPIConfig->>OVHCloudResponsesAPIConfig: resolve api_base (param → litellm.api_base → OVHCLOUD_API_BASE env → default)
    OVHCloudResponsesAPIConfig-->>Caller: https://...ovh.net/v1/responses
    Caller->>OVHCloudResponsesAPIConfig: get_supported_openai_params(model)
    OVHCloudResponsesAPIConfig->>OpenAIResponsesAPIConfig: super().get_supported_openai_params(model)
    OpenAIResponsesAPIConfig-->>OVHCloudResponsesAPIConfig: full OpenAI Responses API param list
    OVHCloudResponsesAPIConfig->>get_model_info: get_model_info(model, "ovhcloud")
    get_model_info-->>OVHCloudResponsesAPIConfig: supports_function_calling (true/false/exception)
    OVHCloudResponsesAPIConfig->>OVHCloudResponsesAPIConfig: remove tools, tool_choice if not supported
    OVHCloudResponsesAPIConfig-->>Caller: filtered param list
```
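The resolution chains in the diagram can be sketched in standalone Python. This is purely illustrative: the function names, the placeholder default base URL, and the plain keyword arguments are stand-ins for the real config's reads of `litellm.api_key`, `litellm.ovhcloud_key`, and `litellm.api_base`, not litellm's actual helpers.

```python
import os

# Placeholder default, NOT the real OVHcloud endpoint (elided as
# "https://...ovh.net" in the diagram above).
DEFAULT_API_BASE = "https://example.ovh.net/v1"


def resolve_headers(param_key=None, litellm_api_key=None, ovhcloud_key=None):
    # Order from the diagram: params -> litellm.api_key -> litellm.ovhcloud_key
    # -> OVHCLOUD_API_KEY environment variable.
    key = (
        param_key
        or litellm_api_key
        or ovhcloud_key
        or os.environ.get("OVHCLOUD_API_KEY")
    )
    if key is None:
        raise ValueError("OVHcloud AI Endpoints API key is required")
    return {"Authorization": f"Bearer {key}"}


def resolve_url(param_base=None, litellm_api_base=None):
    # Order from the diagram: param -> litellm.api_base -> OVHCLOUD_API_BASE env
    # -> hardcoded default; the /responses route is appended at the end.
    base = (
        param_base
        or litellm_api_base
        or os.environ.get("OVHCLOUD_API_BASE")
        or DEFAULT_API_BASE
    )
    return base.rstrip("/") + "/responses"


def filter_params(supported, supports_function_calling):
    # Drop tools / tool_choice when get_model_info says the model
    # lacks function calling.
    if supports_function_calling:
        return supported
    return [p for p in supported if p not in ("tools", "tool_choice")]
```

The key point the diagram encodes is that an explicitly passed key always wins, and the environment variable is only the last resort.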
Last reviewed commit: e3eed6e
tests/test_litellm/llms/ovhcloud/responses/test_ovhcloud_responses_transformation.py
Force-pushed from 8ec1f3f to b7b5051
@greptileai Do a review of my PR please
| ) | ||
|
|
||
| assert "Authorization" in result | ||
| assert result["Authorization"] == "Bearer test-api-key-123" | ||
|
|
||
| def test_validate_environment_missing_api_key(self): | ||
| """Test that validate_environment raises error when API key is missing""" | ||
| config = OVHCloudResponsesAPIConfig() | ||
|
|
||
| headers = {} | ||
|
|
||
| with pytest.raises(ValueError, match="OVHcloud AI Endpoints API key is required"): | ||
| config.validate_environment( | ||
| headers=headers, | ||
| model="ovhcloud/gpt-oss-120b", | ||
| litellm_params=None |
Test may not raise due to uncleared global state

`test_validate_environment_missing_api_key` relies on `litellm.api_key`, `litellm.ovhcloud_key`, and the `OVHCLOUD_API_KEY` environment variable all being falsy. If any other test in the suite has set `litellm.api_key` (a common global that many providers fall back to), this test will silently pass through `validate_environment` without hitting the `raise ValueError`, and pytest will fail with `DID NOT RAISE`.

The fix is to mock/patch the global keys and the secret getter for the duration of this test:
```python
from unittest.mock import patch

def test_validate_environment_missing_api_key(self):
    """Test that validate_environment raises error when API key is missing"""
    config = OVHCloudResponsesAPIConfig()
    headers = {}

    with (
        patch.object(litellm, "api_key", None),
        patch.object(litellm, "ovhcloud_key", None),
        patch("litellm.llms.ovhcloud.responses.transformation.get_secret_str", return_value=None),
    ):
        with pytest.raises(ValueError, match="OVHcloud AI Endpoints API key is required"):
            config.validate_environment(
                headers=headers,
                model="ovhcloud/gpt-oss-120b",
                litellm_params=None,
            )
```
Merged 3c95588 into BerriAI:litellm_oss_staging_03_06_2026
Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR
- Add tests in the `tests/test_litellm/` directory (adding at least 1 test is a hard requirement - see details)
- Pass `make test-unit`
- Ask `@greptileai` and receive a Confidence Score of at least 4/5 before requesting a maintainer review

CI (LiteLLM team)
Branch creation CI run
Link:
CI run for the last commit
Link:
Merge / cherry-pick CI run
Links:
Type
🆕 New Feature
Changes
Add support for `/v1/responses` for OVHcloud AI Endpoints
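For context, a minimal proxy config sketch for routing to the new endpoint, assuming litellm's standard `model_list` format and its `os.environ/` secret-reference convention; the model name here is the one used in the tests above, and actual availability depends on the OVHcloud deployment:

```yaml
model_list:
  - model_name: gpt-oss-120b
    litellm_params:
      model: ovhcloud/gpt-oss-120b
      api_key: os.environ/OVHCLOUD_API_KEY
```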