litellm_fix(hosted_vllm): use custom client when provided#20158

Closed
shin-bot-litellm wants to merge 1 commit into main from fix-hosted-vllm-custom-client
Conversation

@shin-bot-litellm
Contributor

Summary

Fixes failing CI tests for the hosted_vllm provider:

  • test_openai_compatible_custom_api_video[hosted_vllm]
  • test_openai_compatible_custom_api_base[hosted_vllm]

Root Cause

PR #19893 added a dedicated hosted_vllm branch in main.py that routes all requests through base_llm_http_handler.completion() to support ssl_verify. However, this broke behavior when a custom OpenAI client is passed: the client was ignored, because base_llm_http_handler uses its own httpx client internally.

Fix

When a custom client parameter is provided:

  • Use openai_chat_completions.completion() which properly utilizes the passed client
  • This ensures the client is used correctly (e.g., for mocking in tests)

When no custom client is provided:

  • Continue routing through base_llm_http_handler.completion(), preserving the ssl_verify support added in #19893

Testing

Both failing tests now pass:

tests/local_testing/test_completion.py::test_openai_compatible_custom_api_base[hosted_vllm] PASSED
tests/local_testing/test_completion.py::test_openai_compatible_custom_api_video[hosted_vllm] PASSED
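For context, these tests verify that a caller-supplied client is actually exercised. A minimal sketch of that pattern, assuming nothing about the real test bodies (the helper name, model string, and message are all hypothetical):

```python
from unittest.mock import MagicMock

def call_with_custom_client(completion_fn):
    """Invoke a completion function with a mock client and return the mock
    so the caller can assert it was actually used (hypothetical helper)."""
    client = MagicMock()
    completion_fn(
        model="hosted_vllm/my-model",  # hypothetical model name
        messages=[{"role": "user", "content": "hi"}],
        client=client,  # the custom client the fix must honor
    )
    return client
```

With the fix in place, assertions like `client.chat.completions.create.called` hold, because requests go through the handler that uses the passed client.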

The ssl_verify tests from #19893 still pass:

tests/test_litellm/llms/hosted_vllm/chat/test_hosted_vllm_ssl_verify.py::TestHostedVLLMSSLVerify::test_hosted_vllm_ssl_verify_false_async PASSED
tests/test_litellm/llms/hosted_vllm/chat/test_hosted_vllm_ssl_verify.py::TestHostedVLLMSSLVerify::test_hosted_vllm_ssl_verify_false_sync PASSED

CircleCI Link

https://app.circleci.com/pipelines/github/BerriAI/litellm/56610/workflows/ddae19f6-881e-49e1-922e-005927a11564/jobs/1135770/tests

When a custom OpenAI client is passed to hosted_vllm completion, use
openai_chat_completions.completion() to ensure the client is properly
used. This fixes test failures in test_openai_compatible_custom_api_base
and test_openai_compatible_custom_api_video for the hosted_vllm provider.

The base_llm_http_handler is still used when no custom client is passed,
preserving the ssl_verify functionality added in #19893.

Fixes: test_openai_compatible_custom_api_video[hosted_vllm]
Fixes: test_openai_compatible_custom_api_base[hosted_vllm]
@vercel

vercel bot commented Jan 31, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project: litellm — Deployment: Error — Updated (UTC): Jan 31, 2026 6:01pm


@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.
