
[ChatLiteLLM] litellm.UnsupportedParamsError: VertexAI doesn't support tool_choice=any. Supported tool_choice values=['auto', 'required', json object] #29308

Open
j0yk1ll opened this issue Jan 20, 2025 · 0 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments


j0yk1ll commented Jan 20, 2025

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import os
from langchain_community.chat_models import ChatLiteLLM
from langchain_google_genai import ChatGoogleGenerativeAI

from typing import Optional

from pydantic import BaseModel, Field


class Joke(BaseModel):
    """Joke to tell user."""

    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline to the joke")
    rating: Optional[int] = Field(
        default=None, description="How funny the joke is, from 1 to 10"
    )


# Gemini via LiteLLM (raises UnsupportedParamsError, see below)
llm_litellm = ChatLiteLLM(
    model="gemini/gemini-2.0-flash-exp",
    api_key=os.getenv("API_KEY"),
)

structured_litellm = llm_litellm.with_structured_output(Joke, include_raw=True)

# The same model via the native Google GenAI integration (works as expected)
llm_gemini = ChatGoogleGenerativeAI(
    model="gemini-2.0-flash-exp",
    api_key=os.getenv("API_KEY"),
)

structured_gemini = llm_gemini.with_structured_output(Joke, include_raw=True)

try:
    response_litellm = structured_litellm.invoke("Tell me a joke about cats")
    print(response_litellm)
except Exception as e:
    print(f"An error occured for litellm: {str(e)}")
    print()

try:
    response_gemini = structured_gemini.invoke("Tell me a joke about cats")
    print(response_gemini)
except Exception as e:
    print(f"An error occured for gemini: {str(e)}")
    print()

Error Message and Stack Trace (if applicable)

An error occurred for litellm: litellm.UnsupportedParamsError: VertexAI doesn't support tool_choice=any. Supported tool_choice values=['auto', 'required', json object]. To drop it from the call, set `litellm.drop_params = True`.

{'raw': AIMessage(content='', additional_kwargs={'function_call': {'name': 'Joke', 'arguments': '{"punchline": "They're always feline good!", "setup": "Why don't cats play poker in the wild?"}'}}, response_metadata={'prompt_feedback': {'block_reason': 0, 'safety_ratings': []}, 'finish_reason': 'STOP', 'safety_ratings': [{'category': 'HARM_CATEGORY_HATE_SPEECH', 'probability': 'NEGLIGIBLE', 'blocked': False}, {'category': 'HARM_CATEGORY_DANGEROUS_CONTENT', 'probability': 'NEGLIGIBLE', 'blocked': False}, {'category': 'HARM_CATEGORY_HARASSMENT', 'probability': 'NEGLIGIBLE', 'blocked': False}, {'category': 'HARM_CATEGORY_SEXUALLY_EXPLICIT', 'probability': 'NEGLIGIBLE', 'blocked': False}]}, id='run-1adbaf9b-ea9f-47f4-bbce-2fd6e93b9a65-0', tool_calls=[{'name': 'Joke', 'args': {'punchline': "They're always feline good!", 'setup': "Why don't cats play poker in the wild?"}, 'id': 'ad13c4ea-ec15-45e6-9c87-91487ed657c5', 'type': 'tool_call'}], usage_metadata={'input_tokens': 91, 'output_tokens': 22, 'total_tokens': 113, 'input_token_details': {'cache_read': 0}}), 'parsed': Joke(setup="Why don't cats play poker in the wild?", punchline="They're always feline good!", rating=None), 'parsing_error': None}

Description

As the output above shows, the same structured output call succeeds with ChatGoogleGenerativeAI but fails with ChatLiteLLM: the LiteLLM path apparently sends tool_choice=any, which this provider rejects in favour of 'auto', 'required', or a JSON object.
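
For anyone hitting the same error, the message itself suggests a possible workaround: set litellm.drop_params = True so LiteLLM silently drops parameters the provider does not support. The snippet below is only a sketch of that suggestion (untested here), reusing the Joke model from the example above:

import os

import litellm
from langchain_community.chat_models import ChatLiteLLM

# Taken from the UnsupportedParamsError hint: drop any request parameters
# the Gemini/VertexAI provider does not support, such as tool_choice="any".
litellm.drop_params = True

llm_litellm = ChatLiteLLM(
    model="gemini/gemini-2.0-flash-exp",
    api_key=os.getenv("API_KEY"),
)
structured_litellm = llm_litellm.with_structured_output(Joke, include_raw=True)
print(structured_litellm.invoke("Tell me a joke about cats"))

Note that dropping tool_choice only silences the error; it does not guarantee the model is forced to call the Joke tool. A proper fix would presumably map "any" to one of the values the provider does accept (for example "required") inside ChatLiteLLM's tool binding.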

System Info

System Information

OS: Linux
OS Version: #202405300957173214176822.04~f2697e1 SMP PREEMPT_DYNAMIC Wed N
Python Version: 3.12.7 (main, Oct 16 2024, 04:37:19) [Clang 18.1.8 ]

Package Information

langchain_core: 0.3.29
langchain: 0.3.14
langchain_community: 0.3.14
langsmith: 0.2.10
langchain_anthropic: 0.3.1
langchain_aws: 0.2.10
langchain_fireworks: 0.2.6
langchain_google_genai: 2.0.8
langchain_openai: 0.3.0
langchain_text_splitters: 0.3.5

Optional packages not installed

langserve

Other Dependencies

aiohttp: 3.11.11
anthropic: 0.42.0
async-timeout: Installed. No version info available.
boto3: 1.35.97
dataclasses-json: 0.6.7
defusedxml: 0.7.1
filetype: 1.2.0
fireworks-ai: 0.15.11
google-generativeai: 0.8.3
httpx: 0.27.2
httpx-sse: 0.4.0
jsonpatch: 1.33
langsmith-pyo3: Installed. No version info available.
numpy: 2.2.1
openai: 1.59.6
orjson: 3.10.14
packaging: 24.2
pydantic: 2.10.5
pydantic-settings: 2.7.1
PyYAML: 6.0.2
requests: 2.32.3
requests-toolbelt: 1.0.0
SQLAlchemy: 2.0.37
tenacity: 9.0.0
tiktoken: 0.8.0
typing-extensions: 4.12.2
zstandard: Installed. No version info available.

@j0yk1ll j0yk1ll changed the title [] litellm.UnsupportedParamsError: VertexAI doesn't support tool_choice=any. Supported tool_choice values=['auto', 'required', json object] [ChatLiteLLM] litellm.UnsupportedParamsError: VertexAI doesn't support tool_choice=any. Supported tool_choice values=['auto', 'required', json object] Jan 20, 2025
@dosubot dosubot bot added the 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature label Jan 20, 2025