
Pydantic validation error on langchain_community.chat_models.ChatLiteLLMRouter #27455

Open

pppazos opened this issue Oct 18, 2024 · 5 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

pppazos commented Oct 18, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

from langchain_core.prompts import ChatPromptTemplate
from litellm.router import Router
from langchain_community.chat_models import ChatLiteLLMRouter

_prompt = ChatPromptTemplate.from_messages(
    [
        ("human", "You are an asistant. {input}"),
    ]
)


model_list = [
    {
        "model_name": "claude_3_haiku",
        "litellm_params": {
            "model": "bedrock/anthropic.claude-3-haiku-20240307-v1:0",
        },
    },
    {
        "model_name": "claude_3_sonnet",
        "litellm_params": {
            "model": "bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
        },
    },
]

litellm_router = Router(model_list=model_list)
_modellm = ChatLiteLLMRouter(router=litellm_router)

Error Message and Stack Trace (if applicable)

pydantic_core._pydantic_core.ValidationError: 1 validation error for ChatLiteLLMRouter
router
Field required [type=missing, input_value={'name': None, 'cache': N...ogether_ai_api_key': ''}, input_type=dict]
For further information visit https://errors.pydantic.dev/2.9/v/missing

Description

I'm trying to create an instance of ChatLiteLLMRouter, passing the required router parameter:

_modellm = ChatLiteLLMRouter(router=litellm_router)

Expected: the ChatLiteLLMRouter object is created.
Actual: an exception is raised because the router parameter is not detected as having been passed:

pydantic_core._pydantic_core.ValidationError: 1 validation error for ChatLiteLLMRouter
router
Field required [type=missing, input_value={'name': None, 'cache': N...ogether_ai_api_key': ''}, input_type=dict]
For further information visit https://errors.pydantic.dev/2.9/v/missing
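
For reference, type=missing simply means pydantic's validator never received a router key in the input dict. A minimal pydantic v2 sketch (Demo is a hypothetical stand-in, not LangChain code) reproduces the same shape of error:

from typing import Any

from pydantic import BaseModel


class Demo(BaseModel):
    # Under pydantic v2, an Any-annotated field with no default is required.
    router: Any


# Raises: pydantic_core._pydantic_core.ValidationError:
#   1 validation error for Demo
#   router
#     Field required [type=missing, ...]
Demo()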

System Info

System Information

OS: Linux
OS Version: #1 SMP Wed Mar 2 00:30:59 UTC 2022
Python Version: 3.11.10 (main, Sep 7 2024, 18:35:41) [GCC 11.4.0]

Package Information

langchain_core: 0.3.12
langchain: 0.3.3
langchain_community: 0.3.2
langsmith: 0.1.136
langchain_aws: 0.2.2
langchain_cli: 0.0.31
langchain_text_splitters: 0.3.0
langserve: 0.3.0

Optional packages not installed

langgraph

Other Dependencies

aiohttp: 3.10.10
async-timeout: Installed. No version info available.
boto3: 1.35.43
dataclasses-json: 0.6.7
fastapi: 0.115.2
gitpython: 3.1.43
gritql: 0.1.5
httpx: 0.27.2
jsonpatch: 1.33
langserve[all]: Installed. No version info available.
numpy: 1.26.4
orjson: 3.10.7
packaging: 24.1
pydantic: 2.9.2
pydantic-settings: 2.6.0
PyYAML: 6.0.2
requests: 2.32.3
requests-toolbelt: 1.0.0
SQLAlchemy: 2.0.36
sse-starlette: 1.8.2
tenacity: 8.5.0
tomlkit: 0.12.5
typer[all]: Installed. No version info available.
typing-extensions: 4.12.2
uvicorn: 0.23.2

@dosubot dosubot bot added the 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature label Oct 18, 2024

pppazos commented Oct 18, 2024

It works as expected for me when I remove the constructor from the class:

class ChatLiteLLMRouter(ChatLiteLLM):
    """LiteLLM Router as LangChain Model."""

    router: Any

    # def __init__(self, *, router: Any, **kwargs: Any) -> None:
    #     """Construct Chat LiteLLM Router."""
    #     super().__init__(**kwargs)
    #     self.router = router
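
That commented-out constructor explains the error: router is keyword-only, so the argument is captured by the signature and never forwarded to super().__init__(**kwargs), and pydantic v2 validates the fields inside that call, raising before self.router = router can run. One possible fix, sketched below under the assumption that forwarding the argument is all that's needed (the eventual PR may do more), is:

class ChatLiteLLMRouter(ChatLiteLLM):
    """LiteLLM Router as LangChain Model."""

    router: Any

    def __init__(self, *, router: Any, **kwargs: Any) -> None:
        """Construct Chat LiteLLM Router."""
        # Forward router so pydantic's validator sees it; assigning
        # self.router after super().__init__() would be too late, since
        # pydantic v2 raises for the missing field inside that call.
        super().__init__(router=router, **kwargs)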

@Aarya2004

Hello @pppazos! We're a group of students from the University of Toronto, and we're interested in investigating this issue further and making a PR to fix it. Do you have any additional information regarding this issue?

While we were able to replicate it, we didn't get the exact error message, but it seems close to the error you got. I have attached an image of the error we got below and would appreciate it if you could review it.
[Screenshot: error output, Oct 21, 2024, 8:51 PM]

If you have no extra information, we can begin working on this as soon as you confirm and will have a PR ready by mid-November.


pppazos commented Oct 22, 2024

> If you have no extra information, we can begin working on this as soon as you confirm and will have a PR ready by mid-November.

Thanks @Aarya2004 , No additional info on my side.

duodecanol added a commit to duodecanol/langchain that referenced this issue Oct 22, 2024

kjoth commented Nov 13, 2024

Hi @pppazos @Aarya2004 @duodecanol @bburgin @baskaryan @mackong

The routing for a specific model is also not functioning.

We have configured three models here: small, medium, and large.

model_list = [
    {
        "model_name": "small",
        "litellm_params": {
            "model": "bedrock/mistral.mistral-small-2402-v1:0",
        },
        "metadata": {
            "description": "Meta's LLaMA 2 model for multi-turn conversations",
        },
    },
    {
        "model_name": "medium",
        "litellm_params": {
            "model": "bedrock/mistral.mistral-7b-instruct-v0:2",
        },
        "metadata": {
            "description": "Meta's LLaMA 2 model for multi-turn conversations",
        },
    },
    {
        "model_name": "large",
        "litellm_params": {
            "model": "bedrock/mistral.mixtral-8x7b-instruct-v0:1",
        },
        "metadata": {
            "description": "Mistral for SQL generation and complex tasks",
        },
    },
]

router = Router(model_list=model_list)
chat = ChatLiteLLMRouter(model_name="large", router=router)

chat.invoke("Capital of india")

Instead of using the large model mistral.mixtral-8x7b-instruct-v0:1, it's using the small model mistral.mistral-small-2402-v1:0.
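
As a sanity check (a hedged sketch, reusing the router configured above), calling the LiteLLM Router directly shows whether routing by model name works at the litellm level; if it does, the bug is in the ChatLiteLLMRouter wrapper rather than in litellm:

# Bypass the LangChain wrapper and call the LiteLLM Router directly.
response = router.completion(
    model="large",
    messages=[{"role": "user", "content": "Capital of India"}],
)
# If routing works at this level, this should report the mixtral deployment.
print(response.model)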

Even in the source code, it's selecting the first model from the list instead of the user-passed model:

[Screenshot: ChatLiteLLMRouter source selecting the first model in the list]


kjoth commented Nov 13, 2024

This issue was reported earlier, but it was closed by the GitHub bot:

#19356

ccurme pushed a commit that referenced this issue Dec 16, 2024
**Description:** Fix ChatLiteLLMRouter ctor validation and model_name
parameter
**Issue:** #19356, #27455, #28077
**Twitter handle:** @bburgin_0
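
For illustration, a hedged sketch of what honoring the model_name parameter means (the resolve_model_name helper below is hypothetical, not the code from the actual PR): prefer the user-supplied name and fall back to the router's first deployment only when none is given.

from typing import Optional


def resolve_model_name(model_name: Optional[str], model_list: list) -> str:
    """Hypothetical helper: prefer the user-supplied model_name,
    falling back to the first configured deployment."""
    if model_name:
        return model_name
    return model_list[0]["model_name"]


deployments = [{"model_name": "small"}, {"model_name": "large"}]
assert resolve_model_name("large", deployments) == "large"  # user choice wins
assert resolve_model_name(None, deployments) == "small"     # fallback only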