
Vertex AI Model gives error on system message input and connected text input #2976

Closed
severfire opened this issue Jul 26, 2024 · 1 comment · Fixed by #2984
Labels
bug Something isn't working

Comments

@severfire

Bug Description

When adding a system message and an input via, for example, the prompt builder, I receive an error.

Error Message:
Error Building Component:
Error building Component Vertex AI: Expected mapping type as input to ChatPromptTemplate. Received <class 'list'>.

Reproduction

Connected Input, no system message (screenshot)

No connected Input + system message (screenshot)

Connected Input + system message (screenshot)

Expected behavior

It should not raise an error.

Who can help?

No response

Operating System

Windows 11

Langflow Version

1.0.12

Python Version

None

Screenshot

No response

Flow File

No response

@severfire severfire added the bug Something isn't working label Jul 26, 2024

dosubot bot commented Jul 26, 2024

To resolve the error "Expected mapping type as input to ChatPromptTemplate. Received <class 'list'>" when adding a system message and input via the prompt builder in Vertex AI, you need to ensure that the input to ChatPromptTemplate is a dictionary (mapping type) rather than a list.

Here's a snippet of the build method for reference:

def build(
    self,
    input_value: Text,
    credentials: Optional[str],
    project: str,
    examples: Optional[List[BaseMessage]] = None,  # avoid a mutable default argument
    location: str = "us-central1",
    max_output_tokens: int = 128,
    model_name: str = "chat-bison",
    temperature: float = 0.0,
    top_k: int = 40,
    top_p: float = 0.95,
    verbose: bool = False,
    stream: bool = False,
    system_message: Optional[str] = None,
) -> Text:
    try:
        from langchain_google_vertexai import ChatVertexAI  # type: ignore
    except ImportError:
        raise ImportError(
            "To use the ChatVertexAI model, you need to install the langchain-google-vertexai package."
        )
    output = ChatVertexAI(
        credentials=credentials,
        examples=examples or [],
        location=location,
        max_output_tokens=max_output_tokens,
        model_name=model_name,
        project=project,
        temperature=temperature,
        top_k=top_k,
        top_p=top_p,
        verbose=verbose,
    )

    return self.get_chat_result(output, stream, input_value, system_message)

To fix the error, ensure that the input_value and system_message are passed as part of a dictionary to the ChatPromptTemplate. Here is an example of how you might modify the build method to ensure the correct types are passed:

def build(
    self,
    input_value: Text,
    credentials: Optional[str],
    project: str,
    examples: Optional[List[BaseMessage]] = None,  # avoid a mutable default argument
    location: str = "us-central1",
    max_output_tokens: int = 128,
    model_name: str = "chat-bison",
    temperature: float = 0.0,
    top_k: int = 40,
    top_p: float = 0.95,
    verbose: bool = False,
    stream: bool = False,
    system_message: Optional[str] = None,
) -> Text:
    try:
        from langchain_google_vertexai import ChatVertexAI  # type: ignore
    except ImportError:
        raise ImportError(
            "To use the ChatVertexAI model, you need to install the langchain-google-vertexai package."
        )
    output = ChatVertexAI(
        credentials=credentials,
        examples=examples or [],
        location=location,
        max_output_tokens=max_output_tokens,
        model_name=model_name,
        project=project,
        temperature=temperature,
        top_k=top_k,
        top_p=top_p,
        verbose=verbose,
    )

    # Ensure input_value and system_message are passed as a dictionary
    prompt_data = {
        "input_value": input_value,
        "system_message": system_message
    }

    return self.get_chat_result(output, stream, prompt_data)

In this modification, prompt_data is a dictionary containing input_value and system_message, which should resolve the type error you are encountering [1].
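The suggestion above boils down to passing the prompt variables as a mapping rather than a list. Here is a minimal, standalone sketch of the failing and working input shapes; `format_prompt` and its validation are hypothetical stand-ins for the mapping check that `ChatPromptTemplate` performs internally (the real implementation lives in langchain-core), not Langflow code:

```python
# Hypothetical stand-in for the mapping check ChatPromptTemplate applies
# to its input variables.
def format_prompt(prompt_input):
    if not isinstance(prompt_input, dict):
        raise TypeError(
            "Expected mapping type as input to ChatPromptTemplate. "
            f"Received {type(prompt_input)}."
        )
    # Return the variables the template would interpolate.
    return {
        "input_value": prompt_input.get("input_value", ""),
        "system_message": prompt_input.get("system_message", ""),
    }

# Failing shape: a list, as produced when the system message and the
# connected text input are passed together as separate items.
try:
    format_prompt(["hello", "you are a helpful assistant"])
except TypeError as err:
    print(err)  # mirrors the error in the bug report

# Working shape: a dict keyed by the prompt variable names.
result = format_prompt({
    "input_value": "hello",
    "system_message": "you are a helpful assistant",
})
print(result["system_message"])
```

The key point is that the template resolves named variables, so the caller must supply a name-to-value mapping; a bare list gives it no way to tell which item is the system message and which is the user input.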

To continue talking to Dosu, mention @dosu.
