
PipelinePromptTemplate doesn't support partial #27868

Closed
5 tasks done
dmenini opened this issue Nov 3, 2024 · 3 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

@dmenini
Contributor

dmenini commented Nov 3, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

# Example from docs: https://python.langchain.com/v0.1/docs/modules/model_io/prompts/composition/#using-pipelineprompt
from langchain_core.prompts import PipelinePromptTemplate, PromptTemplate

full_template = """{introduction}
{example}
{start}"""
full_prompt = PromptTemplate.from_template(full_template)

introduction_template = """You are impersonating {person}."""
introduction_prompt = PromptTemplate.from_template(introduction_template)

example_template = """Here's an example of an interaction:
Q: {example_q}
A: {example_a}"""
example_prompt = PromptTemplate.from_template(example_template)

start_template = """Now, do this for real!
Q: {input}
A:"""
start_prompt = PromptTemplate.from_template(start_template)

input_prompts = [
    ("introduction", introduction_prompt),
    ("example", example_prompt),
    ("start", start_prompt),
]
pipeline_prompt = PipelinePromptTemplate(
    final_prompt=full_prompt, pipeline_prompts=input_prompts
)

pipeline_prompt.partial(person="Elon Musk")  # raises ValidationError

Error Message and Stack Trace (if applicable)

_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
../../.venv/lib/python3.12/site-packages/langchain_core/prompts/base.py:277: in partial
    return type(self)(**prompt_dict)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = PipelinePromptTemplate(input_variables=['person', 'input', 'example_q', 'example_a'], input_types={}, partial_variable...ariables=['input'], input_types={}, partial_variables={}, template='Now, do this for real!\n    Q: {input}\n    A:'))])
args = ()
kwargs = {'final_prompt': PromptTemplate(input_variables=['example', 'introduction', 'start'], input_types={}, partial_variable...mple}\n    {start}'), 'input_types': {}, 'input_variables': ['input', 'example_a', 'example_q'], 'metadata': None, ...}

    def __init__(self, *args: Any, **kwargs: Any) -> None:
        """"""
>       super().__init__(*args, **kwargs)
E       pydantic_core._pydantic_core.ValidationError: 1 validation error for PipelinePromptTemplate
E         Value error, Found overlapping input and partial variables: {'person'}
E       For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/errors/INVALID_PROMPT_INPUT [type=value_error, input_value={'name': None, 'input_var... Q: {input}\n    A:'))]}, input_type=dict]
E           For further information visit https://errors.pydantic.dev/2.9/v/value_error

../../.venv/lib/python3.12/site-packages/langchain_core/load/serializable.py:125: ValidationError

Description

In the partial method, the PipelinePromptTemplate is re-instantiated with a new set of input_variables and partial_variables. However, when the new instance is created, the get_input_variables model validator runs again and recomputes the original input variables from the pipeline_prompts, overwriting the new input_variables. Hence the validation error about overlapping input_variables and partial_variables.

System Info

❯ python -m langchain_core.sys_info

System Information
------------------
> OS:  Darwin
> OS Version:  Darwin Kernel Version 23.5.0: Wed May  1 20:12:58 PDT 2024; root:xnu-10063.121.3~5/RELEASE_ARM64_T6000
> Python Version:  3.12.7 (main, Oct  1 2024, 02:05:46) [Clang 15.0.0 (clang-1500.3.9.4)]

Package Information
-------------------
> langchain_core: 0.3.15
> langchain: 0.3.7
> langsmith: 0.1.139
> langchain_aws: 0.2.6
> langchain_text_splitters: 0.3.2

Optional packages not installed
-------------------------------
> langgraph
> langserve
@dosubot dosubot bot added 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature labels Nov 3, 2024
@Aarya2004

Hello @dmenini! My group and I investigated this error and found that its cause is the partial method not being overridden in the PipelinePromptTemplate class. We've come up with a potential fix; if this issue can be assigned to us, we can open a PR ASAP!

@efriis
Member

efriis commented Dec 10, 2024

Hey there! Agreed this is not handled well by the PipelinePromptTemplate, and it's largely because PipelinePromptTemplate doesn't adhere to the PromptTemplate interface in many ways.

I would recommend doing something like the following instead (the `...`, `x`, and `y` are placeholders for your own prompts and partial values):

keys_and_prompts: list[tuple[str, PromptTemplate]] = ...

# seed the "partial" variables up front
chain = RunnablePassthrough.assign(partial_var_1=lambda _: x, partial_var_2=lambda _: y)
for key, prompt in keys_and_prompts:
    # each prompt's formatted string becomes an input for the later prompts
    chain = chain | RunnablePassthrough.assign(**{key: prompt | (lambda v: v.to_string())})

chain = chain | (lambda out: out[keys_and_prompts[-1][0]])  # get the output of the final prompt

chain.invoke({"a": "b"})  # all your prompt variables

@efriis efriis closed this as completed Dec 10, 2024
@efriis
Member

efriis commented Dec 10, 2024

A better one, not using LCEL:

my_input = {"key": "value"}
for name, prompt in pipeline_prompts:
    my_input[name] = prompt.invoke(my_input).to_string()
my_output = final_prompt.invoke(my_input)
