feat: add LangChain Hub Component #2990
Conversation
Hey @erichare I think the LangChain Hub integration should be a Prompt component that deals with as many messages as the
Ahhhh, you know, that was conceptually different from what I was thinking, but it makes perfect sense. I'll take a stab at updating it!
@ogabrielluiz @zee229 does something like this make sense?

```python
# Note: the import paths below are assumed and may need adjusting to the local Langflow layout.
from langchain import hub

from langflow.components.prompts import PromptComponent
from langflow.io import Output, SecretStrInput, StrInput
from langflow.schema.message import Message


class LangSmithPromptComponent(PromptComponent):
    display_name: str = "LangSmith Prompt Component"
    description: str = "Prompt Component that uses LangSmith prompts"
    beta = True
    icon = "prompts"
    trace_type = "prompt"
    name = "LangSmith Prompt"

    inputs = PromptComponent._base_inputs + [
        SecretStrInput(
            name="langchain_api_key",
            display_name="Your LangChain API Key",
            info="The LangChain API Key to use.",
        ),
        StrInput(
            name="langsmith_prompt",
            display_name="LangSmith Prompt",
            info="The LangSmith prompt to use.",
            value="efriis/my-first-prompt",
        ),
    ]

    outputs = [
        Output(display_name="Prompt Message", name="prompt", method="build_prompt"),
    ]

    def build_prompt(self) -> Message:
        # Pull the prompt from LangChain Hub
        prompt_data = hub.pull(self.langsmith_prompt)

        # Extract the underlying prompt from each message template
        message_list = [message_data.prompt for message_data in prompt_data.messages]

        # Wrap the messages in a Message object and record it as the component status
        messages = Message(messages=message_list)
        self.status = str(messages)
        return messages
```

I wasn't sure exactly how we would dynamically create inputs from the prompt template. Here I pull the prompt template and then build the prompt from all of its messages. But given the comment about exposing them all as inputs, I wasn't sure of the best approach: if the user has to specify a prompt template from LangChain Hub, is there a way to then create new inputs dynamically based on its variables?
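One possible direction for the dynamic-inputs question, as a rough sketch rather than a settled design: pull the template first, then derive one input per variable from its `input_variables` attribute. The `inputs_from_hub_prompt` helper below is hypothetical, not an existing Langflow API for dynamic fields.

```python
# Rough sketch only: derive Langflow inputs from a pulled hub prompt's variables.
# The helper name, and the idea of feeding its result back into the component's
# inputs, are assumptions rather than an existing Langflow mechanism.
from langchain import hub
from langflow.io import StrInput


def inputs_from_hub_prompt(prompt_ref: str) -> list[StrInput]:
    """Pull a prompt from LangChain Hub and build one StrInput per template variable."""
    prompt_data = hub.pull(prompt_ref)
    # ChatPromptTemplate exposes the names of its template variables here.
    return [
        StrInput(
            name=var,
            display_name=var,
            info=f"Value for template variable '{var}'.",
        )
        for var in prompt_data.input_variables
    ]
```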
We're currently focusing on this part. We discovered that, for the standard prompt component, the frontend refreshes when the user clicks the "Check & Save" button in the edit template modal. Before we start developing our own methods to refresh this frontend component, hitting this button uses the
Perfect. Keep me posted on all that @pward17, and thanks for the feedback, both @zee229 and @ogabrielluiz. I'm still trying to get my feet wet in how this all operates, so this is super valuable to me.
Maybe we should all get on Discord sometime and talk through these details to make it easier.
That would be fantastic!
One important point is that this component is meant to load prompts, not to edit them. Creating a Component that has more than one PromptInput is a bit of a pain.
@ogabrielluiz Why do we need PromptInput here at all? All messages are stored in Prompt Hub, and we just need to fetch them and add them as input fields.
@erichare @ogabrielluiz Hey guys, any progress on it?
@erichare @ogabrielluiz Hello, how's it going? Do you have any progress on this feature?
Hey @erichare Have you checked
@ogabrielluiz @erichare I think it is not necessary, because after pulling from Prompt Hub you get a ChatPromptTemplate object, and it can be used to create an agent (it has all the necessary messages and so on). The main difficulty is updating the frontend node when the prompt is pulled (to display its input variables).
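A minimal sketch of that point, using the public hub prompt `hwchase17/react` as an example: the pulled object already carries its messages and input variables, so it can be handed straight to an agent constructor without a PromptInput.

```python
# Minimal sketch: hub.pull returns a ready-to-use prompt template object.
from langchain import hub

prompt = hub.pull("hwchase17/react")  # a public hub prompt, used here only as an example
print(prompt.input_variables)  # e.g. ['agent_scratchpad', 'input', 'tool_names', 'tools']
# The pulled prompt can then be passed directly to an agent constructor,
# e.g. langchain.agents.create_react_agent(llm, tools, prompt).
```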
Sorry for the delay on my end. I'm actually out this week, but I will pick it back up on Monday and update ASAP!
@ogabrielluiz @erichare @pward17 Hey, any chance you guys can schedule a meeting soon?
Yes! I'm back from time off. Let me know what day works best for you all; I'm very flexible this week.
Hey, what about Thursday morning?
Perfect for me. I would be available anytime between 7am and 9am PDT, then again starting at 10am PDT.
Okay, let's meet at 10am PDT. Our emails are [email protected], [email protected]
LGTM!
Force-pushed from fdf951f to 15b47ad (Compare)
This PR adds support for a component that looks up prompt templates in the LangSmith Hub, allowing the user to supply an API key and the name of a particular prompt template to use.
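For context, a hedged example of the underlying call the component wraps; the `api_key` keyword is an assumption here, and setting the `LANGCHAIN_API_KEY` environment variable is the more common route.

```python
from langchain import hub

# Pull a named prompt template from LangSmith Hub; the key shown is a placeholder.
prompt_data = hub.pull("efriis/my-first-prompt", api_key="lsv2_...")
print(prompt_data.messages)  # the message templates stored in the hub prompt
```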