feat: Add conversation history selection options to ConditionAgent node #3719

Conversation

jeanibarz (Contributor)

Enhance the ConditionAgent node by adding conversation history selection options, aligning it with the existing Agent and LLMNode nodes. This feature allows users to filter which conversation history messages are used in the ConditionAgent, providing greater control over the context included in prompts.

Changes

  • Added Parameter:

    • conversationHistorySelection: Allows users to choose which messages from the conversation history to include in prompts.
  • Selection Options:

    • User Question: Use the user question from the historical conversation messages as input.
    • Last Conversation Message: Use the last conversation message from the historical conversation messages as input.
    • All Conversation Messages: Use all conversation messages from the historical conversation messages as input.
    • Empty: Do not use any messages from the conversation history. Be sure to use the System Prompt, Human Prompt, or Messages History instead.
  • Default Selection:

    • Set to 'All Conversation Messages' to maintain comprehensive context by default (see the parameter sketch below).
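
For reference, a sketch of the input entry this adds to the node's inputs array, following the option shape used by the existing Agent/LLMNode nodes; the internal option values (user_question, last_message, all_messages, empty) are assumptions for illustration and may differ from the merged code:

{
    label: 'Conversation History',
    name: 'conversationHistorySelection',
    type: 'options',
    options: [
        { label: 'User Question', name: 'user_question' },
        { label: 'Last Conversation Message', name: 'last_message' },
        { label: 'All Conversation Messages', name: 'all_messages' },
        { label: 'Empty', name: 'empty' }
    ],
    default: 'all_messages',
    optional: true,
    additionalParams: true
}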

Related Issues

- Added a new parameter `conversationHistorySelection` to allow users to choose which messages from the conversation history to include in prompts.
- Options include: User Question, Last Conversation Message, All Conversation Messages, and Empty.
- Default selection is set to 'All Conversation Messages' for improved context management in sequential LLM and Agent nodes.
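
As an illustration of the runtime behaviour these options describe, a minimal sketch of how the selection could be applied to the stored chat history; the helper name, signature, and use of LangChain message types are assumptions, not necessarily the merged Flowise implementation:

import { BaseMessage, HumanMessage } from '@langchain/core/messages'

// Hypothetical helper: returns the subset of the chat history implied by the selection.
function filterConversationHistory(selection: string, userQuestion: string, history: BaseMessage[]): BaseMessage[] {
    switch (selection) {
        case 'user_question':
            // Only the current user question, wrapped as a human message
            return [new HumanMessage(userQuestion)]
        case 'last_message':
            // Only the most recent message from the stored history
            return history.length ? [history[history.length - 1]] : []
        case 'empty':
            // No history at all; rely on the System Prompt / Human Prompt instead
            return []
        case 'all_messages':
        default:
            // Default: keep the full conversation history
            return history
    }
}

Keeping 'All Conversation Messages' as the default preserves the comprehensive context described above for existing flows.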
@@ -185,6 +187,42 @@ class ConditionAgent_SeqAgents implements INode {
additionalParams: true,
optional: true
},
{
HenryHengZJ (Contributor)

thanks @jeanibarz! Can we increment this.version from 2.0 to 3.0, and have the parameter order match the one in Agent/LLMNode (sketched after the list below):

  1. System Prompt
  2. Prepend Messages History
  3. Conversation History
  4. Human Prompt
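
A standalone sketch of the requested change (version bump plus input ordering); apart from conversationHistorySelection, the field names below are assumptions for illustration rather than the merged diff:

// Sketch only: shows the requested version bump and input ordering.
interface InputParam {
    label: string
    name: string
    type: string
    optional?: boolean
    additionalParams?: boolean
}

const version = 3.0 // previously 2.0
const inputs: InputParam[] = [
    /* ...other inputs... */
    { label: 'System Prompt', name: 'systemMessagePrompt', type: 'string', additionalParams: true },
    { label: 'Prepend Messages History', name: 'messageHistory', type: 'code', optional: true, additionalParams: true },
    { label: 'Conversation History', name: 'conversationHistorySelection', type: 'options', optional: true, additionalParams: true },
    { label: 'Human Prompt', name: 'humanMessagePrompt', type: 'string', optional: true, additionalParams: true }
]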

@HenryHengZJ (Contributor)

there was a linting issue on the main branch, can you pull the latest changes?

@jeanibarz (Contributor, Author) commented Dec 18, 2024

Yes, I noticed that yesterday: done 👍

@HenryHengZJ (Contributor) left a comment

thanks @jeanibarz !

@HenryHengZJ HenryHengZJ merged commit c809f41 into FlowiseAI:main Dec 18, 2024
2 checks passed