Feature: Follow-up Prompts (#3280)
* Add migrations - add follow up prompts column to chatflow and chat message

* Add configuration tab for follow-up prompts

* Add follow up prompts functionality

* Pin zod version in components - this was causing a type error with structured outputs

* Generate follow up prompts if enabled, return them in the stream and response, and save them to the database

* Show follow up prompts after getting response

* Add google gen ai for generating follow up prompts and fix issues

* Add config for google gen ai and update model options

* Update follow-up prompts ui and styles

* Release/2.1.0 (#3204)

[email protected] release

* Chore/update flowise embed version to 2.0.0 (#3205)

* update flowise-embed version on lock file

* add agent messages to share chatbot

* Update pnpm-lock.yaml

* update flowise-embed version

* update flowise-embed to 1.3.9

* update embed version to 2.0

* Bugfix/CodeInterpreter E2B Credential (#3206)

* Base changes for ServerSide Events (instead of socket.io)

* lint fixes

* adding of interface and separate methods for streaming events

* lint

* first draft, handles both internal and external prediction end points.

* lint fixes

* additional internal end point for streaming and associated changes

* return streamresponse as true to build agent flow

* 1) JSON formatting for internal events
2) other fixes

* 1) convert internal event to metadata to maintain consistency with external response

* fix action and metadata streaming

* fix for error when agent flow is aborted

* prevent subflows from streaming and other code cleanup

* prevent streaming from enclosed tools

* add fix for preventing chaintool streaming

* update lock file

* add open when hidden to sse

* Streaming errors

* Streaming errors

* add fix for showing error message

* add code interpreter

* add artifacts to view message dialog

* Update pnpm-lock.yaml

* uncomment e2b credential

---------

Co-authored-by: Vinod Paidimarry <[email protected]>

* Release/2.1.0 (#3207)

* [email protected] release

* update [email protected]

* Bugfix/Add artifacts migration script to other database types (#3210)

add artifacts migration script to other database types

* Release/2.1.1 (#3213)

release @2.1.1

* Bugfix/Add header to allow sse on nginx (#3214)

add header to allow sse on nginx

* Bugfix/remove invalid markdown (#3219)

remove invalid markdown

* Correct "as" casing (#3216)

* Correct "as" casing

* Remove "version" line from docker compose file

* Update docker-compose.yml

---------

Co-authored-by: Henry Heng <[email protected]>

* chore: update unstructured API url and doc reference (#3224)

chore: update unstructured API url and doc reference

* Feature/add ability to specify dynamic metadata to jsonlines (#3238)

* add ability to specify dynamic metadata to jsonlines

* fix additional metadata

* Bugfix/Buffer Memory for Anthropic (#3242)

fix buffer memory

* Added env vars to ui and api URL  (#3141)

* feat: add environment vars to split application in different deployments for better scalability

* update: package.json

added start script ui

---------

Co-authored-by: patrick <[email protected]>

* Added 1-click deployment link for Alibaba Cloud.  (#3251)

* Added a link for Alibaba Cloud Deployment

* change service name

---------

Co-authored-by: yehan <[email protected]>

* Chore/Groq Llama3.2 (#3255)

* add gemini flash

* add gemini flash to vertex

* add gemini-1.5-flash-preview to vertex

* add azure gpt 4o

* add claude 3.5 sonnet

* add mistral nemo

* add groq llama3.1

* add gpt4o-mini to azure

* o1 mini

* add groq llama 3.2

* Bugfix/Prevent streaming of chatflow tool and chain tool (#3257)

prevent streaming of chatflow tool and chain tool

* Bugfix/Enable Custom Tool Optional Input Schema (#3258)

* prevent streaming of chatflow tool and chain tool

* enable optional input schema

* Bugfix/Searxng tool not working (#3263)

fix searxng tool not working

* LunaryAI automatic Thread and User tracking (#3233)

* Lunary Thread/User tracking

* Clean console logs

* Clean

* Remove commented lines

* Remove commented line

* feat: enable autofocus to the `new chatflow title` to improve usability (#3260)

This dialog has only one input and it is the primary one; there is no need for an extra click to set the title

* feat: save a new Chatflow when the `ENTER` key is pressed (#3261)

This simple event handler improves the usability of the UI by avoiding having to use the mouse, or having to tab twice and then hit Enter, to save a flow

* feat: save Chatflow title when the `ENTER` key is pressed or discard upon `ESC` is pressed (#3265)

This simple event handler improves the usability of the UI by avoiding having to use the mouse to save or discard title changes

* feat: enable autofocus to the `edit chatflow title` field to improve UI usability (#3264)

feat: enable autofocus to the `edit chatflow title` field to improve usability

The canvas header has only one input and it is the primary one; there is no need for an extra click to edit the title

* feat: add search keyboard shortcut based on the current platform (#3267)

* feat: highlight valid/invalid connection between nodes (#3266)

Change the inputs' background to green/red to hint at compatible connections, in addition to the `not-allowed` mouse cursor for incompatible connections

* Bugfix/add fixes for search of view header (#3271)

add fixes for search of view header

* fix: warning when passing a boolean to border property of a Card (#3275)

By default, MainCard wrappers like NodeCardWrapper and CardWrapper add a solid 1px border, but when the `MainCard.border` prop was set to `false`, the border prop was wrongly set to a boolean instead of a string

* feat: add shortcut text hint to the search field (#3269)

* feat: add shortcut text hint to the search field

* fix: search box width to fit the shortcut hint text

* fix: error when not running on Mac due to an undefined `os` variable

* fix: warning when a non-boolean value was used to set the `checked` prop of a SwitchInput component (#3276)

fix: warning when a non-boolean value was used to set the `checked` prop of a SwitchInput component

The problem was that the useEffect hook used the plain value without the validation applied in useState

* Bugfix/Throw error to prevent SSE from retrying (#3281)

throw error to prevent SSE from retrying

* Pin zod version in components - this was causing a type error with structured outputs

* Fix conflicts in pnpm lock

* fix ui changes for follow up prompts

* Fix button disable state in follow-up prompts configuration

* Fix follow-up prompts not showing up for agent flows

* Show follow up prompts if last message is apiMessage and follow up prompts are available

---------

Co-authored-by: Henry Heng <[email protected]>
Co-authored-by: Vinod Paidimarry <[email protected]>
Co-authored-by: Cross <[email protected]>
Co-authored-by: cragwolfe <[email protected]>
Co-authored-by: patrickreinan <[email protected]>
Co-authored-by: patrick <[email protected]>
Co-authored-by: yehan <[email protected]>
Co-authored-by: yehan <[email protected]>
Co-authored-by: Vincelwt <[email protected]>
Co-authored-by: Humberto Rodríguez A. <[email protected]>
Co-authored-by: Henry <[email protected]>
12 people authored Oct 4, 2024
1 parent 4908557 commit c9d8b87
Showing 25 changed files with 1,076 additions and 203 deletions.
2 changes: 1 addition & 1 deletion packages/components/package.json
@@ -119,7 +119,7 @@
"weaviate-ts-client": "^1.1.0",
"winston": "^3.9.0",
"ws": "^8.9.0",
"zod": "^3.22.4",
"zod": "3.22.4",
"zod-to-json-schema": "^3.21.4"
},
"devDependencies": {
22 changes: 22 additions & 0 deletions packages/components/src/Interface.ts
@@ -419,3 +419,25 @@ export interface IServerSideEventStreamer {
streamAbortEvent(chatId: string): void
streamEndEvent(chatId: string): void
}

export enum FollowUpPromptProvider {
ANTHROPIC = 'chatAnthropic',
AZURE_OPENAI = 'azureChatOpenAI',
GOOGLE_GENAI = 'chatGoogleGenerativeAI',
MISTRALAI = 'chatMistralAI',
OPENAI = 'chatOpenAI'
}

export type FollowUpPromptProviderConfig = {
[key in FollowUpPromptProvider]: {
credentialId: string
modelName: string
prompt: string
temperature: string
}
}

export type FollowUpPromptConfig = {
status: boolean
selectedProvider: FollowUpPromptProvider
} & FollowUpPromptProviderConfig
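
For reference, a hypothetical object satisfying these types might look like the sketch below. The mapped type requires an entry for every provider; the credential ID, model name, and prompt text are placeholders rather than values defined by this commit (the enum and types above are assumed to be in scope).

// Hypothetical example only; placeholder values, not part of this diff
const placeholderEntry = { credentialId: '', modelName: '', prompt: '', temperature: '0.7' }

const exampleFollowUpPromptConfig: FollowUpPromptConfig = {
    status: true,
    selectedProvider: FollowUpPromptProvider.OPENAI,
    [FollowUpPromptProvider.ANTHROPIC]: placeholderEntry,
    [FollowUpPromptProvider.AZURE_OPENAI]: placeholderEntry,
    [FollowUpPromptProvider.GOOGLE_GENAI]: placeholderEntry,
    [FollowUpPromptProvider.MISTRALAI]: placeholderEntry,
    [FollowUpPromptProvider.OPENAI]: {
        credentialId: 'openai-credential-id', // placeholder
        modelName: 'gpt-4o-mini', // placeholder
        prompt: 'Given the conversation so far: {history}\nSuggest three short follow-up questions the user might ask next.',
        temperature: '0.5'
    }
}

The {history} placeholder in the prompt is what generateFollowUpPrompts (next file) replaces with the assistant's last message.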
113 changes: 113 additions & 0 deletions packages/components/src/followUpPrompts.ts
@@ -0,0 +1,113 @@
import { FollowUpPromptConfig, FollowUpPromptProvider, ICommonObject } from './Interface'
import { getCredentialData } from './utils'
import { ChatAnthropic } from '@langchain/anthropic'
import { ChatGoogleGenerativeAI } from '@langchain/google-genai'
import { ChatMistralAI } from '@langchain/mistralai'
import { ChatOpenAI } from '@langchain/openai'
import { z } from 'zod'
import { PromptTemplate } from '@langchain/core/prompts'
import { StructuredOutputParser } from '@langchain/core/output_parsers'

const FollowUpPromptType = z
.object({
questions: z.array(z.string())
})
.describe('Generate Follow Up Prompts')

export const generateFollowUpPrompts = async (
followUpPromptsConfig: FollowUpPromptConfig,
apiMessageContent: string,
options: ICommonObject
) => {
if (followUpPromptsConfig) {
const providerConfig = followUpPromptsConfig[followUpPromptsConfig.selectedProvider]
const credentialId = providerConfig.credentialId as string
const credentialData = await getCredentialData(credentialId ?? '', options)
const followUpPromptsPrompt = providerConfig.prompt.replace('{history}', apiMessageContent)

switch (followUpPromptsConfig.selectedProvider) {
case FollowUpPromptProvider.ANTHROPIC: {
const llm = new ChatAnthropic({
apiKey: credentialData.anthropicApiKey,
model: providerConfig.modelName,
temperature: parseFloat(`${providerConfig.temperature}`)
})
const structuredLLM = llm.withStructuredOutput(FollowUpPromptType)
const structuredResponse = await structuredLLM.invoke(followUpPromptsPrompt)
return structuredResponse
}
case FollowUpPromptProvider.AZURE_OPENAI: {
const azureOpenAIApiKey = credentialData['azureOpenAIApiKey']
const azureOpenAIApiInstanceName = credentialData['azureOpenAIApiInstanceName']
const azureOpenAIApiDeploymentName = credentialData['azureOpenAIApiDeploymentName']
const azureOpenAIApiVersion = credentialData['azureOpenAIApiVersion']

const llm = new ChatOpenAI({
azureOpenAIApiKey,
azureOpenAIApiInstanceName,
azureOpenAIApiDeploymentName,
azureOpenAIApiVersion,
model: providerConfig.modelName,
temperature: parseFloat(`${providerConfig.temperature}`)
})
// use structured output parser because withStructuredOutput is not working
const parser = StructuredOutputParser.fromZodSchema(FollowUpPromptType)
const formatInstructions = parser.getFormatInstructions()
const prompt = PromptTemplate.fromTemplate(`
${providerConfig.prompt}
{format_instructions}
`)
const chain = prompt.pipe(llm).pipe(parser)
const structuredResponse = await chain.invoke({
history: apiMessageContent,
format_instructions: formatInstructions
})
return structuredResponse
}
case FollowUpPromptProvider.GOOGLE_GENAI: {
const llm = new ChatGoogleGenerativeAI({
apiKey: credentialData.googleGenerativeAPIKey,
model: providerConfig.modelName,
temperature: parseFloat(`${providerConfig.temperature}`)
})
// use structured output parser because withStructuredOutput is not working
const parser = StructuredOutputParser.fromZodSchema(FollowUpPromptType)
const formatInstructions = parser.getFormatInstructions()
const prompt = PromptTemplate.fromTemplate(`
${providerConfig.prompt}
{format_instructions}
`)
const chain = prompt.pipe(llm).pipe(parser)
const structuredResponse = await chain.invoke({
history: apiMessageContent,
format_instructions: formatInstructions
})
return structuredResponse
}
case FollowUpPromptProvider.MISTRALAI: {
const model = new ChatMistralAI({
apiKey: credentialData.mistralAIAPIKey,
model: providerConfig.modelName,
temperature: parseFloat(`${providerConfig.temperature}`)
})
const structuredLLM = model.withStructuredOutput(FollowUpPromptType)
const structuredResponse = await structuredLLM.invoke(followUpPromptsPrompt)
return structuredResponse
}
case FollowUpPromptProvider.OPENAI: {
const model = new ChatOpenAI({
apiKey: credentialData.openAIApiKey,
model: providerConfig.modelName,
temperature: parseFloat(`${providerConfig.temperature}`)
})
const structuredLLM = model.withStructuredOutput(FollowUpPromptType)
const structuredResponse = await structuredLLM.invoke(followUpPromptsPrompt)
return structuredResponse
}
}
} else {
return undefined
}
}
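
A minimal, hedged usage sketch of the exported helper follows; the wrapper function and argument names are illustrative, and the options object mirrors the fields the server passes in buildChatflow.ts further down (chatId, chatflowid, appDataSource, databaseEntities).

import { generateFollowUpPrompts } from 'flowise-components'
import type { FollowUpPromptConfig, ICommonObject } from 'flowise-components'

// Hypothetical helper: storedConfig is the JSON string persisted on the chatflow,
// answer is the assistant's reply, options carries chatId/chatflowid/datasource fields
const suggestFollowUps = async (storedConfig: string, answer: string, options: ICommonObject): Promise<string[]> => {
    const config: FollowUpPromptConfig = JSON.parse(storedConfig)
    const generated = await generateFollowUpPrompts(config, answer, options)
    return generated?.questions ?? []
}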
1 change: 1 addition & 0 deletions packages/components/src/index.ts
@@ -9,3 +9,4 @@ export * from './utils'
export * from './speechToText'
export * from './storageUtils'
export * from './handler'
export * from './followUpPrompts'
2 changes: 2 additions & 0 deletions packages/server/src/Interface.ts
@@ -27,6 +27,7 @@ export interface IChatFlow {
apikeyid?: string
analytic?: string
chatbotConfig?: string
followUpPrompts?: string
apiConfig?: string
category?: string
type?: ChatflowType
@@ -50,6 +51,7 @@ export interface IChatMessage {
createdDate: Date
leadEmail?: string
action?: string | null
followUpPrompts?: string
}

export interface IChatMessageFeedback {
3 changes: 3 additions & 0 deletions packages/server/src/database/entities/ChatFlow.ts
@@ -34,6 +34,9 @@ export class ChatFlow implements IChatFlow {
@Column({ nullable: true, type: 'text' })
speechToText?: string

@Column({ nullable: true, type: 'text' })
followUpPrompts?: string

@Column({ nullable: true, type: 'text' })
category?: string

3 changes: 3 additions & 0 deletions packages/server/src/database/entities/ChatMessage.ts
@@ -56,4 +56,7 @@ export class ChatMessage implements IChatMessage {

@Column({ nullable: true, type: 'text' })
leadEmail?: string

@Column({ nullable: true, type: 'text' })
followUpPrompts?: string
}
14 changes: 14 additions & 0 deletions packages/server/src/database/migrations/mariadb/1726666318346-AddFollowUpPrompts.ts
@@ -0,0 +1,14 @@
import { MigrationInterface, QueryRunner } from 'typeorm'

export class AddFollowUpPrompts1726666318346 implements MigrationInterface {
public async up(queryRunner: QueryRunner): Promise<void> {
const columnExistsInChatflow = await queryRunner.hasColumn('chat_flow', 'followUpPrompts')
if (!columnExistsInChatflow) queryRunner.query(`ALTER TABLE \`chat_flow\` ADD COLUMN \`followUpPrompts\` TEXT;`)
const columnExistsInChatMessage = await queryRunner.hasColumn('chat_message', 'followUpPrompts')
if (!columnExistsInChatMessage) queryRunner.query(`ALTER TABLE \`chat_message\` ADD COLUMN \`followUpPrompts\` TEXT;`)
}

public async down(queryRunner: QueryRunner): Promise<void> {
await queryRunner.query(`ALTER TABLE \`chat_flow\` DROP COLUMN \`followUpPrompts\`;`)
}
}
4 changes: 3 additions & 1 deletion packages/server/src/database/migrations/mariadb/index.ts
@@ -25,6 +25,7 @@ import { AddActionToChatMessage1721078251523 } from './1721078251523-AddActionTo
import { LongTextColumn1722301395521 } from './1722301395521-LongTextColumn'
import { AddCustomTemplate1725629836652 } from './1725629836652-AddCustomTemplate'
import { AddArtifactsToChatMessage1726156258465 } from './1726156258465-AddArtifactsToChatMessage'
import { AddFollowUpPrompts1726666318346 } from './1726666318346-AddFollowUpPrompts'

export const mariadbMigrations = [
Init1693840429259,
@@ -53,5 +54,6 @@ export const mariadbMigrations = [
AddActionToChatMessage1721078251523,
LongTextColumn1722301395521,
AddCustomTemplate1725629836652,
AddArtifactsToChatMessage1726156258465
AddArtifactsToChatMessage1726156258465,
AddFollowUpPrompts1726666318346
]
15 changes: 15 additions & 0 deletions packages/server/src/database/migrations/mysql/1726666302024-AddFollowUpPrompts.ts
@@ -0,0 +1,15 @@
import { MigrationInterface, QueryRunner } from 'typeorm'

export class AddFollowUpPrompts1726666302024 implements MigrationInterface {
public async up(queryRunner: QueryRunner): Promise<void> {
const columnExistsInChatflow = await queryRunner.hasColumn('chat_flow', 'followUpPrompts')
if (!columnExistsInChatflow) queryRunner.query(`ALTER TABLE \`chat_flow\` ADD COLUMN \`followUpPrompts\` TEXT;`)
const columnExistsInChatMessage = await queryRunner.hasColumn('chat_message', 'followUpPrompts')
if (!columnExistsInChatMessage) queryRunner.query(`ALTER TABLE \`chat_message\` ADD COLUMN \`followUpPrompts\` TEXT;`)
}

public async down(queryRunner: QueryRunner): Promise<void> {
await queryRunner.query(`ALTER TABLE \`chat_flow\` DROP COLUMN \`followUpPrompts\`;`)
await queryRunner.query(`ALTER TABLE \`chat_message\` DROP COLUMN \`followUpPrompts\`;`)
}
}
4 changes: 3 additions & 1 deletion packages/server/src/database/migrations/mysql/index.ts
@@ -26,6 +26,7 @@ import { AddActionToChatMessage1721078251523 } from './1721078251523-AddActionTo
import { LongTextColumn1722301395521 } from './1722301395521-LongTextColumn'
import { AddCustomTemplate1725629836652 } from './1725629836652-AddCustomTemplate'
import { AddArtifactsToChatMessage1726156258465 } from './1726156258465-AddArtifactsToChatMessage'
import { AddFollowUpPrompts1726666302024 } from './1726666302024-AddFollowUpPrompts'

export const mysqlMigrations = [
Init1693840429259,
@@ -55,5 +56,6 @@ export const mysqlMigrations = [
AddActionToChatMessage1721078251523,
LongTextColumn1722301395521,
AddCustomTemplate1725629836652,
AddArtifactsToChatMessage1726156258465
AddArtifactsToChatMessage1726156258465,
AddFollowUpPrompts1726666302024
]
13 changes: 13 additions & 0 deletions packages/server/src/database/migrations/postgres/1726666309552-AddFollowUpPrompts.ts
@@ -0,0 +1,13 @@
import { MigrationInterface, QueryRunner } from 'typeorm'

export class AddFollowUpPrompts1726666309552 implements MigrationInterface {
public async up(queryRunner: QueryRunner): Promise<void> {
await queryRunner.query(`ALTER TABLE "chat_flow" ADD COLUMN IF NOT EXISTS "followUpPrompts" TEXT;`)
await queryRunner.query(`ALTER TABLE "chat_message" ADD COLUMN IF NOT EXISTS "followUpPrompts" TEXT;`)
}

public async down(queryRunner: QueryRunner): Promise<void> {
await queryRunner.query(`ALTER TABLE "chat_flow" DROP COLUMN "followUpPrompts";`)
await queryRunner.query(`ALTER TABLE "chat_message" DROP COLUMN "followUpPrompts";`)
}
}
4 changes: 3 additions & 1 deletion packages/server/src/database/migrations/postgres/index.ts
Original file line number Diff line number Diff line change
@@ -26,6 +26,7 @@ import { AddApiKey1720230151480 } from './1720230151480-AddApiKey'
import { AddActionToChatMessage1721078251523 } from './1721078251523-AddActionToChatMessage'
import { AddCustomTemplate1725629836652 } from './1725629836652-AddCustomTemplate'
import { AddArtifactsToChatMessage1726156258465 } from './1726156258465-AddArtifactsToChatMessage'
import { AddFollowUpPrompts1726666309552 } from './1726666309552-AddFollowUpPrompts'

export const postgresMigrations = [
Init1693891895163,
@@ -55,5 +56,6 @@ export const postgresMigrations = [
AddApiKey1720230151480,
AddActionToChatMessage1721078251523,
AddCustomTemplate1725629836652,
AddArtifactsToChatMessage1726156258465
AddArtifactsToChatMessage1726156258465,
AddFollowUpPrompts1726666309552
]
13 changes: 13 additions & 0 deletions packages/server/src/database/migrations/sqlite/1726666294213-AddFollowUpPrompts.ts
@@ -0,0 +1,13 @@
import { MigrationInterface, QueryRunner } from 'typeorm'

export class AddFollowUpPrompts1726666294213 implements MigrationInterface {
public async up(queryRunner: QueryRunner): Promise<void> {
await queryRunner.query(`ALTER TABLE "chat_flow" ADD COLUMN "followUpPrompts" TEXT;`)
await queryRunner.query(`ALTER TABLE "chat_message" ADD COLUMN "followUpPrompts" TEXT;`)
}

public async down(queryRunner: QueryRunner): Promise<void> {
await queryRunner.query(`ALTER TABLE "chat_flow" DROP COLUMN "followUpPrompts";`)
await queryRunner.query(`ALTER TABLE "chat_message" DROP COLUMN "followUpPrompts";`)
}
}
4 changes: 3 additions & 1 deletion packages/server/src/database/migrations/sqlite/index.ts
Original file line number Diff line number Diff line change
@@ -25,6 +25,7 @@ import { AddApiKey1720230151480 } from './1720230151480-AddApiKey'
import { AddActionToChatMessage1721078251523 } from './1721078251523-AddActionToChatMessage'
import { AddArtifactsToChatMessage1726156258465 } from './1726156258465-AddArtifactsToChatMessage'
import { AddCustomTemplate1725629836652 } from './1725629836652-AddCustomTemplate'
import { AddFollowUpPrompts1726666294213 } from './1726666294213-AddFollowUpPrompts'

export const sqliteMigrations = [
Init1693835579790,
@@ -53,5 +54,6 @@ export const sqliteMigrations = [
AddApiKey1720230151480,
AddActionToChatMessage1721078251523,
AddArtifactsToChatMessage1726156258465,
AddCustomTemplate1725629836652
AddCustomTemplate1725629836652,
AddFollowUpPrompts1726666294213
]
3 changes: 3 additions & 0 deletions packages/server/src/utils/SSEStreamer.ts
@@ -211,6 +211,9 @@ export class SSEStreamer implements IServerSideEventStreamer {
if (apiResponse.memoryType) {
metadataJson['memoryType'] = apiResponse.memoryType
}
if (apiResponse.followUpPrompts) {
metadataJson['followUpPrompts'] = JSON.parse(apiResponse.followUpPrompts)
}
if (Object.keys(metadataJson).length > 0) {
this.streamCustomEvent(chatId, 'metadata', metadataJson)
}
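
For illustration, when follow-up prompts are enabled the streamed metadata custom event gains one extra field. Assuming apiResponse.followUpPrompts reaches the streamer as the serialized questions array generated in buildChatflow.ts below, the parsed value would be the list of suggestions; the question strings here are placeholders, and the other metadata fields are unchanged by this commit.

// Hypothetical shape of metadataJson['followUpPrompts'] after the JSON.parse above,
// assuming the field holds the serialized questions array
const followUpPrompts: string[] = [
    'Can you give a concrete example?',
    'How do I enable this for my chatflow?',
    'Which model providers are supported?'
]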
27 changes: 27 additions & 0 deletions packages/server/src/utils/buildChatflow.ts
@@ -8,6 +8,7 @@ import {
addArrayFilesToStorage,
mapMimeTypeToInputField,
mapExtToInputField,
generateFollowUpPrompts,
IServerSideEventStreamer
} from 'flowise-components'
import { StatusCodes } from 'http-status-codes'
@@ -452,6 +453,18 @@ export const utilBuildChatflow = async (req: Request, isInternal: boolean = fals
if (result?.usedTools) apiMessage.usedTools = JSON.stringify(result.usedTools)
if (result?.fileAnnotations) apiMessage.fileAnnotations = JSON.stringify(result.fileAnnotations)
if (result?.artifacts) apiMessage.artifacts = JSON.stringify(result.artifacts)
if (chatflow.followUpPrompts) {
const followUpPromptsConfig = JSON.parse(chatflow.followUpPrompts)
const followUpPrompts = await generateFollowUpPrompts(followUpPromptsConfig, apiMessage.content, {
chatId,
chatflowid,
appDataSource: appServer.AppDataSource,
databaseEntities
})
if (followUpPrompts?.questions) {
apiMessage.followUpPrompts = JSON.stringify(followUpPrompts.questions)
}
}

const chatMessage = await utilAddChatMessage(apiMessage)

@@ -470,6 +483,7 @@
result.question = incomingInput.question
result.chatId = chatId
result.chatMessageId = chatMessage?.id
result.followUpPrompts = JSON.stringify(apiMessage.followUpPrompts)
result.isStreamValid = isStreamValid

if (sessionId) result.sessionId = sessionId
@@ -543,6 +557,18 @@
if (usedTools?.length) apiMessage.usedTools = JSON.stringify(usedTools)
if (agentReasoning?.length) apiMessage.agentReasoning = JSON.stringify(agentReasoning)
if (finalAction && Object.keys(finalAction).length) apiMessage.action = JSON.stringify(finalAction)
if (agentflow.followUpPrompts) {
const followUpPromptsConfig = JSON.parse(agentflow.followUpPrompts)
const generatedFollowUpPrompts = await generateFollowUpPrompts(followUpPromptsConfig, apiMessage.content, {
chatId,
chatflowid: agentflow.id,
appDataSource: appServer.AppDataSource,
databaseEntities
})
if (generatedFollowUpPrompts?.questions) {
apiMessage.followUpPrompts = JSON.stringify(generatedFollowUpPrompts.questions)
}
}
const chatMessage = await utilAddChatMessage(apiMessage)

await appServer.telemetry.sendTelemetry('agentflow_prediction_sent', {
@@ -591,6 +617,7 @@
if (memoryType) result.memoryType = memoryType
if (agentReasoning?.length) result.agentReasoning = agentReasoning
if (finalAction && Object.keys(finalAction).length) result.action = finalAction
result.followUpPrompts = JSON.stringify(apiMessage.followUpPrompts)

return result
}
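
For a non-streaming caller, a hedged sketch of reading the new field from the prediction response: the endpoint path follows the existing prediction API and is not changed by this commit, and the exact serialization of followUpPrompts is an assumption, so the parsing below is defensive.

// Hypothetical consumer; endpoint path and field handling are assumptions, not part of this diff
async function predictWithFollowUps(chatflowId: string, question: string): Promise<{ text: string; followUps: string[] }> {
    const res = await fetch(`/api/v1/prediction/${chatflowId}`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ question })
    })
    const data = await res.json()
    let followUps: string[] = []
    if (data.followUpPrompts) {
        const parsed = JSON.parse(data.followUpPrompts)
        // depending on how the server serialized the questions, one more parse may be needed
        followUps = typeof parsed === 'string' ? JSON.parse(parsed) : parsed
    }
    return { text: data.text, followUps }
}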
1 change: 1 addition & 0 deletions packages/ui/src/assets/images/anthropic.svg
1 change: 1 addition & 0 deletions packages/ui/src/assets/images/azure_openai.svg