[Search] [Playground] Improve follow up question flow #189848
saarikabhasi merged 7 commits into elastic:main from
Conversation
```ts
});
data.appendMessageAnnotation({
  type: 'search_query',
  question,
```
This needs a tweak - the `question` const here is always the last message submitted, which isn't the question used for the search. We use the LLM to transform the question when there's more than one message.
You want to modify the chain to capture the transformed question. The runnable sequence runs each function in succession.
For example:
```ts
const answerChain = RunnableSequence.from([
  {
    context: RunnableSequence.from([(input) => input.question, retrievalChain]),
    question: (input) => input.question,
  },
  RunnableLambda.from((inputs) => {
    data.appendMessageAnnotation({
      type: 'search_query',
      question: inputs.question,
    });
    return inputs;
  }),
  RunnableLambda.from(clipContext(this.options?.rag?.inputTokensLimit, prompt, data)),
  RunnableLambda.from(registerContextTokenCounts(data)),
  prompt,
  this.options.model.withConfig({ metadata: { type: 'question_answer_qa' } }),
]);
```
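The pass-through step in the suggestion above can be illustrated without LangChain itself. Below is a minimal sketch in plain TypeScript: the chain is a list of steps where each step receives the previous step's output, and a "tap" step records the transformed question before handing its inputs on unchanged. The `annotations` array is a hypothetical stand-in for `data.appendMessageAnnotation`, and the literal `'retrieved docs'` context is a placeholder for the real retrieval chain.

```typescript
// Minimal sketch of the pass-through ("tap") pattern, without the actual
// RunnableSequence / RunnableLambda classes.
interface ChainInputs {
  question: string;
  context?: string;
}

// Hypothetical stand-in for data.appendMessageAnnotation
const annotations: Array<{ type: string; question: string }> = [];

const steps: Array<(inputs: ChainInputs) => ChainInputs> = [
  // Map step: build { context, question }, like the object literal in the chain.
  (inputs) => ({ question: inputs.question, context: 'retrieved docs' }),
  // Tap step: record the search query, then pass the inputs through unchanged,
  // mirroring the RunnableLambda that returns `inputs`.
  (inputs) => {
    annotations.push({ type: 'search_query', question: inputs.question });
    return inputs;
  },
];

// Run every step in succession, feeding each step the previous output.
function runChain(input: ChainInputs): ChainInputs {
  return steps.reduce((acc, step) => step(acc), input);
}
```

Because the tap step runs after the question has been produced by the earlier step, it always sees the value that will actually be used downstream, which is the point of the suggested fix.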
```tsx
defaultMessage='Searching "{question}"'
values={{ question: inputTokens.searchQuery }}
/>
{` `}
```
Replaced by adding a space at the end of the sentence in 4ab7ae9. With this I think we can avoid adding an extra space on the next line and instead include it as part of the sentence?
The challenge here is if the translation string starts to remove the space.
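That concern can be sketched in a few lines: a trailing space kept inside the translatable message can silently disappear if the translation tooling (or a translator) trims the string, whereas a space appended outside the translated text survives. `simulateTrim` below is a hypothetical stand-in for such a trimming pass, not a real i18n API.

```typescript
// Hypothetical trimming pass that translation tooling might apply.
const simulateTrim = (msg: string): string => msg.trim();

// Space kept inside the translatable string: fragile.
const messageWithTrailingSpace = 'Searching "{question}" ';
const trimmed = simulateTrim(messageWithTrailingSpace); // trailing space is lost

// Space appended outside the translated string: survives trimming.
const rendered = `${trimmed} `;
```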
```ts
  question: (input) => input.question,
},
RunnableLambda.from((inputs) => {
  console.log("inputs.question", inputs.question)
```
```diff
 const chatRequest: ChatRequest = {
-  messages: messagesRef.current,
+  messages: messagesRef.current.slice(0, messagesRef.current.length - 1),
```
Could you double check what happens with errors?
Example:
- send a message
- get AI response
- update the Gen AI API key to be invalid
- send a message
- click re-generate

I think you need to remove recent system/AI messages up to the last human message. This means discarding all system or assistant messages after the last human message.
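The suggestion above can be sketched as a small helper. This is a minimal illustration with assumed message shapes (the Playground's actual types may differ): it finds the last human message and drops everything after it, so a failed assistant response (e.g. after an invalid API key) is not kept in the history when regenerating.

```typescript
// Assumed message shape for illustration only.
interface Message {
  role: 'human' | 'assistant' | 'system';
  content: string;
}

// Discard all system/assistant messages that follow the last human message.
function trimAfterLastHuman(messages: Message[]): Message[] {
  const lastHuman = messages.map((m) => m.role).lastIndexOf('human');
  return lastHuman === -1 ? [] : messages.slice(0, lastHuman + 1);
}
```

Unlike `slice(0, length - 1)`, which removes exactly one trailing message, this handles the error case where several system or assistant messages may have accumulated after the user's last turn.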
```tsx
id="xpack.searchPlayground.chat.message.assistant.retrievalDocs"
defaultMessage="Grounding answer based on"
id="xpack.searchPlayground.chat.message.assistant.searchingQuestion"
defaultMessage='Searching "{question}"'
```
@leemthompo what do you think about this message? Should this be something like "Searching for {question} in data sources" instead?
@joemcelroy I could see where this just says "Searching..." without printing the rewritten query at all, might be cleaner
otherwise, yeah, "Searching for" makes sense. Don't know if you need "in data sources" as the sentence will get pretty long for longer queries.
I think the reason is that we want to provide visibility into the query that's actually been used for the search. The LLM here has transformed the conversation into a query, so it's useful to know.
@joemcelroy might be nice to have two bullets like I mentioned to first establish the transformed query
Let me pick this up with Julian. I'll approve for now so as not to block Saarika.
@elasticmachine merge upstream
💛 Build succeeded, but was flaky
Summary
Adds the query used for searching. This improves visibility for the user into the query used for follow-up questions.