[Security solution] Fix OpenAI token reporting #169156
stephmilovic merged 5 commits into elastic:main
Conversation
Pinging @elastic/security-threat-hunting (Team:Threat Hunting)
Pinging @elastic/security-solution (Team: SecuritySolution)
💚 Build Succeeded
```ts
// if not langchain, call execute action directly and return the response:
if (!request.body.assistantLangChain) {
  const result = await executeAction({ actions, request, connectorId });
  return response.ok({
    body: result,
  });
}
```
Did assistantLangChain need to be pushed server side for this fix? @andrew-goldstein intentionally kept the split in logic between calling the actions framework, and the new langchain implementation at the client API layer for separation of concerns. There's no intent to continue supporting an actions 'pass through' once this new API is validated (at least AFAIK), so unless this is somehow required for the fix I'd be hesitant to push this logic into this route.
Looks like this is going to be necessary for the upcoming streaming work, so all good! 👍
Yes, I decided to move it now because I wanted to parse the response to continue returning a string from this endpoint, and I knew I'd be moving it for streaming anyway.
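A minimal sketch of what that server-side parsing might look like. Only `executeAction` and `assistantLangChain` appear in the diff above; the `ActionResult` shape and the `getMessageFromResult` helper below are hypothetical illustrations, not the actual Kibana implementation:

```typescript
// Hypothetical shape of the actions-framework result: the executor sees the
// full { message, usage } payload (so it can report token metrics), while the
// route extracts just the message string to keep the endpoint's response shape.
interface ActionResult {
  status: string;
  data?: { message: string; usage?: Record<string, number> };
}

// Hypothetical helper: pull the display string out of the executor result,
// failing loudly if the action did not succeed.
const getMessageFromResult = (result: ActionResult): string => {
  if (result.status !== "ok" || !result.data) {
    throw new Error(`action failed with status: ${result.status}`);
  }
  return result.data.message;
};
```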
spong left a comment:
Checked out, tested locally, and code reviewed. Seeing token counts for OpenAI connectors in the event log, and the langchain code paths appear to be functioning as expected -- LGTM! 👍
💔 All backports failed
Manual backport: to create the backport manually, refer to the Backport tool documentation.
💚 All backports created successfully
Note: successful backport PRs will be merged automatically after passing CI. Questions? Please refer to the Backport tool documentation.
(cherry picked from commit 8284398)
# Conflicts:
#   x-pack/packages/kbn-elastic-assistant/impl/assistant/api.test.tsx
Backport: this will backport the following commits from `main` to `8.11`: [[Security solution] Fix OpenAI token reporting (#169156)](#169156). Questions? Please refer to the [Backport tool documentation](https://github.com/sqren/backport).
Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
The OpenAI token reporting inadvertently got broken for the security assistant in this PR. When I implemented the `invokeAI` method, I returned the message as a string. However, I did not account for the fact that the action executor looks for the usage object to report those metrics. Instead of just a string, `invokeAI` should return an object like `{ message: string; usage: UsageObject }`.
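The before/after return shapes can be sketched as follows. This is an illustrative sketch, not the Kibana source: the `UsageObject` field names follow OpenAI's standard `usage` payload, and the raw-response shapes are assumptions:

```typescript
// Token-usage payload; field names mirror the OpenAI API's `usage` object.
interface UsageObject {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
}

interface InvokeAIResult {
  message: string;
  usage: UsageObject;
}

// Before (broken): only the message string was returned, so the action
// executor never saw the usage object and token metrics were lost.
const invokeAIBefore = (raw: {
  choices: { message: { content: string } }[];
}): string => raw.choices[0].message.content;

// After (fixed): the usage object is passed through alongside the message,
// letting the executor report token counts to the event log.
const invokeAIAfter = (raw: {
  choices: { message: { content: string } }[];
  usage: UsageObject;
}): InvokeAIResult => ({
  message: raw.choices[0].message.content,
  usage: raw.usage,
});
```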