[Inference] Inference CLI client #214691
Conversation
Pinging @elastic/obs-ai-assistant (Team:Obs AI Assistant)
```diff
- * Base class for all inference API errors.
- */
-export class InferenceTaskError<
+const InferenceTaskError = ServerSentEventError;
```
@pgayvallet WDYT here? Not very proud of this, but:
- translating InferenceTaskError from and to ServerSentEventError everywhere is a lot of work
- they're basically the same
- not aliasing InferenceTaskError to ServerSentEventError is also a little weird, because ServerSentEventError is essentially an implementation detail specific to SSE.
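For reference, the alias pattern under discussion can be sketched roughly like this. The `ServerSentEventError` body below is a hypothetical stand-in for illustration only, not the real class; the point is aliasing both the value and the type so callers keep using `InferenceTaskError` unchanged:

```typescript
// Hypothetical stand-in for the real ServerSentEventError; only the
// aliasing pattern below reflects the change being discussed.
class ServerSentEventError<TCode extends string = string> extends Error {
  constructor(public readonly code: TCode, message: string) {
    super(message);
  }
}

// Alias the value (for `new` / `instanceof`) and the type separately,
// so `InferenceTaskError` keeps working in both positions.
const InferenceTaskError = ServerSentEventError;
type InferenceTaskError<TCode extends string = string> =
  ServerSentEventError<TCode>;

const err = new InferenceTaskError('requestError', 'boom');
console.log(err instanceof ServerSentEventError); // true
```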
💔 Build Failed
> that is exposed from the `@kbn/kibana-api-cli` package. It automatically selects a connector if available. Usage:
"It automatically selects a connector if available"
Is it possibly to specify a connector? Say I have connectors from OpenAI and Anthropic in kibana.dev.yml and I want to switch between them.
Yeah, the option is there, but I haven't added it as a flag yet because I want to figure out some stuff around how to configure flags first. So it'll come later.
After running the recipe a couple of times and having to navigate to the bottom-most connector every time, I'd really appreciate a `--connector` option.
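A hedged sketch of what such a flag could resolve to; the `Connector` shape and the selection logic here are assumptions for illustration, not the actual `kbn-inference-cli` implementation:

```typescript
// Hypothetical connector shape; the real one lives in the Kibana
// actions/inference plugins and has more fields.
interface Connector {
  id: string;
  name: string;
}

// Honor an explicit `--connector <id>` if given, otherwise fall back
// to the first available connector (mirroring the current behavior
// the reviewers describe).
function selectConnector(
  connectors: Connector[],
  preferredId?: string
): Connector {
  if (preferredId) {
    const match = connectors.find((c) => c.id === preferredId);
    if (!match) {
      throw new Error(`Connector "${preferredId}" not found`);
    }
    return match;
  }
  const [first] = connectors;
  if (!first) {
    throw new Error('No connectors available');
  }
  return first;
}
```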
```ts
  signal: new AbortController().signal,
});

const response = await inferenceClient.output({
```
There's no documentation/example for the chatComplete endpoint. Can you add that, and also a description of when to use one over the other?
I assume chatComplete allows me to pass in arbitrary functions, whereas output is hardcoded to a single function (structuredOutput).
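To illustrate the distinction being asked about, here is a sketch with mocked, simplified signatures. These shapes are assumptions for illustration only, not the inference plugin's actual API: `chatComplete` is modeled as free-form conversation, while `output` is modeled as a single structured-output task:

```typescript
// Mocked shapes (assumptions, not the real plugin API): chatComplete
// takes a message history and returns free-form content; output takes
// a single input plus a schema describing the structured result.
interface ChatCompleteParams {
  messages: Array<{ role: 'user' | 'assistant'; content: string }>;
}

interface OutputParams {
  input: string;
  schema: { type: 'object'; properties: Record<string, unknown> };
}

const mockInferenceClient = {
  chatComplete: async ({ messages }: ChatCompleteParams) => ({
    // A real client would call the connector's LLM here.
    content: `echo: ${messages[messages.length - 1].content}`,
  }),
  output: async ({ input }: OutputParams) => ({
    // A real client would return a value conforming to `schema`.
    output: { summary: input.slice(0, 10) },
  }),
};
```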
> Running a recipe:
>
> ```
> $ yarn run ts-node x-pack/solutions/observability/packages/kbn-genai-cli/recipes/hello_world.ts
> ```
Add `ts-node --transpileOnly` if you want to bypass compile errors:

```diff
- $ yarn run ts-node x-pack/solutions/observability/packages/kbn-genai-cli/recipes/hello_world.ts
+ $ yarn run ts-node --transpileOnly x-pack/solutions/observability/packages/kbn-genai-cli/recipes/hello_world.ts
```
I was getting a silly error that shouldn't prevent the script from executing (I need to use the enum, I guess), but TypeScript stalls on compile errors.
Perhaps wrap this in a script: `node scripts/inference-client x-pack/solutions/observability/packages/kbn-genai-cli/recipes/hello_world.ts`
For the latter, I'm not a fan of using scripts that way, tbh. I have a function defined in my zshrc that prepends the babel-register stuff to the called script; that's probably a better way to do it, but I also don't want to be too prescriptive here.
```ts
const output = await inferenceClient.output({});

log.info(output);
});
```
This example seems pretty short. Intentional?
Yes, there are more verbose examples in `kbn-inference-cli`.
```ts
});

log.info(response);
});
```
Thanks for adding this! Now also add one for chatComplete :D
sorenlouv left a comment:
I'd like an example for the chatComplete method, and the ability to specify a connector via the CLI: `--connector <connector>`.
Other than that it lgtm!
Friendly reminder: Looks like this PR hasn't been backported yet.
Exposes an Inference (plugin) API client for scripts that mimics the `chatComplete` and `output` APIs available on its start contract. It depends on the KibanaClient that is exposed from the `@kbn/kibana-api-cli` package. It automatically selects a connector if available.

Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
💚 All backports created successfully
Note: Successful backport PRs will be merged automatically after passing CI. Questions? Please refer to the Backport tool documentation.
(cherry picked from commit 70f1880; conflicts in `.github/CODEOWNERS`)
# Backport

This will backport the following commits from `main` to `8.x`:

- [Inference] Inference CLI client (#214691)

### Questions?

Please refer to the [Backport tool documentation](https://github.com/sorenlouv/backport)