[Automatic Import ] Enable inference connector for Auto Import #206111
Merged
bhapas merged 7 commits into elastic:main on Jan 9, 2025
Conversation
Contributor
Pinging @elastic/security-solution (Team: SecuritySolution)

Contributor
Pinging @elastic/security-scalability (Team:Security-Scalability)
stephmilovic reviewed Jan 9, 2025
...tform/packages/shared/kbn-elastic-assistant/impl/connectorland/use_load_connectors/index.tsx
stephmilovic approved these changes Jan 9, 2025
haetamoudi approved these changes Jan 9, 2025
Contributor
haetamoudi left a comment: LGTM, I have tested the flow with automatic import
This reverts commit e1cbb58.
Contributor
Starting backport for target branches: 8.x https://github.com/elastic/kibana/actions/runs/12699245739
Contributor
💚 Build Succeeded
Metrics [docs]: async chunks, page load bundle, ESLint disabled line counts
cc @bhapas
kibanamachine pushed a commit to kibanamachine/kibana that referenced this pull request on Jan 9, 2025
…ic#206111)

## Summary

Enables the new inference connector in Automatic Import.

This PR also fixes the use of `inferenceEnabled` from `useAssistantContext`, since it is not available in Auto Import.

## To test

1. Update the value for `inferenceConnectorOn` to `true` in `x-pack/platform/plugins/shared/stack_connectors/common/experimental_features.ts`
2. Create an inference connector using [OpenAI creds](https://p.elstc.co/paste/36VivuC+#TnP7-Z7wBKDUg8fQ/lTycSCdwUxEEbHcyQ/Q0i3oEmO). Configure the inference endpoint for completion and name the endpoint `openai-completion-preconfig`
3. Now that the inference endpoint is created, add a [preconfigured connector](https://p.elstc.co/paste/tFWF3LSA#0thBRW05e6KSSkLCDjQiH8GkECQySBiHm6zRMCUThlf) with the same credentials.
4. Select the preconfigured connector in Automatic Import.
5. Test the Auto Import flow works.

---------

Co-authored-by: Steph Milovic <stephanie.milovic@elastic.co>

(cherry picked from commit 668d88e)
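The `inferenceEnabled` fix described above can be pictured with a small sketch. All names here are hypothetical (the real change lives in `use_load_connectors/index.tsx` and differs in detail): the point is that the flag is passed in explicitly instead of being read from `useAssistantContext`, which is not mounted in Auto Import.

```typescript
// Hypothetical sketch, not the actual Kibana implementation: take the
// experimental flag as an explicit argument rather than calling
// useAssistantContext, which Auto Import does not provide.
interface Connector {
  id: string;
  actionTypeId: string;
  name: string;
}

export function filterAvailableConnectors(
  connectors: Connector[],
  inferenceEnabled: boolean
): Connector[] {
  // Hide `.inference` connectors unless the experimental flag is on.
  return inferenceEnabled
    ? connectors
    : connectors.filter((c) => c.actionTypeId !== '.inference');
}
```

Passing the flag as a parameter keeps the hook usable from both the Security Assistant (where the context exists) and Auto Import (where it does not).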
Contributor
💚 All backports created successfully
Note: Successful backport PRs will be merged automatically after passing CI. Questions? Please refer to the Backport tool documentation
kibanamachine added a commit that referenced this pull request on Jan 9, 2025
…206111) (#206146)

# Backport

This will backport the following commits from `main` to `8.x`:
- [[Automatic Import ] Enable inference connector for Auto Import (#206111)](#206111)

### Questions?
Please refer to the [Backport tool documentation](https://github.com/sqren/backport)

Co-authored-by: Bharat Pasupula <123897612+bhapas@users.noreply.github.com>
viduni94 pushed a commit to viduni94/kibana that referenced this pull request on Jan 23, 2025
…ic#206111) (commit message identical to the cherry-picked commit from Jan 9, 2025)
Summary

Enables the new inference connector in Automatic Import.

This PR also fixes the use of `inferenceEnabled` from `useAssistantContext`, since it is not available in Auto Import.

To test

1. Update the value for `inferenceConnectorOn` to `true` in `x-pack/platform/plugins/shared/stack_connectors/common/experimental_features.ts`
2. Create an inference connector using [OpenAI creds](https://p.elstc.co/paste/36VivuC+#TnP7-Z7wBKDUg8fQ/lTycSCdwUxEEbHcyQ/Q0i3oEmO). Configure the inference endpoint for completion and name the endpoint `openai-completion-preconfig`
3. Now that the inference endpoint is created, add a [preconfigured connector](https://p.elstc.co/paste/tFWF3LSA#0thBRW05e6KSSkLCDjQiH8GkECQySBiHm6zRMCUThlf) with the same credentials.
4. Select the preconfigured connector in Automatic Import.
5. Test the Auto Import flow works.
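Step 3 above refers to a Kibana preconfigured connector, which is declared in `kibana.yml` rather than created through the UI. A minimal sketch of such an entry follows; the connector id, name, and exact config keys here are illustrative assumptions (the linked paste holds the actual values used for testing), so consult the Kibana connector docs for the real schema:

```yaml
# kibana.yml — illustrative sketch of a preconfigured OpenAI connector.
# The id, name, and values below are assumptions for demonstration only.
xpack.actions.preconfigured:
  my-openai-completion:
    name: OpenAI completion (preconfigured)
    actionTypeId: .gen-ai
    config:
      apiProvider: OpenAI
      apiUrl: https://api.openai.com/v1/chat/completions
    secrets:
      apiKey: <your-openai-api-key>
```

Preconfigured connectors are loaded at startup and cannot be edited from the UI, which is why the test steps create the inference endpoint first and then point the preconfigured entry at the same credentials.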