
[Security Solution] GenAI API Integration Tests #176357

Merged
stephmilovic merged 10 commits into elastic:main from stephmilovic:genai_api_integration
Feb 7, 2024

Conversation

stephmilovic (Contributor) commented Feb 7, 2024

Summary

Add API tests for Security GenAI team.

We are starting with very basic tests to keep this PR focused on setting up the GenAI test suite, CODEOWNERS entries, and CI execution. Comprehensive tests will follow in a subsequent PR.

@stephmilovic stephmilovic added release_note:skip Skip the PR/issue when compiling release notes Feature:GenAI v8.13.0 Team:Security Generative AI Security Generative AI labels Feb 7, 2024
```ts
export default createTestConfig({
  kbnTestServerArgs: [
    // used for connector simulators
    `--xpack.actions.proxyUrl=http://localhost:6200`,
```
stephmilovic (Contributor Author) commented:

I don't know that I can use `get-port` here. I'm wondering if @YulNaumenko can speak to the importance of the dynamic port number, since I see your name in the git blame for the spot I borrowed this code from (x-pack/test/alerting_api_integration/common/config.ts): https://github.com/elastic/kibana/pull/75232/files#diff-1477d89e2965e56261180479a84bfb0e762793a73b88baa29300a7ed023d0f28R61-R62
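For context, a minimal sketch of how the hard-coded port above could be parameterized. The `buildProxyArgs` helper is hypothetical (not part of this PR), and the `getPort` call in the trailing comment is only illustrative of the dynamic-port approach used in the alerting config:

```typescript
// Hypothetical helper (not part of this PR): build the Kibana test server
// args for a given connector-simulator proxy port. 6200 is the value
// hard-coded in the config snippet above.
function buildProxyArgs(proxyPort: number): string[] {
  return [
    // used for connector simulators
    `--xpack.actions.proxyUrl=http://localhost:${proxyPort}`,
  ];
}

// If a dynamic port were needed, the call site might look like:
//   const proxyPort = await getPort({ port: 6200 }); // 'get-port' package
//   createTestConfig({ kbnTestServerArgs: buildProxyArgs(proxyPort) });
```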

@stephmilovic stephmilovic marked this pull request as ready for review February 7, 2024 16:15
@stephmilovic stephmilovic requested a review from a team as a code owner February 7, 2024 16:15
jbudz (Contributor) left a comment:

ftr_configs.yml

@stephmilovic
Contributor Author

@elasticmachine merge upstream

@kibana-ci

💚 Build Succeeded

Metrics [docs]

✅ unchanged

History

To update your PR or re-run it, just comment with:
@elasticmachine merge upstream

@stephmilovic stephmilovic merged commit 9bca7ed into elastic:main Feb 7, 2024
@kibanamachine kibanamachine added the backport:skip This PR does not require backporting label Feb 7, 2024
@stephmilovic stephmilovic deleted the genai_api_integration branch February 7, 2024 22:21
fkanout pushed a commit to fkanout/kibana that referenced this pull request Feb 8, 2024
dgieselaar added a commit that referenced this pull request Feb 13, 2024
## Summary

This adds functionality that allows consumers of the AI Assistant for
Observability to add context to the LLM conversation, both at the start of a
conversation and contextually after every prompt.


https://github.com/elastic/kibana/assets/535564/b4d62897-d701-4c23-b90b-464cad21e9d0


![image](https://github.com/elastic/kibana/assets/352732/85a7e27a-e715-4273-a3a3-0dd68a7c9c5c)


## How to use
The service now exposes a `setApplicationContext` function that returns a
callback to unregister the context.

Consumers can use this to add context relevant to the current route, or
Kibana settings that might be relevant for the LLM within that conversation.

Example:

```ts
useEffect(() => {
  return setApplicationContext({
    data: [
      {
        name: 'top_transactions',
        description: 'The visible transaction groups',
        value: mainStatistics.transactionGroups.map((group) => {
          return {
            name: group.name,
            alertsCount: group.alertsCount,
          };
        }),
      },
    ],
  });
}, [setApplicationContext, mainStatistics]);
```

By default, the URL the user is currently on is always included in the
context, so the Assistant will always take it into account.

## Details for reviewers
- The `recall` function has been renamed to `context`.
- The `context` function now returns both knowledge base entries and the chat context set by `setApplicationContext` (#176357).
- Part of the function logic was moved from the ObservabilityAIAssistantService into the ChatFunctionClient for easier testing.
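The register/unregister contract described above can be sketched as follows. This is an illustrative model only, not the actual ObservabilityAIAssistantService API; the names `createContextRegistry` and `getApplicationContext` are invented for the example:

```typescript
// Illustrative model of the setApplicationContext contract:
// registering returns a cleanup callback, mirroring useEffect's
// teardown convention shown in the example above.
type ApplicationContext = {
  data: Array<{ name: string; description: string; value: unknown }>;
};

// Hypothetical factory; the real service wires this up internally.
function createContextRegistry() {
  let current: ApplicationContext | undefined;
  return {
    // Store the context and hand back an unregister callback.
    setApplicationContext(ctx: ApplicationContext): () => void {
      current = ctx;
      return () => {
        current = undefined; // unregister on effect cleanup
      };
    },
    // What the `context` function would read when building the prompt.
    getApplicationContext: () => current,
  };
}
```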

---------

Co-authored-by: Dario Gieselaar <dario.gieselaar@elastic.co>
Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
CoenWarmer pushed a commit to CoenWarmer/kibana that referenced this pull request Feb 15, 2024
CoenWarmer added a commit to CoenWarmer/kibana that referenced this pull request Feb 15, 2024
fkanout pushed a commit to fkanout/kibana that referenced this pull request Mar 4, 2024
fkanout pushed a commit to fkanout/kibana that referenced this pull request Mar 4, 2024

Labels

backport:skip This PR does not require backporting Feature:GenAI release_note:skip Skip the PR/issue when compiling release notes Team:Security Generative AI Security Generative AI v8.13.0

4 participants