
[AutoImport] Introduce automatic log type detection graph #190407

Merged
bhapas merged 38 commits into elastic:main from bhapas:automatic_log_type_detection on Aug 27, 2024
Conversation

@bhapas
Contributor

@bhapas bhapas commented Aug 13, 2024

Release note

Adds a feature to identify the log format based on sample input logs.

Summary

This PR introduces a new graph in Auto Import called LogTypeDetection.

Currently, only JSON/NDJSON samples can be uploaded to build custom integrations. This feature allows other log types to be uploaded as well, although parsing of the new log types will be handled separately in a follow-up issue.

  • The logs are first parsed as JSON/NDJSON on the UI side.
  • If the sample is not JSON/NDJSON, a new AnalyzeLogs API is triggered.
  • The UI allows any type of logs to be uploaded.
  • There is currently a server-level content-length restriction of 1MB, which needs to be extended.
  • Handling graphs for log types other than JSON/NDJSON are not yet implemented, so a 501 Not Implemented message appears.
  • The plan is to support structured, CSV, and unstructured syslog handling graphs.
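The detection order described above (try JSON, then NDJSON, then fall back to the AnalyzeLogs API) can be sketched as follows. This is an illustrative sketch only; the function and type names are hypothetical, not Kibana's actual implementation.

```typescript
// Hypothetical sketch of the client-side detection order: try to parse the
// whole sample as JSON, then line-by-line as NDJSON, and only fall back to
// the server-side AnalyzeLogs API when neither format matches.
type SamplesFormat = 'json' | 'ndjson' | 'unknown';

function detectSamplesFormat(fileContent: string): SamplesFormat {
  const trimmed = fileContent.trim();

  // A JSON sample is a single object or array covering the whole file.
  try {
    const parsed = JSON.parse(trimmed);
    if (parsed !== null && typeof parsed === 'object') {
      return 'json';
    }
  } catch {
    // Not a single JSON document; try NDJSON next.
  }

  // NDJSON: every non-empty line is its own JSON document.
  const lines = trimmed.split('\n').filter((line) => line.trim().length > 0);
  if (lines.length > 0) {
    try {
      lines.forEach((line) => JSON.parse(line));
      return 'ndjson';
    } catch {
      // Neither format matched; fall through.
    }
  }

  // Caller would now send the sample to the AnalyzeLogs API instead.
  return 'unknown';
}
```

In this sketch, a syslog or CSV sample would come back as `'unknown'`, which is the point at which the new AnalyzeLogs API would be invoked.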

Checklist

Delete any items that are not applicable to this PR.

For maintainers

@bhapas bhapas added enhancement New value added to drive a business result release_note:skip Skip the PR/issue when compiling release notes v8.16.0 Team:Security-Scalability Security Integrations Scalability Team labels Aug 13, 2024
@bhapas bhapas self-assigned this Aug 13, 2024
@bhapas bhapas marked this pull request as ready for review August 14, 2024 19:02
@bhapas bhapas requested a review from a team as a code owner August 14, 2024 19:02
@elasticmachine
Contributor

Pinging @elastic/security-scalability (Team:Security-Scalability)

@bhapas
Contributor Author

bhapas commented Aug 15, 2024

@elasticmachine merge upstream

Member

@P1llus P1llus left a comment


Added some comments and questions

@bhapas bhapas marked this pull request as draft August 15, 2024 14:36
@bhapas bhapas force-pushed the automatic_log_type_detection branch 4 times, most recently from dfc5705 to d076abb Compare August 20, 2024 12:48
@bhapas bhapas force-pushed the automatic_log_type_detection branch from d076abb to 25f824d Compare August 20, 2024 13:42
@bhapas bhapas changed the title [Security Integrations] [ AutoImport] Introduce automatic log type detection graph [ AutoImport] Introduce automatic log type detection graph Aug 22, 2024
@bhapas bhapas added the release_note:feature Makes this part of the condensed release notes label Aug 22, 2024
Contributor

@ilyannn ilyannn left a comment


It works, though there are more changes in sampling (e.g. function renames) than I think are warranted. If we cannot backport this PR to 8.15 then we might even need to revert these changes to be able to backport other PRs.

@kibana-ci

💛 Build succeeded, but was flaky

Failed CI Steps

Test Failures

  • [job] [logs] FTR Configs #99 / Cloud Security Posture Test adding Cloud Security Posture Integrations CSPM AZURE Azure Organization Manual Service Principle with Client Secret Azure Organization Manual Service Principle with Client Secret Workflow

Metrics [docs]

Module Count

Fewer modules lead to a faster build time

| id | before | after | diff |
|----|--------|-------|------|
| integrationAssistant | 547 | 548 | +1 |

Public APIs missing comments

Total count of every public API that lacks a comment. Target amount is 0. Run node scripts/build_api_docs --plugin [yourplugin] --stats comments for more detailed information.

| id | before | after | diff |
|----|--------|-------|------|
| integrationAssistant | 41 | 46 | +5 |

Async chunks

Total size of all lazy-loaded chunks that will be downloaded as the user navigates the app

| id | before | after | diff |
|----|--------|-------|------|
| integrationAssistant | 937.8KB | 938.7KB | +899.0B |
Unknown metric groups

API count

| id | before | after | diff |
|----|--------|-------|------|
| integrationAssistant | 49 | 54 | +5 |

History

To update your PR or re-run it, just comment with:
@elasticmachine merge upstream

cc @bhapas

@bhapas bhapas merged commit 9f01f73 into elastic:main Aug 27, 2024
ilyannn added a commit that referenced this pull request Sep 3, 2024
### Release note

Display better error messages for issues with logs sample file upload in
Automatic Import.

## Summary

Previously the user would be told about parse issues that occur after
the file is successfully uploaded. However, in the following scenarios
the operation would silently fail without displaying a user-visible
error:

1. When the file fails to upload (e.g. when it is too big).
2. When the upload operation is aborted, e.g. programmatically.

Additionally, in the following scenario the generic `CAN_NOT_PARSE`
message was displayed:

3. When the file is uploaded but the browser runs out of memory when
trying to parse it.

Additionally, in the following scenario the `EMPTY` message was
displayed:

4. When the file is too big for the V8 engine (e.g. Chrome) to create a
string, so the upload process returns an empty string.

Additionally:

5. When the user switches from the invalid file (with an error
displayed) to the valid file, the error from the invalid file was
displayed during the analysis of the new file.

After the changes in this PR, the following error types would be
displayed in these cases, respectively:

1. `CAN_NOT_READ_WITH_REASON`: _An error occurred when reading logs
sample: {reason}_
2. `CAN_NOT_READ_WITH_REASON`: _An error occurred when reading logs
sample: An ongoing operation was aborted, typically with a call to
abort()._ (reason is provided by the browser)
3. `TOO_LARGE_TO_PARSE`: _This logs sample file is too large to parse_
4. `TOO_LARGE_TO_PARSE`: _This logs sample file is too large to parse_
5. No error would be displayed during the analysis.

This covers part of elastic/security-team#9844
though the issues were discovered separately.

Note that the fix in item 3 does not work in Firefox as it throws an
`InternalError` rather than `RangeError`. A generic `CAN_NOT_PARSE`
message will continue to be displayed in that case. The fix in item 4 is
only relevant for Chrome.
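As a rough illustration of the classification above (hypothetical names, not the actual Kibana code): read-phase failures carry the browser-supplied reason, while a `RangeError` during parsing, which is how Chrome signals an oversized string, maps to `TOO_LARGE_TO_PARSE`.

```typescript
// Illustrative sketch of mapping the failure modes above onto error codes.
// The type and function names are assumptions for this example only.
type UploadError =
  | { code: 'CAN_NOT_READ_WITH_REASON'; reason: string }
  | { code: 'TOO_LARGE_TO_PARSE' }
  | { code: 'CAN_NOT_PARSE' };

function classifyUploadFailure(err: unknown, phase: 'read' | 'parse'): UploadError {
  if (phase === 'read') {
    // Cases 1 and 2: surface the reason the browser provides (e.g. an abort).
    const reason = err instanceof Error ? err.message : String(err);
    return { code: 'CAN_NOT_READ_WITH_REASON', reason };
  }
  // Cases 3 and 4: Chrome throws RangeError for oversized strings. Firefox
  // throws InternalError instead, so it falls through to the generic
  // CAN_NOT_PARSE case, matching the note above.
  if (err instanceof RangeError) {
    return { code: 'TOO_LARGE_TO_PARSE' };
  }
  return { code: 'CAN_NOT_PARSE' };
}
```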

On a slightly different note, we provide the following improvements to
the log sampling functionality introduced in
#190407:

- Add documentation for `parseLogsContents` and its special cases
- Refactor the `parseLogsContent` output fields into protocols that
clearly define their optionality
- Add tests for the functionality of sampling when the format cannot be
determined
- Fix so that the error message is displayed for the case where
`fileContent == null` in `onChangeLogsSample`

### Risk Matrix

| Risk | Probability | Severity | Mitigation/Notes |
|------|-------------|----------|------------------|
| Due to the complexity of browser engines, some of them might produce unexpected events, or events in an unexpected order, and we will confuse the user with an incorrect error message. | Low | Low | Testing. |

---------

Co-authored-by: Elastic Machine <elasticmachine@users.noreply.github.com>
@bhapas bhapas added backport:prev-minor v8.15.2 and removed backport:skip This PR does not require backporting v8.15.2 labels Sep 9, 2024
@kibanamachine kibanamachine added the backport:skip This PR does not require backporting label Sep 9, 2024
@delanni delanni added backport:prev-minor and removed backport:skip This PR does not require backporting labels Sep 9, 2024
@kibanamachine
Contributor

💔 All backports failed

Status Branch Result
8.15 Backport failed because of merge conflicts

Manual backport

To create the backport manually run:

node scripts/backport --pr 190407

Questions?

Please refer to the Backport tool documentation

bhapas added a commit to bhapas/kibana that referenced this pull request Sep 9, 2024
…90407)

(cherry picked from commit 9f01f73)

# Conflicts:
#	x-pack/plugins/integration_assistant/server/types.ts
@bhapas
Contributor Author

bhapas commented Sep 9, 2024

💚 All backports created successfully

Status Branch Result
8.15

Note: Successful backport PRs will be merged automatically after passing CI.

Questions?

Please refer to the Backport tool documentation

bhapas added a commit that referenced this pull request Sep 9, 2024
…90407) (#192403)

# Backport

This will backport the following commits from `main` to `8.15`:
- [[ AutoImport] Introduce automatic log type detection graph
(#190407)](#190407)

<!--- Backport version: 8.9.8 -->

### Questions?
Please refer to the [Backport tool
documentation](https://github.com/sqren/backport)


---------

Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
bhapas pushed a commit to bhapas/kibana that referenced this pull request Sep 10, 2024
(cherry picked from commit 16c2bfe)
bhapas added a commit that referenced this pull request Sep 10, 2024
…) (#192464)

# Backport

This will backport the following commits from `main` to `8.15`:
- [[Automatic Import] Error handling when uploading a file
(#191310)](#191310)

<!--- Backport version: 8.9.8 -->

### Questions?
Please refer to the [Backport tool
documentation](https://github.com/sqren/backport)


Co-authored-by: Ilya Nikokoshev <ilya.nikokoshev@elastic.co>
bhapas added a commit that referenced this pull request Sep 26, 2024
## Release Note

Fixes a bug that caused the deploy step to fail after a pipeline edit/save.

## Summary

#190407 introduced a bug where deployment fails when a pipeline is edited and saved in the review step.

After the edit-pipeline flow runs, the review step's result is overridden and `samplesFormat` is removed; when it is missing, [the `useEffect` in the Deploy step](https://github.com/elastic/kibana/blob/main/x-pack/plugins/integration_assistant/public/components/create_integration/create_integration_assistant/steps/deploy_step/use_deploy_integration.ts#L41) fails.

This PR fixes the issue by saving the `samplesFormat` from the original result before the edit-pipeline flow is executed, so that `samplesFormat` remains in the result.
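A minimal sketch of the fix, assuming hypothetical result shapes: keep the `samplesFormat` from the original review result and merge it back into the result produced by the edit-pipeline flow, so the deploy step's check still finds it. The interface and function names here are illustrative, not the plugin's actual types.

```typescript
// Hypothetical shape of the review step's result; only samplesFormat matters
// for this example.
interface ReviewResult {
  pipeline: object;
  docs: object[];
  samplesFormat?: { name: string };
}

// Merge the edited result over the original, restoring samplesFormat if the
// edit-pipeline flow dropped it.
function mergeEditedResult(original: ReviewResult, edited: ReviewResult): ReviewResult {
  return {
    ...edited,
    samplesFormat: edited.samplesFormat ?? original.samplesFormat,
  };
}
```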
kibanamachine pushed a commit to kibanamachine/kibana that referenced this pull request Sep 26, 2024
…ic#194203)


(cherry picked from commit 6366dc3)
kibanamachine pushed a commit to kibanamachine/kibana that referenced this pull request Sep 26, 2024
…ic#194203)


(cherry picked from commit 6366dc3)

Labels

enhancement New value added to drive a business result release_note:feature Makes this part of the condensed release notes Team:Security-Scalability Security Integrations Scalability Team v8.15.2 v8.16.0

Projects

None yet

Development

Successfully merging this pull request may close these issues.

10 participants