[AutoImport] Introduce automatic log type detection graph #190407
bhapas merged 38 commits into elastic:main
Conversation
Pinging @elastic/security-scalability (Team:Security-Scalability)
@elasticmachine merge upstream
P1llus
left a comment
Added some comments and questions
.../create_integration/create_integration_assistant/steps/data_stream_step/generation_modal.tsx
x-pack/plugins/integration_assistant/server/graphs/log_type_detection/constants.ts
x-pack/plugins/integration_assistant/server/graphs/log_type_detection/prompts.ts
x-pack/plugins/integration_assistant/server/routes/ecs_routes.ts
x-pack/plugins/integration_assistant/server/routes/ecs_routes.ts
ilyannn
left a comment
It works, though there are more changes in sampling (e.g. function renames) than I think are warranted. If we cannot backport this PR to 8.15 then we might even need to revert these changes to be able to backport other PRs.
...create_integration/create_integration_assistant/steps/data_stream_step/sample_logs_input.tsx
...create_integration/create_integration_assistant/steps/data_stream_step/sample_logs_input.tsx
💛 Build succeeded, but was flaky
To update your PR or re-run it, just comment with: cc @bhapas
### Release note
Display better error messages for issues with logs sample file upload in
Automatic Import.
## Summary
Previously, the user would be told about parse issues that occurred after
the file was successfully uploaded. However, in the following scenarios
the operation would silently fail without displaying a user-visible
error:
1. When the file fails to upload (e.g. when it is too big).
2. When the upload operation is aborted, e.g. programmatically.
Additionally, in the following scenario the generic `CAN_NOT_PARSE`
message was displayed:
3. When the file is uploaded but the browser runs out of memory when
trying to parse it.
Additionally, in the following scenario the `EMPTY` message was
displayed:
4. When the file is too big for the V8 engine (e.g. Chrome) to create a
string, so the upload process returns an empty string.
Additionally:
5. When the user switched from an invalid file (with an error
displayed) to a valid file, the error from the invalid file was
displayed during the analysis of the new file.
After the changes in this PR, the following error types would be
displayed in these cases, respectively:
1. `CAN_NOT_READ_WITH_REASON`: _An error occurred when reading logs
sample: {reason}_
2. `CAN_NOT_READ_WITH_REASON`: _An error occurred when reading logs
sample: An ongoing operation was aborted, typically with a call to
abort()._ (reason is provided by the browser)
3. `TOO_LARGE_TO_PARSE`: _This logs sample file is too large to parse_
4. `TOO_LARGE_TO_PARSE`: _This logs sample file is too large to parse_
5. No error would be displayed during the analysis.
This covers part of elastic/security-team#9844
though the issues were discovered separately.
Note that the fix in item 3 does not work in Firefox as it throws an
`InternalError` rather than `RangeError`. A generic `CAN_NOT_PARSE`
message will continue to be displayed in that case. The fix in item 4 is
only relevant for Chrome.
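The error-code mapping described above can be sketched as follows. This is an illustrative sketch, not the actual Kibana implementation: the function names, the `fileSize` check for distinguishing a truly empty file from a string-too-large read, and the error-object shapes are assumptions.

```typescript
// Hypothetical error codes matching the messages described in this PR.
type SampleLogsError =
  | { code: 'CAN_NOT_READ_WITH_REASON'; reason: string }
  | { code: 'TOO_LARGE_TO_PARSE' }
  | { code: 'CAN_NOT_PARSE' }
  | { code: 'EMPTY' };

// Scenarios 1, 2, and 4: classify the outcome of reading the file.
function classifyReadResult(
  content: string | null,
  fileSize: number,
  readError?: { message: string }
): SampleLogsError | null {
  if (readError) {
    // Scenarios 1 and 2: the reader failed or was aborted; surface the
    // browser-provided reason to the user.
    return { code: 'CAN_NOT_READ_WITH_REASON', reason: readError.message };
  }
  if (content === '' || content === null) {
    // Scenario 4: a non-empty file yielding no content means the engine
    // could not build the string; otherwise the file is genuinely empty.
    return fileSize > 0 ? { code: 'TOO_LARGE_TO_PARSE' } : { code: 'EMPTY' };
  }
  return null;
}

// Scenario 3: Chrome throws RangeError when it runs out of memory while
// parsing; Firefox throws InternalError instead, which falls through to
// the generic message.
function tryParse(content: string): SampleLogsError | null {
  try {
    JSON.parse(content);
    return null;
  } catch (e) {
    if (e instanceof RangeError) {
      return { code: 'TOO_LARGE_TO_PARSE' };
    }
    return { code: 'CAN_NOT_PARSE' };
  }
}
```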
On a slightly different note, we provide the following improvements to
the log sampling functionality introduced in
#190407:
- Add documentation for `parseLogsContents` and its special cases
- Refactor the `parseLogsContent` output fields into protocols that
clearly define their optionality
- Add tests for the functionality of sampling when the format cannot be
determined
- Fix the case where `fileContent == null` in `onChangeLogsSample` so
that the error message is displayed
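The PR does not show the refactored output types, but "protocols that clearly define their optionality" could look like the following discriminated union, where the type system forces callers to check the error case before touching the samples. All names and fields here are illustrative assumptions, not the actual `parseLogsContent` signature.

```typescript
// Hypothetical success shape: samples are guaranteed present.
interface ParsedSuccess {
  error: null;
  logSamples: string[]; // one entry per log record (assumed field)
  isMultiline: boolean; // assumed field: NDJSON vs a single JSON array
}

// Hypothetical failure shape: only an error code, no samples.
interface ParsedFailure {
  error: 'CAN_NOT_PARSE' | 'TOO_LARGE_TO_PARSE' | 'EMPTY';
  logSamples?: never;
  isMultiline?: never;
}

type ParseLogsContentResult = ParsedSuccess | ParsedFailure;

function describeResult(result: ParseLogsContentResult): string {
  if (result.error !== null) {
    return `error: ${result.error}`;
  }
  // TypeScript has narrowed the type to ParsedSuccess here, so
  // logSamples is known to be present without a runtime check.
  return `${result.logSamples.length} samples`;
}
```

The design point is that optional fields are tied to the discriminant, so a consumer cannot accidentally read `logSamples` on the failure branch.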
### Risk Matrix
| Risk | Probability | Severity | Mitigation/Notes |
|------|-------------|----------|------------------|
| Due to the complexity of browser engines, some of them might produce unexpected events, or events in an unexpected order, and we will confuse the user with an incorrect error message. | Low | Low | Testing. |
---------
Co-authored-by: Elastic Machine <elasticmachine@users.noreply.github.com>
💔 All backports failed

Manual backport: to create the backport manually, run the Backport tool. Questions? Please refer to the Backport tool documentation.
…90407)

## Summary

This PR introduces a new graph in `Auto Import` called `LogTypeDetection`.

Currently, only JSON/NDJSON formats are supported for upload when building custom integrations. This feature allows different log types to be uploaded, although parsing of the new log types will be handled separately with a separate [issue](elastic/security-team#9845).

- The logs are initially parsed for JSON/NDJSON types on the UI side.
- If the format is not JSON/NDJSON, a new API, `AnalyzeLogs`, is triggered.
- The UI allows any type of logs to be uploaded.
- Currently there is a server-level content length restriction of `1MB`, which needs to be extended.
- For log types other than JSON/NDJSON, the handling graphs are not yet implemented, hence a `501 Not Implemented` message appears.
- The idea is to support `structured`, `csv`, and `unstructured` syslog handling graphs.

### Checklist

Delete any items that are not applicable to this PR.

- [x] [Unit or functional tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html) were updated or added to match the most common scenarios

### For maintainers

- [ ] This was checked for breaking API changes and was [labeled appropriately](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)

---------

Co-authored-by: Elastic Machine <elasticmachine@users.noreply.github.com>
Co-authored-by: Hanna Tamoudi <hanna.tamoudi@elastic.co>
Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>

(cherry picked from commit 9f01f73)

# Conflicts:
# x-pack/plugins/integration_assistant/server/types.ts
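The client-side detection flow described above (try JSON, then NDJSON, otherwise hand off to the server) can be sketched as below. This is a minimal sketch under stated assumptions, not the actual Kibana code: the function name, the `DetectedFormat` values, and the hand-off behavior are illustrative.

```typescript
type DetectedFormat = 'json' | 'ndjson' | 'unknown';

// Detect JSON/NDJSON locally in the browser; everything else would be
// sent to the (server-side) AnalyzeLogs API, which currently answers
// 501 Not Implemented for formats whose handling graphs do not exist yet.
function detectClientSide(content: string): DetectedFormat {
  try {
    const parsed = JSON.parse(content);
    // A top-level object or array of samples counts as plain JSON.
    if (parsed !== null && typeof parsed === 'object') return 'json';
  } catch {
    // Not a single JSON document; fall through to the NDJSON check.
  }
  const lines = content.split('\n').filter((l) => l.trim() !== '');
  if (lines.length > 0) {
    try {
      // NDJSON: every non-empty line is an independent JSON document.
      lines.forEach((l) => JSON.parse(l));
      return 'ndjson';
    } catch {
      // Not NDJSON either.
    }
  }
  return 'unknown';
}
```

With this split, only the `'unknown'` case incurs a round trip to the server, which is also where the `1MB` content-length restriction mentioned above applies.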
💚 All backports created successfully

Note: Successful backport PRs will be merged automatically after passing CI. Questions? Please refer to the Backport tool documentation.
…90407) (#192403)

# Backport

This will backport the following commits from `main` to `8.15`:
- [[AutoImport] Introduce automatic log type detection graph (#190407)](#190407)

### Questions?
Please refer to the [Backport tool documentation](https://github.com/sqren/backport)

Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
(cherry picked from commit 16c2bfe)
…) (#192464)

# Backport

This will backport the following commits from `main` to `8.15`:
- [[Automatic Import] Error handling when uploading a file (#191310)](#191310)

### Questions?
Please refer to the [Backport tool documentation](https://github.com/sqren/backport)

Co-authored-by: Ilya Nikokoshev <ilya.nikokoshev@elastic.co>
## Release Note

Fixes a bug that caused the deploy step to fail after a pipeline edit/save.

## Summary

#190407 introduced a bug where deployment fails when a pipeline is edited and saved in the review step. After the edit-pipeline flow is executed, the review step's result is overridden and `samplesFormat` is removed; if it is not present, [the `useEffect` in the Deploy step](https://github.com/elastic/kibana/blob/main/x-pack/plugins/integration_assistant/public/components/create_integration/create_integration_assistant/steps/deploy_step/use_deploy_integration.ts#L41) fails.

This PR fixes the issue by saving the `samplesFormat` that is present in the original result before the edit-pipeline flow is executed, thereby keeping `samplesFormat` in the result.
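The fix described above amounts to carrying `samplesFormat` over from the original result when the edited result drops it. The sketch below is hypothetical: the `ReviewResult` shape and `mergeEditedResult` helper are assumptions based on the PR description, not the actual code.

```typescript
// Assumed shape of the review step's result; only samplesFormat is
// taken from the PR description, the other fields are illustrative.
interface ReviewResult {
  pipeline: object;
  docs?: object[];
  samplesFormat?: { name: string };
}

// When the edit-pipeline flow overwrites the review result, preserve the
// previously detected samplesFormat: the deploy step's useEffect fails
// when this field is absent.
function mergeEditedResult(
  original: ReviewResult,
  edited: ReviewResult
): ReviewResult {
  return {
    ...edited,
    samplesFormat: edited.samplesFormat ?? original.samplesFormat,
  };
}
```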
(cherry picked from commit 6366dc3)
### Release note

Adds a feature to identify the log format type based on the sample input logs.

### Summary

This PR introduces a new graph in `Auto Import` called `LogTypeDetection`.

Currently, only JSON/NDJSON formats are supported for upload when building custom integrations. With this feature, different log types can be uploaded, although parsing of the new log types will be handled separately with a separate issue.

- The logs are initially parsed for JSON/NDJSON types on the UI side.
- If the format is not JSON/NDJSON, a new API, `AnalyzeLogs`, is triggered.
- The UI allows any type of logs to be uploaded.
- Currently there is a server-level content length restriction of `1MB`, which needs to be extended.
- For log types other than JSON/NDJSON, the handling graphs are not yet implemented, hence a `501 Not Implemented` message appears.
- The idea is to support `structured`, `csv`, and `unstructured` syslog handling graphs.

### Checklist

Delete any items that are not applicable to this PR.

### For maintainers