chore(NA): improve performance of kibana dev/build distributable #256377

mistic merged 43 commits into elastic:main
Conversation
Pinging @elastic/kibana-operations (Team:Operations)
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in your settings, or use review commands to manage reviews.
📝 Walkthrough

Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes

🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/dev/build/tasks/create_archives_task.ts (1)

33-79: ⚠️ Potential issue | 🟠 Major

`Promise.allSettled` may silently swallow archive creation failures.

Using `Promise.allSettled` without checking for rejected promises means that if any platform's archive creation fails, the error is silently ignored. The metrics loop at lines 82-94 will then process fewer archives than expected, and the task will appear successful despite partial failures. Consider using `Promise.all` to propagate errors, or explicitly checking for rejected promises.

🐛 Proposed fix to handle rejected promises

```diff
-  await Promise.allSettled(
+  const results = await Promise.allSettled(
     config.getTargetPlatforms().map(async (platform) => {
       // ... existing code ...
     })
   );
+
+  const failures = results.filter(
+    (r): r is PromiseRejectedResult => r.status === 'rejected'
+  );
+  if (failures.length > 0) {
+    throw new AggregateError(
+      failures.map((f) => f.reason),
+      `Failed to create ${failures.length} archive(s)`
+    );
+  }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/dev/build/tasks/create_archives_task.ts` around lines 33 - 79, The task currently uses Promise.allSettled when creating archives which can silently ignore failures; update the code that invokes Promise.allSettled (the block mapping config.getTargetPlatforms()) to either use Promise.all so that any rejected promise will propagate and fail the task, or keep Promise.allSettled but inspect the returned results and throw an Error if any entry has status === 'rejected'; ensure this check happens before the later metrics loop that reads the archives array so partial successes are not treated as full success. Target symbols: Promise.allSettled (replace or wrap), config.getTargetPlatforms(), archives array, and the async handlers that call compressZip/compressTar.
🧹 Nitpick comments (3)
src/dev/build/tasks/generate_packages_optimized_assets.ts (1)

65-75: Consider using async `brotliCompress` for better event loop behavior.

`brotliCompressSync` is a blocking operation that can cause noticeable event loop stalls, especially for larger files. While the concurrency limit helps, using the async `util.promisify(zlib.brotliCompress)` variant could improve responsiveness without sacrificing parallelism. That said, for a build tool where throughput matters more than responsiveness, the synchronous version may be acceptable.

♻️ Optional: Use async brotliCompress

```diff
+import { promisify } from 'util';
+const brotliCompress = promisify(zlib.brotliCompress);
+
 await asyncForEachWithLimit(compressFiles, PARALLEL_CONCURRENCY, async (file) => {
   const content = await Fsp.readFile(file);
-  const compressed = zlib.brotliCompressSync(content, {
+  const compressed = await brotliCompress(content, {
     params: {
       [zlib.constants.BROTLI_PARAM_QUALITY]: BROTLI_QUALITY,
     },
   });
   await Fsp.writeFile(file + '.br', compressed);
 });
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@src/dev/build/tasks/generate_packages_optimized_assets.ts` around lines 65 - 75: the code uses the blocking `zlib.brotliCompressSync` inside the `asyncForEachWithLimit` loop, which can stall the event loop; create a promisified `brotliCompress` (e.g., `const brotliCompressAsync = util.promisify(zlib.brotliCompress)`), await it with the same `BROTLI_QUALITY` params, write the result to `file + '.br'`, add the necessary `util` import, and adjust error handling as needed.
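For reference, a minimal self-contained sketch of the promisified variant (the `BROTLI_QUALITY` constant mirrors the task's name; the file loop and `compressToBrotli` helper are illustrative):

```typescript
import zlib from 'zlib';
import { promisify } from 'util';

const brotliCompress = promisify(zlib.brotliCompress);
const BROTLI_QUALITY = 9; // matches the quality level the PR settled on

// Runs compression in libuv's thread pool instead of blocking the event loop
async function compressToBrotli(content: Buffer): Promise<Buffer> {
  return brotliCompress(content, {
    params: { [zlib.constants.BROTLI_PARAM_QUALITY]: BROTLI_QUALITY },
  });
}
```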
src/dev/build/lib/scan_copy.ts (1)

17-18: Consider using bitwise OR for combining flags.

While arithmetic addition (`+`) works correctly here since the flag values are distinct powers of two, using bitwise OR (`|`) is the idiomatic approach for combining bit flags and makes the intent clearer:

```diff
-const COPY_FLAGS = Fs.constants.COPYFILE_EXCL + Fs.constants.COPYFILE_FICLONE;
+const COPY_FLAGS = Fs.constants.COPYFILE_EXCL | Fs.constants.COPYFILE_FICLONE;
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@src/dev/build/lib/scan_copy.ts` around lines 17 - 18: update the constant `COPY_FLAGS` to combine the flags with bitwise OR (`Fs.constants.COPYFILE_EXCL | Fs.constants.COPYFILE_FICLONE`) instead of `+` to clearly and idiomatically combine the bit flags for exclusive create + copy-on-write behavior.
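A tiny illustration of why `|` is the safer idiom: OR is idempotent, so accidentally repeating a flag cannot corrupt the mask, while addition silently produces a different value (flag names here are made up for the demo):

```typescript
const FLAG_A = 1; // 0b01
const FLAG_B = 2; // 0b10

// OR is idempotent: repeating a flag leaves the mask unchanged
const viaOr = FLAG_A | FLAG_B | FLAG_A; // 3

// Addition is not: the duplicated flag yields a different (wrong) mask
const viaAdd = FLAG_A + FLAG_B + FLAG_A; // 4
```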
src/dev/build/lib/fs.ts (1)

235-240: Validate `gzipLevel` bounds before `createGzip`.

`gzipLevel` is user-provided through options; validating 0..9 upfront gives a clearer error than surfacing a zlib failure later.

Suggested fix

```diff
 export async function compressTar({
   source,
   destination,
   gzipLevel = 6,
   createRootDirectory,
   rootDirectoryName,
 }: CompressTarOptions) {
+  if (!Number.isInteger(gzipLevel) || gzipLevel < 0 || gzipLevel > 9) {
+    throw new TypeError(`gzipLevel must be an integer between 0 and 9, got ${gzipLevel}`);
+  }
+
   const folder = rootDirectoryName ?? basename(source);
```

Also applies to: 259-259

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@src/dev/build/lib/fs.ts` around lines 235 - 240: `compressTar` accepts a user-supplied `gzipLevel` but never validates it before calling `createGzip`, which can surface obscure zlib errors; add an early check that `gzipLevel` is an integer between 0 and 9 (inclusive) and throw a clear error (e.g., "gzipLevel must be an integer between 0 and 9") if it is out of bounds or not an integer, then call `createGzip` with the validated value; apply the same validation wherever `gzipLevel` is used.
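The proposed validation can be sketched as a standalone guard (the `assertGzipLevel` helper name is hypothetical, not part of fs.ts):

```typescript
// Hypothetical guard mirroring the check suggested for compressTar
function assertGzipLevel(gzipLevel: number): void {
  if (!Number.isInteger(gzipLevel) || gzipLevel < 0 || gzipLevel > 9) {
    throw new RangeError(
      `gzipLevel must be an integer between 0 and 9, got ${gzipLevel}`
    );
  }
}
```

Rejecting non-integers also catches values like `4.5`, which zlib would otherwise fail on with a far less descriptive error.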
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/dev/build/lib/fs.ts`:
- Around line 244-253: The bug is that using the nullish coalescing operator for
folder (const folder = rootDirectoryName ?? basename(source)) allows an empty
string to be used, producing absolute tar paths like "/file"; change the
fallback to treat empty string as missing (e.g., compute folder =
rootDirectoryName || basename(source) or use a conditional that checks
rootDirectoryName.length > 0) so folder is never empty, then keep the existing
tar map logic (header.name = folder + '/' + header.name) — reference symbols:
rootDirectoryName, folder, basename(source), tarFsPack(...).map, header.name.
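The distinction this prompt relies on, sketched with a hypothetical `resolveFolder` helper: `??` only replaces `null`/`undefined`, so an empty `rootDirectoryName` slips through, while `||` also rejects the empty string.

```typescript
// Hypothetical helper: an empty rootDirectoryName must fall back to the
// source basename, otherwise tar entry names like '/file' become absolute.
function resolveFolder(
  rootDirectoryName: string | undefined,
  sourceBasename: string
): string {
  return rootDirectoryName || sourceBasename; // '||' also treats '' as missing
}

const maybeEmpty: string | undefined = '';
const viaNullish = maybeEmpty ?? 'kibana'; // '' — nullish coalescing keeps it
const viaOr = maybeEmpty || 'kibana';      // 'kibana'
```

The trade-off is that `||` also rejects other falsy values (`0`, `false`), which is harmless here because the fallback only ever applies to a string.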
In `@src/dev/license_checker/config.ts`:
- Around line 102-103: Update the license override entries in
src/dev/license_checker/config.ts for the two keys
'@swc/core-linux-x64-gnu@1.15.18' and '@swc/core-linux-x64-musl@1.15.18' to the
correct dual-license string "Apache-2.0 AND MIT" instead of just "MIT", and then
scan your lockfiles for any other resolved `@swc/core-`* variants (e.g.,
darwin-x64, darwin-arm64, linux-arm64-gnu, linux-arm64-musl, win32-x64-msvc,
etc.) and add corresponding explicit override entries with the same "Apache-2.0
AND MIT" value so every resolved binary variant is covered.
---
⛔ Files ignored due to path filters (1)

`yarn.lock` is excluded by `!**/yarn.lock`, `!**/*.lock`
📒 Files selected for processing (17)

- .buildkite/scripts/steps/security/third_party_packages.txt
- package.json
- packages/kbn-yarn-install-scripts/config.json
- renovate.json
- src/dev/build/build_distributables.ts
- src/dev/build/lib/fs.ts
- src/dev/build/lib/integration_tests/fs.test.ts
- src/dev/build/lib/scan_copy.ts
- src/dev/build/lib/scan_delete.ts
- src/dev/build/lib/tar_fs.d.ts
- src/dev/build/tasks/build_packages_task.ts
- src/dev/build/tasks/create_archives_task.ts
- src/dev/build/tasks/create_cdn_assets_task.ts
- src/dev/build/tasks/generate_packages_optimized_assets.ts
- src/dev/build/tasks/os_packages/docker_generator/bundle_dockerfiles.ts
- src/dev/license_checker/config.ts
- typings/tar_fs.d.ts
🧹 Nitpick comments (1)
src/dev/build/lib/fs.ts (1)

237-265: Add an explicit return type to `compressTar`.

This helper is exported and now has a larger public surface. Declaring `Promise<number>` makes the file-count contract obvious and prevents accidental widening later.

Suggested change

```diff
 export async function compressTar({
   source,
   destination,
   gzipLevel = 6,
   createRootDirectory,
   rootDirectoryName,
-}: CompressTarOptions) {
+}: CompressTarOptions): Promise<number> {
```

As per coding guidelines, "Prefer explicit return types for public APIs and exported functions in TypeScript."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/dev/build/lib/fs.ts` around lines 237 - 265, The exported helper compressTar currently has no explicit return type; update its declaration to explicitly return Promise<number> (reflecting the fileCount result) to make the public contract clear and prevent widening later—modify the compressTar function signature to include : Promise<number> and keep the implementation returning fileCount as before.
📒 Files selected for processing (1)
src/dev/build/lib/fs.ts
…ally, or add an exception to src/dev/yarn_deduplicate/index.ts and then commit the changes and push to your branch
⏳ Build in-progress, with failures
Starting backport for target branches: 8.19, 9.2, 9.3 https://github.com/elastic/kibana/actions/runs/23228696240
💔 All backports failed

Manual backport: to create the backport manually, run the backport tool. Questions? Please refer to the Backport tool documentation.
…stic#256377) Closes elastic/kibana-operations#468

This PR reduces the kibana distributable build time by ~2 minutes through targeted tool replacements and concurrency improvements.

- **JS/CSS minification:** Replaced gulp-terser and gulp-postcss with @swc/core (the one used at rspack) and lightningcss (also used at rspack) for asset optimization. Parallelized Brotli compression across files. Lowered Brotli quality from 11 to 9.
- **Archive creation:** Replaced archiver with tar-fs + native zlib.createGzip. Lowered gzip level from 9 to 6.
- **Concurrency tuning:** Scaled asyncForEachWithLimit in build_packages_task.ts to cpus().length. Added missing default concurrency to scanDelete. Added mergeMap concurrency caps in scanCopy.
- **Copy-on-Write:** Enabled COPYFILE_FICLONE in scanCopy for instant reflink copies on supported filesystems.

**Regarding new dependencies:**

**Purpose:** The new dependencies will be used in tasks related to optimizing distributable assets (code and CSS minification) and producing gzip files.
**Justification:** The ones we had before (mainly terser) were not suited for the job in terms of performance, and the new ones are already aligned with what we will want and use for the new optimizer being planned.
**Alternatives explored:** These are the ones used by RSPack itself, which is the technology we want to use for bundling going forward. No others were explored.
**Existing dependencies:** There is some overlap, but mostly with the legacy dependencies used in the webpack environment.

## Summary by CodeRabbit

* **Chores**
  * Added new JS/CSS tooling and packaging packages, install hook entry, and automated update rules.
  * Extended license allowlist.
* **Performance**
  * Improved parallelism across build steps, capped to CPU count and added pipeline limits.
  * Switched to per-file minification/compression, changed archive packing flow, and lowered gzip compression level for faster builds.
* **Tests**
  * Added integration tests validating archive creation, contents, and file counts.

---------

Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>

(cherry picked from commit 3750017)

# Conflicts:
#   .buildkite/scripts/steps/security/third_party_packages.txt
#   package.json
#   renovate.json
#   yarn.lock
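The Copy-on-Write bullet above can be sketched outside the build tool: `fs.copyFile` with `COPYFILE_FICLONE` requests a reflink but transparently falls back to a regular copy on filesystems without reflink support. The `cloneFile`/`demo` helper names below are illustrative, not scanCopy's actual code.

```typescript
import Fs from 'fs';
import Fsp from 'fs/promises';
import Os from 'os';
import Path from 'path';

// EXCL: fail if the destination exists; FICLONE: reflink when the FS supports it
const COPY_FLAGS = Fs.constants.COPYFILE_EXCL | Fs.constants.COPYFILE_FICLONE;

async function cloneFile(src: string, dest: string): Promise<void> {
  // On Btrfs/XFS/APFS this is a near-instant metadata clone;
  // on filesystems without reflinks it degrades to a normal copy
  await Fsp.copyFile(src, dest, COPY_FLAGS);
}

async function demo(): Promise<string> {
  const dir = await Fsp.mkdtemp(Path.join(Os.tmpdir(), 'reflink-'));
  const src = Path.join(dir, 'a.txt');
  const dest = Path.join(dir, 'b.txt');
  await Fsp.writeFile(src, 'hello');
  await cloneFile(src, dest);
  return Fsp.readFile(dest, 'utf8');
}
```

Note that `COPYFILE_FICLONE` is best-effort; only `COPYFILE_FICLONE_FORCE` errors out when cloning is unsupported, which is why the soft variant is safe to enable unconditionally.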
💚 All backports created successfully
Note: Successful backport PRs will be merged automatically after passing CI. Questions? Please refer to the Backport tool documentation.
#256377) (#258261) # Backport

This will backport the following commits from `main` to `9.3`:
- [chore(NA): improve performance of kibana dev/build distributable (#256377)](#256377)

<!--- Backport version: 10.2.0 -->

### Questions ?
Please refer to the [Backport tool documentation](https://github.com/sorenlouv/backport)

Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
#256377) (#258262) # Backport

This will backport the following commits from `main` to `9.2`:
- [chore(NA): improve performance of kibana dev/build distributable (#256377)](#256377)

<!--- Backport version: 10.2.0 -->

### Questions ?
Please refer to the [Backport tool documentation](https://github.com/sorenlouv/backport)
Added mergeMap concurrency caps in scanCopy.\n- **Copy-on-Write:** Enabled COPYFILE_FICLONE in scanCopy for instant\nreflink copies on supported filesystems.\n\n\n**Regarding new dependencies:**\n\n**Purpose:** The new dependencies are going to be used in tasks related\nwith optimizing distributable assets (code and css minification) and\nalso producing gzip files.\n**Justification:** The ones we had before (mainly terser) were not\nsuited for the job in terms of performance and the new ones are already\naligned with what we will want and use for the new optimizer being\nplanned itself.\n**Alternatives explored:** Those are the ones being used by RSPack\nitself which is the technology we want to use for bundling going\nforward. No others were explored.\n**Existing dependencies:** It does but mostly points to the legacy old\nknown dependencies used in the webpack environment itself.\n\n\n## Summary by CodeRabbit\n\n* **Chores**\n* Added new JS/CSS tooling and packaging packages, install hook entry,\nand automated update rules.\n * Extended license allowlist.\n\n* **Performance**\n* Improved parallelism across build steps, capped to CPU count and added\npipeline limits.\n* Switched to per-file minification/compression, changed archive packing\nflow, and lowered gzip compression level for faster builds.\n\n* **Tests**\n* Added integration tests validating archive creation, contents, and\nfile counts.\n\n\n---------\n\nCo-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>","sha":"37500177b884be067bc81ba0981a3e295b87436e"}}]}] BACKPORT--> --------- Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
chore(NA): improve performance of kibana dev/build distributable (#256377) (#258263)

# Backport

This will backport the following commits from `main` to `8.19`:

- [chore(NA): improve performance of kibana dev/build distributable (#256377)](#256377)

### Questions?

Please refer to the [Backport tool documentation](https://github.com/sorenlouv/backport)
chore(NA): improve performance of kibana dev/build distributable (elastic#256377)

Closes elastic/kibana-operations#468

This PR reduces the Kibana distributable build time by ~2 minutes through targeted tool replacements and concurrency improvements.

- **JS/CSS minification:** Replaced gulp-terser and gulp-postcss with @swc/core and lightningcss (both used by rspack) for asset optimization. Parallelized Brotli compression across files and lowered Brotli quality from 11 to 9.
- **Archive creation:** Replaced archiver with tar-fs + native zlib.createGzip. Lowered the gzip level from 9 to 6.
- **Concurrency tuning:** Scaled asyncForEachWithLimit in build_packages_task.ts to cpus().length, added the missing default concurrency to scanDelete, and added mergeMap concurrency caps in scanCopy.
- **Copy-on-Write:** Enabled COPYFILE_FICLONE in scanCopy for instant reflink copies on supported filesystems.

**Regarding new dependencies:**

**Purpose:** The new dependencies will be used in tasks that optimize distributable assets (JS and CSS minification) and produce gzip files.
**Justification:** The tools we had before (mainly terser) were not fast enough for the job, and the new ones are already aligned with what we plan to use for the new optimizer.
**Alternatives explored:** These are the tools used by rspack itself, which is the technology we want to adopt for bundling going forward. No others were explored.
**Existing dependencies:** It does overlap, but mostly with the legacy dependencies already used in the webpack environment.

## Summary by CodeRabbit

* **Chores**
  * Added new JS/CSS tooling and packaging packages, an install hook entry, and automated update rules.
  * Extended the license allowlist.

* **Performance**
  * Improved parallelism across build steps, capped to the CPU count, and added pipeline limits.
  * Switched to per-file minification/compression, changed the archive packing flow, and lowered the gzip compression level for faster builds.

* **Tests**
  * Added integration tests validating archive creation, contents, and file counts.

---------

Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>