chore(NA): improve performance of kibana dev/build distributable #256377

Merged
mistic merged 43 commits into elastic:main from mistic:improve-kibana-build-step
Mar 18, 2026

Conversation

@mistic
Contributor

@mistic mistic commented Mar 6, 2026

Closes https://github.com/elastic/kibana-operations/issues/468

This PR reduces the Kibana distributable build time by ~2 minutes through targeted tool replacements and concurrency improvements.

  • JS/CSS minification: Replaced gulp-terser and gulp-postcss with @swc/core (the minifier used by Rspack) and lightningcss (also used by Rspack) for asset optimization. Parallelized Brotli compression across files and lowered the Brotli quality from 11 to 9.
  • Archive creation: Replaced archiver with tar-fs + native zlib.createGzip. Lowered gzip level from 9 to 6.
  • Concurrency tuning: Scaled asyncForEachWithLimit in build_packages_task.ts to cpus().length. Added missing default concurrency to scanDelete. Added mergeMap concurrency caps in scanCopy.
  • Copy-on-Write: Enabled COPYFILE_FICLONE in scanCopy for instant reflink copies on supported filesystems.
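The Brotli change above can be sketched with Node's stdlib alone. This is a hedged illustration, not the PR's actual code: `forEachWithLimit` is a hypothetical stand-in for Kibana's `asyncForEachWithLimit` helper, and file contents are passed as in-memory buffers for simplicity.

```typescript
import * as zlib from 'zlib';
import { promisify } from 'util';
import { cpus } from 'os';

const brotliCompress = promisify(zlib.brotliCompress);

// Hypothetical stand-in for Kibana's asyncForEachWithLimit helper:
// process items with at most `limit` tasks in flight at once.
async function forEachWithLimit<T>(
  items: T[],
  limit: number,
  fn: (item: T) => Promise<void>
): Promise<void> {
  const queue = [...items];
  const workers = Array.from({ length: Math.min(limit, queue.length) }, async () => {
    let item: T | undefined;
    while ((item = queue.shift()) !== undefined) {
      await fn(item);
    }
  });
  await Promise.all(workers);
}

// Compress each file to a `.br` sibling, capped at one task per CPU,
// with Brotli quality lowered to 9 as in this PR.
async function compressAll(files: Map<string, Buffer>): Promise<Map<string, Buffer>> {
  const out = new Map<string, Buffer>();
  await forEachWithLimit([...files.keys()], cpus().length, async (name) => {
    const compressed = await brotliCompress(files.get(name)!, {
      params: { [zlib.constants.BROTLI_PARAM_QUALITY]: 9 },
    });
    out.set(name + '.br', compressed);
  });
  return out;
}
```

The same limiter shape also applies to the build_packages_task change, where the limit was scaled to cpus().length.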

Regarding new dependencies:

Purpose: The new dependencies are used in the tasks that optimize distributable assets (JS and CSS minification) and produce gzip files.
Justification: The previous tools (mainly terser) were not fast enough for the job, and the new ones are already aligned with what we plan to use in the new optimizer.
Alternatives explored: These are the packages used by Rspack itself, which is the bundling technology we want to adopt going forward. No others were explored.
Existing dependencies: Kibana does have dependencies with similar functionality, but they are the legacy packages tied to the webpack environment.

Summary by CodeRabbit

  • Chores

    • Added new JS/CSS tooling and packaging packages, install hook entry, and automated update rules.
    • Extended license allowlist.
  • Performance

    • Improved parallelism across build steps, capped to CPU count and added pipeline limits.
    • Switched to per-file minification/compression, changed archive packing flow, and lowered gzip compression level for faster builds.
  • Tests

    • Added integration tests validating archive creation, contents, and file counts.

@mistic mistic added the chore, release_note:skip (Skip the PR/issue when compiling release notes), backport:skip (This PR does not require backporting), and v9.4.0 labels Mar 6, 2026
@mistic mistic requested a review from tylersmalley March 6, 2026 07:02
@mistic mistic added the Team:Operations Kibana-Operations Team label Mar 6, 2026
@mistic mistic marked this pull request as ready for review March 6, 2026 07:09
@mistic mistic requested review from a team as code owners March 6, 2026 07:09
@mistic mistic requested a review from elena-shostak March 6, 2026 07:09
@elasticmachine
Contributor

Pinging @elastic/kibana-operations (Team:Operations)

@elastic elastic deleted a comment from elasticmachine Mar 6, 2026
@kibanamachine
Contributor

kibanamachine commented Mar 6, 2026

Dependency Review Bot Analysis 🔍

Found 4 new third-party dependencies:

Package | Version | Vulnerabilities
tar-fs | 3.1.1 | 🔴 C: 0, 🟠 H: 0, 🟡 M: 0, 🟢 L: 0
browserslist | 4.28.0 | 🔴 C: 0, 🟠 H: 0, 🟡 M: 0, 🟢 L: 0
lightningcss | 1.31.1 | 🔴 C: 0, 🟠 H: 0, 🟡 M: 0, 🟢 L: 0
@swc/core | 1.15.18 | 🔴 C: 0, 🟠 H: 0, 🟡 M: 0, 🟢 L: 0

Self Checklist

To help with the review, please update the PR description to address the following points for each new third-party dependency listed above:

  • Purpose: What is this dependency used for? Briefly explain its role in your changes.
  • Justification: Why is adding this dependency the best approach?
  • Alternatives explored: Were other options considered (e.g., using existing internal libraries/utilities, implementing the functionality directly)? If so, why was this dependency chosen over them?
  • Existing dependencies: Does Kibana have a dependency providing similar functionality? If so, why is the new one preferred?

Thank you for providing this information!

@coderabbitai
Contributor

coderabbitai bot commented Mar 6, 2026

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.


Walkthrough

Adds @swc/core, lightningcss, and tar-fs; replaces archiver-based tar creation with tar-fs + gzip and exposes gzipLevel; moves CSS/JS minification to per-file SWC/LightningCSS with per-file Brotli and CPU-based parallelism; introduces concurrency limits and related tests/typings.

Changes

Cohort / File(s) Summary
Dependency & CI config
package.json, .buildkite/scripts/steps/security/third_party_packages.txt, renovate.json, packages/kbn-yarn-install-scripts/config.json
Added @swc/core, lightningcss, and tar-fs to deps/devDeps and security list; added Renovate packageRules for those packages; added postinstall entry for @swc/core.
Tar implementation & typings
src/dev/build/lib/fs.ts, src/dev/build/lib/tar_fs.d.ts, typings/tar_fs.d.ts
Replaced archiver usage with tar-fs pack piped through zlib.createGzip; introduced gzipLevel option (default 6); file counting via tar headers; added TypeScript declarations for tar-fs.
Compression callsites
src/dev/build/tasks/create_archives_task.ts, src/dev/build/tasks/create_cdn_assets_task.ts, src/dev/build/tasks/os_packages/docker_generator/bundle_dockerfiles.ts
Updated compressTar invocations to use gzipLevel: 6 (replacing prior archiver gzip options/level 9).
Asset minification & optimization
src/dev/build/tasks/generate_packages_optimized_assets.ts
Replaced streaming/gulp pipeline with per-file processing: Lightning CSS for CSS, SWC for JS, per-file Brotli (brotliCompressSync); added CPU-based parallelism and concurrency limits; removed legacy pipelines.
Build flow & concurrency
src/dev/build/build_distributables.ts, src/dev/build/tasks/build_packages_task.ts
Reordered and deduplicated CleanExtraFilesFromModules; switched package processing to asyncForEachWithLimit(..., cpus().length).
Filesystem copy/delete improvements
src/dev/build/lib/scan_copy.ts, src/dev/build/lib/scan_delete.ts
Introduced COPY_FLAGS combining exclusive-create + reflink; increased/added concurrency limits (copy pipelines concurrency = 100; delete uses default 20 when unspecified).
License config
src/dev/license_checker/config.ts
Added 'Apache-2.0 AND MIT' to LICENSE_ALLOWED.
Tests
src/dev/build/lib/integration_tests/fs.test.ts
Added integration tests for compressTar: verifies .tar.gz extraction, root-directory wrapping, custom rootDirectoryName, and returned file count.
Install scripts config (duplicate mention)
packages/kbn-yarn-install-scripts/config.json
Added @swc/core entry with lifecycle postinstall, required: false, and a reason about native builds.

Sequence Diagram(s)

mermaid
sequenceDiagram
participant Source as "Source FS"
participant TarFs as "tar-fs pack"
participant Gzip as "zlib.createGzip"
participant Dest as "Destination FS (.tar.gz)"
Source->>TarFs: stream files & headers
TarFs->>TarFs: optionally rewrite header.name (createRootDirectory)
TarFs->>Gzip: pipe tar stream
Gzip->>Dest: write compressed output
TarFs-->>Source: emit 'entry' headers (used for file count)
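The gzip stage in this diagram is plain Node stdlib; only the tar packing needs tar-fs. A minimal sketch of the compression leg with the gzipLevel default of 6 introduced in this PR (the helper name is illustrative, not from the codebase):

```typescript
import * as zlib from 'zlib';
import { Readable } from 'stream';

// Pipe any readable stream (e.g. a tar-fs pack stream) through gzip.
// Level 6 is zlib's default: a modest size penalty versus level 9 in
// exchange for a substantially faster archive step.
async function gzipStream(source: Readable, gzipLevel = 6): Promise<Buffer> {
  const chunks: Buffer[] = [];
  const gz = source.pipe(zlib.createGzip({ level: gzipLevel }));
  for await (const chunk of gz) {
    chunks.push(chunk as Buffer);
  }
  return Buffer.concat(chunks);
}
```

In the real task the output is streamed to the destination .tar.gz file rather than buffered, but the level option is passed to createGzip the same way.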

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

Suggested reviewers

  • tylersmalley
  • sabarasaba
🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

Check name Status Explanation Resolution
Docstring Coverage ⚠️ Warning Docstring coverage is 25.00% which is insufficient. The required threshold is 80.00%. Write docstrings for the functions missing them to satisfy the coverage threshold.
✅ Passed checks (2 passed)
Check name Status Explanation
Description Check ✅ Passed Check skipped - CodeRabbit’s high-level summary is enabled.
Title check ✅ Passed The pull request title 'chore(NA): improve performance of kibana dev/build distributable' clearly and accurately summarizes the main objective of the changeset: improving performance in the Kibana build process.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


Comment @coderabbitai help to get the list of available commands and usage tips.

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/dev/build/tasks/create_archives_task.ts (1)

33-79: ⚠️ Potential issue | 🟠 Major

Promise.allSettled may silently swallow archive creation failures.

Using Promise.allSettled without checking for rejected promises means that if any platform's archive creation fails, the error is silently ignored. The metrics loop at lines 82-94 will then process fewer archives than expected, and the task will appear successful despite partial failures.

Consider using Promise.all to propagate errors, or explicitly checking for rejected promises:

🐛 Proposed fix to handle rejected promises
-    await Promise.allSettled(
+    const results = await Promise.allSettled(
       config.getTargetPlatforms().map(async (platform) => {
         // ... existing code ...
       })
     );
+
+    const failures = results.filter(
+      (r): r is PromiseRejectedResult => r.status === 'rejected'
+    );
+    if (failures.length > 0) {
+      throw new AggregateError(
+        failures.map((f) => f.reason),
+        `Failed to create ${failures.length} archive(s)`
+      );
+    }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/dev/build/tasks/create_archives_task.ts` around lines 33 - 79, The task
currently uses Promise.allSettled when creating archives which can silently
ignore failures; update the code that invokes Promise.allSettled (the block
mapping config.getTargetPlatforms()) to either use Promise.all so that any
rejected promise will propagate and fail the task, or keep Promise.allSettled
but inspect the returned results and throw an Error if any entry has status ===
'rejected'; ensure this check happens before the later metrics loop that reads
the archives array so partial successes are not treated as full success. Target
symbols: Promise.allSettled (replace or wrap), config.getTargetPlatforms(),
archives array, and the async handlers that call compressZip/compressTar.
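The failure-propagation pattern the bot suggests can be sketched generically. This is an illustrative helper, not code from the PR; the bot's own diff uses AggregateError, which a plain Error stands in for here:

```typescript
// Run tasks concurrently but surface failures, unlike a bare
// Promise.allSettled whose rejections would otherwise go unchecked.
async function runAll<T>(tasks: Array<() => Promise<T>>): Promise<T[]> {
  const results = await Promise.allSettled(tasks.map((t) => t()));
  const failures = results.filter(
    (r): r is PromiseRejectedResult => r.status === 'rejected'
  );
  if (failures.length > 0) {
    // On Node 15+ an AggregateError could carry the reasons directly.
    throw new Error(
      `${failures.length} task(s) failed: ` +
        failures.map((f) => String(f.reason)).join('; ')
    );
  }
  return results.map((r) => (r as PromiseFulfilledResult<T>).value);
}
```

The key point is that the rejection check happens before any follow-up work reads the results, so partial success is never reported as full success.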
🧹 Nitpick comments (3)
src/dev/build/tasks/generate_packages_optimized_assets.ts (1)

65-75: Consider using async brotliCompress for better event loop behavior.

brotliCompressSync is a blocking operation that can cause noticeable event loop stalls, especially for larger files. While the concurrency limit helps, using the async util.promisify(zlib.brotliCompress) variant could improve responsiveness without sacrificing parallelism.

That said, for a build tool where throughput matters more than responsiveness, the synchronous version may be acceptable.

♻️ Optional: Use async brotliCompress
+import { promisify } from 'util';
+const brotliCompress = promisify(zlib.brotliCompress);
+
 await asyncForEachWithLimit(compressFiles, PARALLEL_CONCURRENCY, async (file) => {
   const content = await Fsp.readFile(file);
-  const compressed = zlib.brotliCompressSync(content, {
+  const compressed = await brotliCompress(content, {
     params: {
       [zlib.constants.BROTLI_PARAM_QUALITY]: BROTLI_QUALITY,
     },
   });
   await Fsp.writeFile(file + '.br', compressed);
 });
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/dev/build/tasks/generate_packages_optimized_assets.ts` around lines 65 -
75, The current code in generate_packages_optimized_assets.ts uses the blocking
zlib.brotliCompressSync inside the asyncForEachWithLimit loop (referenced by
zlib.brotliCompressSync, asyncForEachWithLimit, BROTLI_QUALITY), which can stall
the event loop; change to the async variant by creating a promisified
brotliCompress (e.g., const brotliCompressAsync =
util.promisify(zlib.brotliCompress)) or use zlib.brotliCompress directly with a
Promise wrapper, then replace the synchronous call so you await
brotliCompressAsync(content, { params: { [zlib.constants.BROTLI_PARAM_QUALITY]:
BROTLI_QUALITY } }) and write the result to file + '.br'; ensure you add the
necessary util import and adjust error handling as needed.
src/dev/build/lib/scan_copy.ts (1)

17-18: Consider using bitwise OR for combining flags.

While arithmetic addition (+) works correctly here since the flag values are distinct powers of two, using bitwise OR (|) is the idiomatic approach for combining bit flags and makes the intent clearer:

-const COPY_FLAGS = Fs.constants.COPYFILE_EXCL + Fs.constants.COPYFILE_FICLONE;
+const COPY_FLAGS = Fs.constants.COPYFILE_EXCL | Fs.constants.COPYFILE_FICLONE;
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/dev/build/lib/scan_copy.ts` around lines 17 - 18, Replace the arithmetic
addition used to combine flags with a bitwise OR: update the constant COPY_FLAGS
so it uses Fs.constants.COPYFILE_EXCL | Fs.constants.COPYFILE_FICLONE (instead
of +) to clearly and idiomatically combine the bit flags when setting up
exclusive create + copy-on-write behavior.
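For context, the combined flags behave as follows; a runnable sketch (the wrapper function name is illustrative):

```typescript
import * as Fs from 'fs';
import * as Os from 'os';
import * as Path from 'path';

// COPYFILE_EXCL: fail with EEXIST if the destination already exists.
// COPYFILE_FICLONE: attempt a copy-on-write reflink and silently fall
// back to a regular byte copy when the filesystem does not support it.
const COPY_FLAGS = Fs.constants.COPYFILE_EXCL | Fs.constants.COPYFILE_FICLONE;

function cloneFile(src: string, dest: string): void {
  Fs.copyFileSync(src, dest, COPY_FLAGS);
}
```

Because FICLONE falls back gracefully, the reflink optimization is safe to enable unconditionally; only COPYFILE_FICLONE_FORCE would error on unsupported filesystems.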
src/dev/build/lib/fs.ts (1)

235-240: Validate gzipLevel bounds before createGzip

gzipLevel is user-provided through options; validating 0..9 upfront gives a clearer error than surfacing a zlib failure later.

Suggested fix
 export async function compressTar({
   source,
   destination,
   gzipLevel = 6,
   createRootDirectory,
   rootDirectoryName,
 }: CompressTarOptions) {
+  if (!Number.isInteger(gzipLevel) || gzipLevel < 0 || gzipLevel > 9) {
+    throw new TypeError(`gzipLevel must be an integer between 0 and 9, got ${gzipLevel}`);
+  }
+
   const folder = rootDirectoryName ?? basename(source);

Also applies to: 259-259

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/dev/build/lib/fs.ts` around lines 235 - 240, The function compressTar
accepts a user-supplied gzipLevel but never validates it before calling
createGzip, which can surface obscure zlib errors; add an early check in
compressTar to ensure gzipLevel is a number between 0 and 9 (inclusive) and
throw a clear RangeError (e.g., "gzipLevel must be an integer between 0 and 9")
if it is out of bounds or not an integer, then proceed to call createGzip with
the validated value; apply the same validation wherever gzipLevel is used
(referencing the gzipLevel parameter and the createGzip call).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/dev/build/lib/fs.ts`:
- Around line 244-253: The bug is that using the nullish coalescing operator for
folder (const folder = rootDirectoryName ?? basename(source)) allows an empty
string to be used, producing absolute tar paths like "/file"; change the
fallback to treat empty string as missing (e.g., compute folder =
rootDirectoryName || basename(source) or use a conditional that checks
rootDirectoryName.length > 0) so folder is never empty, then keep the existing
tar map logic (header.name = folder + '/' + header.name) — reference symbols:
rootDirectoryName, folder, basename(source), tarFsPack(...).map, header.name.

In `@src/dev/license_checker/config.ts`:
- Around line 102-103: Update the license override entries in
src/dev/license_checker/config.ts for the two keys
'@swc/core-linux-x64-gnu@1.15.18' and '@swc/core-linux-x64-musl@1.15.18' to the
correct dual-license string "Apache-2.0 AND MIT" instead of just "MIT", and then
scan your lockfiles for any other resolved `@swc/core-`* variants (e.g.,
darwin-x64, darwin-arm64, linux-arm64-gnu, linux-arm64-musl, win32-x64-msvc,
etc.) and add corresponding explicit override entries with the same "Apache-2.0
AND MIT" value so every resolved binary variant is covered.

---

Outside diff comments:
In `@src/dev/build/tasks/create_archives_task.ts`:
- Around line 33-79: The task currently uses Promise.allSettled when creating
archives which can silently ignore failures; update the code that invokes
Promise.allSettled (the block mapping config.getTargetPlatforms()) to either use
Promise.all so that any rejected promise will propagate and fail the task, or
keep Promise.allSettled but inspect the returned results and throw an Error if
any entry has status === 'rejected'; ensure this check happens before the later
metrics loop that reads the archives array so partial successes are not treated
as full success. Target symbols: Promise.allSettled (replace or wrap),
config.getTargetPlatforms(), archives array, and the async handlers that call
compressZip/compressTar.

---

Nitpick comments:
In `@src/dev/build/lib/fs.ts`:
- Around line 235-240: The function compressTar accepts a user-supplied
gzipLevel but never validates it before calling createGzip, which can surface
obscure zlib errors; add an early check in compressTar to ensure gzipLevel is a
number between 0 and 9 (inclusive) and throw a clear RangeError (e.g.,
"gzipLevel must be an integer between 0 and 9") if it is out of bounds or not an
integer, then proceed to call createGzip with the validated value; apply the
same validation wherever gzipLevel is used (referencing the gzipLevel parameter
and the createGzip call).

In `@src/dev/build/lib/scan_copy.ts`:
- Around line 17-18: Replace the arithmetic addition used to combine flags with
a bitwise OR: update the constant COPY_FLAGS so it uses
Fs.constants.COPYFILE_EXCL | Fs.constants.COPYFILE_FICLONE (instead of +) to
clearly and idiomatically combine the bit flags when setting up exclusive create
+ copy-on-write behavior.

In `@src/dev/build/tasks/generate_packages_optimized_assets.ts`:
- Around line 65-75: The current code in generate_packages_optimized_assets.ts
uses the blocking zlib.brotliCompressSync inside the asyncForEachWithLimit loop
(referenced by zlib.brotliCompressSync, asyncForEachWithLimit, BROTLI_QUALITY),
which can stall the event loop; change to the async variant by creating a
promisified brotliCompress (e.g., const brotliCompressAsync =
util.promisify(zlib.brotliCompress)) or use zlib.brotliCompress directly with a
Promise wrapper, then replace the synchronous call so you await
brotliCompressAsync(content, { params: { [zlib.constants.BROTLI_PARAM_QUALITY]:
BROTLI_QUALITY } }) and write the result to file + '.br'; ensure you add the
necessary util import and adjust error handling as needed.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yml

Review profile: CHILL

Plan: Pro

Run ID: 24dd1af6-e59a-4e3d-b659-0dd3616b70cb

📥 Commits

Reviewing files that changed from the base of the PR and between f7091ba and aa803a9.

⛔ Files ignored due to path filters (1)
  • yarn.lock is excluded by !**/yarn.lock, !**/*.lock
📒 Files selected for processing (17)
  • .buildkite/scripts/steps/security/third_party_packages.txt
  • package.json
  • packages/kbn-yarn-install-scripts/config.json
  • renovate.json
  • src/dev/build/build_distributables.ts
  • src/dev/build/lib/fs.ts
  • src/dev/build/lib/integration_tests/fs.test.ts
  • src/dev/build/lib/scan_copy.ts
  • src/dev/build/lib/scan_delete.ts
  • src/dev/build/lib/tar_fs.d.ts
  • src/dev/build/tasks/build_packages_task.ts
  • src/dev/build/tasks/create_archives_task.ts
  • src/dev/build/tasks/create_cdn_assets_task.ts
  • src/dev/build/tasks/generate_packages_optimized_assets.ts
  • src/dev/build/tasks/os_packages/docker_generator/bundle_dockerfiles.ts
  • src/dev/license_checker/config.ts
  • typings/tar_fs.d.ts

Contributor

@elena-shostak elena-shostak left a comment


LGTM

Contributor

@coderabbitai coderabbitai bot left a comment


🧹 Nitpick comments (1)
src/dev/build/lib/fs.ts (1)

237-265: Add an explicit return type to compressTar.

This helper is exported and now has a larger public surface. Declaring Promise<number> makes the file-count contract obvious and prevents accidental widening later.

Suggested change
 export async function compressTar({
   source,
   destination,
   gzipLevel = 6,
   createRootDirectory,
   rootDirectoryName,
-}: CompressTarOptions) {
+}: CompressTarOptions): Promise<number> {

As per coding guidelines, "Prefer explicit return types for public APIs and exported functions in TypeScript."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/dev/build/lib/fs.ts` around lines 237 - 265, The exported helper
compressTar currently has no explicit return type; update its declaration to
explicitly return Promise<number> (reflecting the fileCount result) to make the
public contract clear and prevent widening later—modify the compressTar function
signature to include : Promise<number> and keep the implementation returning
fileCount as before.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Nitpick comments:
In `@src/dev/build/lib/fs.ts`:
- Around line 237-265: The exported helper compressTar currently has no explicit
return type; update its declaration to explicitly return Promise<number>
(reflecting the fileCount result) to make the public contract clear and prevent
widening later—modify the compressTar function signature to include :
Promise<number> and keep the implementation returning fileCount as before.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yml

Review profile: CHILL

Plan: Pro

Run ID: 436a549d-3cf2-4f1a-a750-e5df04d48c1b

📥 Commits

Reviewing files that changed from the base of the PR and between aa803a9 and 89198e6.

📒 Files selected for processing (1)
  • src/dev/build/lib/fs.ts

@elastic elastic deleted a comment from elasticmachine Mar 16, 2026
@elasticmachine
Contributor

⏳ Build in-progress, with failures

Failed CI Steps

Test Failures

  • [job] [logs] Jest Tests #3 / SearchBar add filter
  • [job] [logs] FTR Configs #157 / dashboard app - group 2 full screen mode exits full screen mode when back button pressed
  • [job] [logs] Scout: [ platform / dashboard-stateful-classic ] plugin / local-stateful-classic - dashboard REST schema - Registered embeddable schemas have not changed

History

@mistic mistic merged commit 3750017 into elastic:main Mar 18, 2026
19 checks passed
@kibanamachine
Contributor

Starting backport for target branches: 8.19, 9.2, 9.3

https://github.com/elastic/kibana/actions/runs/23228696240

@kibanamachine
Contributor

💔 All backports failed

Status Branch Result
8.19 Backport failed because of merge conflicts

You might need to backport the following PRs to 8.19:
- Remove --no-experimental-require-module from node.options (#256379)
9.2 Backport failed because of merge conflicts
9.3 Backport failed because of merge conflicts

You might need to backport the following PRs to 9.3:
- Remove --no-experimental-require-module from node.options (#256379)

Manual backport

To create the backport manually run:

node scripts/backport --pr 256377

Questions ?

Please refer to the Backport tool documentation

mistic added a commit to mistic/kibana that referenced this pull request Mar 18, 2026
…stic#256377)

Closes elastic/kibana-operations#468

This PR reduces the kibana distributable build time by ~2 minutes
through targeted tool replacements and concurrency improvements.

- **JS/CSS minification:** Replaced gulp-terser and gulp-postcss with
@swc/core (the one used at rspack) and lightningcss (also used at
rspack) for asset optimization. Parallelized Brotli compression across
files. Lowered Brotli quality from 11 to 9.
- **Archive creation:** Replaced archiver with tar-fs + native
zlib.createGzip. Lowered gzip level from 9 to 6.
- **Concurrency tuning:** Scaled asyncForEachWithLimit in
build_packages_task.ts to cpus().length. Added missing default
concurrency to scanDelete. Added mergeMap concurrency caps in scanCopy.
- **Copy-on-Write:** Enabled COPYFILE_FICLONE in scanCopy for instant
reflink copies on supported filesystems.

**Regarding new dependencies:**

**Purpose:** The new dependencies are going to be used in tasks related
with optimizing distributable assets (code and css minification) and
also producing gzip files.
**Justification:** The ones we had before (mainly terser) were not
suited for the job in terms of performance and the new ones are already
aligned with what we will want and use for the new optimizer being
planned itself.
**Alternatives explored:** Those are the ones being used by RSPack
itself which is the technology we want to use for bundling going
forward. No others were explored.
**Existing dependencies:** It does but mostly points to the legacy old
known dependencies used in the webpack environment itself.

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **Chores**
* Added new JS/CSS tooling and packaging packages, install hook entry,
and automated update rules.
  * Extended license allowlist.

* **Performance**
* Improved parallelism across build steps, capped to CPU count and added
pipeline limits.
* Switched to per-file minification/compression, changed archive packing
flow, and lowered gzip compression level for faster builds.

* **Tests**
* Added integration tests validating archive creation, contents, and
file counts.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->

---------

Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
(cherry picked from commit 3750017)

# Conflicts:
#	.buildkite/scripts/steps/security/third_party_packages.txt
#	package.json
#	renovate.json
#	yarn.lock
mistic added a commit to mistic/kibana that referenced this pull request Mar 18, 2026
…stic#256377)

mistic added a commit to mistic/kibana that referenced this pull request Mar 18, 2026
…stic#256377)


# Conflicts:
#	.buildkite/scripts/steps/security/third_party_packages.txt
#	package.json
#	renovate.json
#	src/dev/build/lib/fs.ts
#	src/dev/build/tasks/build_packages_task.ts
#	src/dev/build/tasks/generate_packages_optimized_assets.ts
#	yarn.lock
@mistic

mistic commented Mar 18, 2026

💚 All backports created successfully

Branches: 9.3, 9.2, 8.19

Note: Successful backport PRs will be merged automatically after passing CI.

Questions?

Please refer to the Backport tool documentation

mistic added a commit that referenced this pull request Mar 18, 2026
#256377) (#258261)

# Backport

This will backport the following commits from `main` to `9.3`:
- [chore(NA): improve performance of kibana dev/build distributable
(#256377)](#256377)

<!--- Backport version: 10.2.0 -->

### Questions ?
Please refer to the [Backport tool
documentation](https://github.com/sorenlouv/backport)


---------

Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
mistic added a commit that referenced this pull request Mar 18, 2026
#256377) (#258262)

# Backport

This will backport the following commits from `main` to `9.2`:
- [chore(NA): improve performance of kibana dev/build distributable
(#256377)](#256377)

<!--- Backport version: 10.2.0 -->

### Questions ?
Please refer to the [Backport tool
documentation](https://github.com/sorenlouv/backport)


---------

Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
mistic added a commit that referenced this pull request Mar 18, 2026
…le (#256377) (#258263)

# Backport

This will backport the following commits from `main` to `8.19`:
- [chore(NA): improve performance of kibana dev/build distributable
(#256377)](#256377)

<!--- Backport version: 10.2.0 -->

### Questions ?
Please refer to the [Backport tool
documentation](https://github.com/sorenlouv/backport)


---------

Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
szwarckonrad pushed a commit to szwarckonrad/kibana that referenced this pull request Mar 18, 2026
…stic#256377)

qn895 pushed a commit to qn895/kibana that referenced this pull request Mar 18, 2026
…stic#256377)

jeramysoucy pushed a commit to jeramysoucy/kibana that referenced this pull request Mar 26, 2026
…stic#256377)

Labels

`backport:all-open`, `chore`, `release_note:skip`, `Team:Operations`, `v8.19.13`, `v9.2.7`, `v9.3.2`, `v9.4.0`
