doc: update zlib doc
Just some general improvements to zlib docs and examples

Signed-off-by: James M Snell <[email protected]>

PR-URL: #31665
Reviewed-By: Anna Henningsen <[email protected]>
Reviewed-By: Luigi Pinca <[email protected]>
jasnell authored and codebytere committed Feb 27, 2020
1 parent 8516602 commit e3258fd
doc/api/zlib.md: 144 additions & 46 deletions

@@ -4,60 +4,121 @@

> Stability: 2 - Stable

The `zlib` module provides compression functionality implemented using Gzip,
Deflate/Inflate, and Brotli.

To access it:

```js
const zlib = require('zlib');
```

Compression and decompression are built around the Node.js [Streams API][].

Compressing or decompressing a stream (such as a file) can be accomplished by
piping the source stream through a `zlib` `Transform` stream into a destination
stream:

```js
const { createGzip } = require('zlib');
const { pipeline } = require('stream');
const {
  createReadStream,
  createWriteStream
} = require('fs');

const gzip = createGzip();
const source = createReadStream('input.txt');
const destination = createWriteStream('input.txt.gz');

pipeline(source, gzip, destination, (err) => {
  if (err) {
    console.error('An error occurred:', err);
    process.exitCode = 1;
  }
});

// Or, Promisified

const { promisify } = require('util');
const pipe = promisify(pipeline);

async function do_gzip(input, output) {
  const gzip = createGzip();
  const source = createReadStream(input);
  const destination = createWriteStream(output);
  await pipe(source, gzip, destination);
}

do_gzip('input.txt', 'input.txt.gz')
  .catch((err) => {
    console.error('An error occurred:', err);
    process.exitCode = 1;
  });
```

It is also possible to compress or decompress data in a single step:

```js
const { deflate, unzip } = require('zlib');

const input = '.................................';
deflate(input, (err, buffer) => {
  if (err) {
    console.error('An error occurred:', err);
    process.exitCode = 1;
    return;
  }
  console.log(buffer.toString('base64'));
});

const buffer = Buffer.from('eJzT0yMAAGTvBe8=', 'base64');
unzip(buffer, (err, buffer) => {
  if (err) {
    console.error('An error occurred:', err);
    process.exitCode = 1;
    return;
  }
  console.log(buffer.toString());
});

// Or, Promisified

const { promisify } = require('util');
const do_unzip = promisify(unzip);

do_unzip(buffer)
  .then((buf) => console.log(buf.toString()))
  .catch((err) => {
    console.error('An error occurred:', err);
    process.exitCode = 1;
  });
```

## Threadpool Usage and Performance Considerations

All `zlib` APIs, except those that are explicitly synchronous, use the Node.js
internal threadpool. This can lead to surprising effects and performance
limitations in some applications.

Creating and using a large number of zlib objects simultaneously can cause
significant memory fragmentation.

```js
const zlib = require('zlib');

const payload = Buffer.from('This is some data');

// WARNING: DO NOT DO THIS!
for (let i = 0; i < 30000; ++i) {
  zlib.deflate(payload, (err, buffer) => {});
}
```

In the preceding example, 30,000 deflate instances are created concurrently.
Because of how some operating systems handle memory allocation and
deallocation, this may lead to significant memory fragmentation.

It is strongly recommended that the results of compression
operations be cached to avoid duplication of effort.
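
As a rough sketch of that caching approach (the in-memory `cache` map and the
`compressAsset()` helper below are hypothetical illustrations, not part of the
`zlib` API):

```js
const zlib = require('zlib');

// Hypothetical in-memory cache: each unique payload is compressed only once.
const cache = new Map();

function compressAsset(key, payload) {
  if (!cache.has(key)) {
    // gzipSync is used for brevity; the asynchronous APIs work equally well.
    cache.set(key, zlib.gzipSync(payload));
  }
  return cache.get(key);
}

// Both calls perform a single compression; the second is served from the cache.
const first = compressAsset('greeting', 'Hello World');
const second = compressAsset('greeting', 'Hello World');
console.log(first === second); // true
```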

## Compressing HTTP requests and responses

@@ -80,26 +141,35 @@ tradeoffs involved in `zlib` usage.
const zlib = require('zlib');
const http = require('http');
const fs = require('fs');
const { pipeline } = require('stream');

const request = http.get({ host: 'example.com',
                           path: '/',
                           port: 80,
                           headers: { 'Accept-Encoding': 'br,gzip,deflate' } });
request.on('response', (response) => {
  const output = fs.createWriteStream('example.com_index.html');

  const onError = (err) => {
    if (err) {
      console.error('An error occurred:', err);
      process.exitCode = 1;
    }
  };

  switch (response.headers['content-encoding']) {
    case 'br':
      pipeline(response, zlib.createBrotliDecompress(), output, onError);
      break;
    // Or, just use zlib.createUnzip() to handle both of the following cases:
    case 'gzip':
      pipeline(response, zlib.createGunzip(), output, onError);
      break;
    case 'deflate':
      pipeline(response, zlib.createInflate(), output, onError);
      break;
    default:
      pipeline(response, output, onError);
      break;
  }
});
@@ -112,6 +182,8 @@ request.on('response', (response) => {
const zlib = require('zlib');
const http = require('http');
const fs = require('fs');
const { pipeline } = require('stream');

http.createServer((request, response) => {
  const raw = fs.createReadStream('index.html');
  // Store both a compressed and an uncompressed version of the resource.
@@ -121,20 +193,32 @@ http.createServer((request, response) => {
    acceptEncoding = '';
  }

  const onError = (err) => {
    if (err) {
      // If an error occurs, there's not much we can do because
      // the server has already sent the 200 response code and
      // some amount of data has already been sent to the client.
      // The best we can do is terminate the response immediately
      // and log the error.
      response.end();
      console.error('An error occurred:', err);
    }
  };

  // Note: This is not a conformant accept-encoding parser.
  // See https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.3
  if (/\bdeflate\b/.test(acceptEncoding)) {
    response.writeHead(200, { 'Content-Encoding': 'deflate' });
    pipeline(raw, zlib.createDeflate(), response, onError);
  } else if (/\bgzip\b/.test(acceptEncoding)) {
    response.writeHead(200, { 'Content-Encoding': 'gzip' });
    pipeline(raw, zlib.createGzip(), response, onError);
  } else if (/\bbr\b/.test(acceptEncoding)) {
    response.writeHead(200, { 'Content-Encoding': 'br' });
    pipeline(raw, zlib.createBrotliCompress(), response, onError);
  } else {
    response.writeHead(200, {});
    pipeline(raw, response, onError);
  }
}).listen(1337);
```
@@ -154,11 +238,11 @@ zlib.unzip(
  // For Brotli, the equivalent is zlib.constants.BROTLI_OPERATION_FLUSH.
  { finishFlush: zlib.constants.Z_SYNC_FLUSH },
  (err, buffer) => {
    if (err) {
      console.error('An error occurred:', err);
      process.exitCode = 1;
      return;
    }
    console.log(buffer.toString());
  });
```

@@ -234,14 +318,28 @@ HTTP response to the client:
```js
const zlib = require('zlib');
const http = require('http');
const { pipeline } = require('stream');

http.createServer((request, response) => {
  // For the sake of simplicity, the Accept-Encoding checks are omitted.
  response.writeHead(200, { 'content-encoding': 'gzip' });
  const output = zlib.createGzip();
  let i;

  pipeline(output, response, (err) => {
    if (err) {
      // If an error occurs, there's not much we can do because
      // the server has already sent the 200 response code and
      // some amount of data has already been sent to the client.
      // The best we can do is terminate the response immediately
      // and log the error.
      clearInterval(i);
      response.end();
      console.error('An error occurred:', err);
    }
  });

  i = setInterval(() => {
    output.write(`The current time is ${Date()}\n`, () => {
      // The data has been passed to zlib, but the compression algorithm may
      // have decided to buffer the data for more efficient compression.
@@ -399,7 +497,7 @@ changes:

<!--type=misc-->

Each zlib-based class takes an `options` object. No options are required.

Some options are only relevant when compressing and are
ignored by the decompression classes.
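
As an illustrative sketch of passing options (the specific values below are
arbitrary examples, not recommended defaults):

```js
const zlib = require('zlib');

// All options are optional; these values are purely illustrative.
const gzip = zlib.createGzip({
  level: zlib.constants.Z_BEST_COMPRESSION, // compression-only option
  memLevel: 8
});

// The decompression classes accept the same options object; compression-only
// options such as `level` are simply ignored by them.
const gunzip = zlib.createGunzip({ chunkSize: 32 * 1024 });
```
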
@@ -1058,6 +1156,6 @@ Decompress a chunk of data with [`Unzip`][].
[Brotli parameters]: #zlib_brotli_constants
[Memory Usage Tuning]: #zlib_memory_usage_tuning
[RFC 7932]: https://www.rfc-editor.org/rfc/rfc7932.txt
[Streams API]: stream.md
[zlib documentation]: https://zlib.net/manual.html#Constants
[zlib.createGzip example]: #zlib_zlib
