
docs(json): lint @std/json docs #4798

Merged (14 commits) on Jun 18, 2024
1 change: 1 addition & 0 deletions _tools/check_docs.ts
@@ -48,6 +48,7 @@ const ENTRY_POINTS = [
"../http/mod.ts",
"../ini/mod.ts",
"../internal/mod.ts",
"../json/mod.ts",
"../jsonc/mod.ts",
"../media_types/mod.ts",
"../msgpack/mod.ts",
86 changes: 72 additions & 14 deletions json/concatenated_json_parse_stream.ts
@@ -12,32 +12,90 @@ const primitives = new Map(
);

/**
* Stream to parse {@link https://en.wikipedia.org/wiki/JSON_streaming#Concatenated_JSON|Concatenated JSON}.
* Stream to parse
* {@link https://en.wikipedia.org/wiki/JSON_streaming#Concatenated_JSON | Concatenated JSON}.
*
* @example Usage
*
* @example
* ```ts
* import { ConcatenatedJsonParseStream } from "@std/json/concatenated-json-parse-stream";
* import { assertEquals } from "@std/assert/assert-equals";
*
* const url = "@std/json/testdata/test.concatenated-json";
* const { body } = await fetch(url);
*
* const readable = body!
* .pipeThrough(new TextDecoderStream()) // convert Uint8Array to string
* .pipeThrough(new ConcatenatedJsonParseStream()); // parse Concatenated JSON
* const stream = ReadableStream.from([
* `{"foo":"bar"}`,
* `{"baz":100}`,
* ]).pipeThrough(new ConcatenatedJsonParseStream());
*
* for await (const data of readable) {
* console.log(data);
* }
* assertEquals(await Array.fromAsync(stream), [
* { foo: "bar" },
* { baz: 100 },
* ]);
* ```
*/
export class ConcatenatedJsonParseStream
implements TransformStream<string, JsonValue> {
/** A writable stream of byte data. */
// TODO(iuioiua): Investigate why this class is implemented differently to the other JSON streams.
Member comment:
JsonParseStream assumes every chunk is a valid JSON string representing a single JSON value. ConcatenatedJsonParseStream handles a stream of strings in which each chunk may be an incomplete fragment of a JSON string.

This difference exists because in NDJSON and JSON Lines, each line contains a single JSON value, so TextLineStream can split the input and pass the results to JsonParseStream. In Concatenated JSON, on the other hand, there is no explicit delimiter between values, so the class needs to maintain internal parsing state.
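The contrast the comment describes can be sketched with two toy parsers (a hypothetical illustration, not the std implementation): NDJSON can be split on newlines before parsing, while concatenated JSON forces the parser to carry nesting and string state across chunk boundaries.

```typescript
// Hypothetical sketch, not the std implementation.

// NDJSON: each line is a complete JSON value, so splitting on "\n"
// before parsing is enough.
function parseNdjson(chunks: string[]): unknown[] {
  return chunks.join("").split("\n")
    .filter((line) => line.trim() !== "")
    .map((line) => JSON.parse(line));
}

// Concatenated JSON: no delimiter between values, so the parser must
// track nesting depth and in-string state across chunk boundaries to
// know where one value ends. (Objects and arrays only, for brevity.)
function parseConcatenated(chunks: string[]): unknown[] {
  const values: unknown[] = [];
  let buf = "";         // text of the value currently being assembled
  let depth = 0;        // {} / [] nesting depth
  let inString = false; // inside a JSON string literal?
  let escaped = false;  // previous char was a backslash inside a string?
  for (const chunk of chunks) {
    for (const ch of chunk) {
      buf += ch;
      if (escaped) {
        escaped = false;
      } else if (inString) {
        if (ch === "\\") escaped = true;
        else if (ch === `"`) inString = false;
      } else if (ch === `"`) {
        inString = true;
      } else {
        if (ch === "{" || ch === "[") depth++;
        else if (ch === "}" || ch === "]") depth--;
        if (depth === 0 && buf.trim() !== "") {
          values.push(JSON.parse(buf));
          buf = "";
        }
      }
    }
  }
  return values;
}

// A chunk boundary can even fall inside a string literal:
parseConcatenated([`{"foo":"ba`, `r"}{"baz":100}`]);
// returns [{ foo: "bar" }, { baz: 100 }]
```

Backpressure, error handling, and top-level primitives are omitted here; the real ConcatenatedJsonParseStream covers those cases.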

/**
Contributor comment:
I went with this for now. I didn't write a proper example because exposing this as a public property seems like a mistake, and it may be removed in the future.

Member comment:
To my understanding, readable and writable need to be exposed for a TransformStream instance to work as a TransformStream.

Contributor comment:
I just meant that this implements TransformStream rather than extending it.
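The distinction being discussed can be shown in a standalone sketch (hypothetical code, not the std implementation): extending TransformStream means passing a transformer to super(), while implementing the interface only requires exposing readable and writable properties, however they are produced. The latter is what lets a generator-based class built with toTransformStream() still be usable with pipeThrough().

```typescript
// Hypothetical sketch, not the std implementation.

// Extending: reuse TransformStream's machinery by passing a
// transformer to super(). Works when each chunk maps to output
// directly in transform().
class UpperCaseStream extends TransformStream<string, string> {
  constructor() {
    super({
      transform(chunk, controller) {
        controller.enqueue(chunk.toUpperCase());
      },
    });
  }
}

// Implementing: only promise the { readable, writable } shape.
// The class is free to construct those ends however it likes,
// e.g. from an async generator, as toTransformStream() does.
class ShoutStream implements TransformStream<string, string> {
  readonly readable: ReadableStream<string>;
  readonly writable: WritableStream<string>;
  constructor() {
    // Delegate to an inner TransformStream for brevity.
    const inner = new TransformStream<string, string>({
      transform(chunk, controller) {
        controller.enqueue(chunk.toUpperCase() + "!");
      },
    });
    this.readable = inner.readable;
    this.writable = inner.writable;
  }
}
```

Either shape satisfies pipeThrough(), which only requires a { readable, writable } pair, and that is why the two properties must stay public.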

* A writable stream of byte data.
*
* @example Usage
* ```ts
* import { ConcatenatedJsonParseStream } from "@std/json/concatenated-json-parse-stream";
* import { assertEquals } from "@std/assert/assert-equals";
*
* const stream = ReadableStream.from([
* `{"foo":"bar"}`,
* `{"baz":100}`,
* ]).pipeThrough(new ConcatenatedJsonParseStream());
*
* assertEquals(await Array.fromAsync(stream), [
* { foo: "bar" },
* { baz: 100 },
* ]);
* ```
*/
readonly writable: WritableStream<string>;
/** A readable stream of byte data. */
// TODO(iuioiua): Investigate why this class is implemented differently to the other JSON streams.
/**
* A readable stream of byte data.
*
* @example Usage
* ```ts
* import { ConcatenatedJsonParseStream } from "@std/json/concatenated-json-parse-stream";
* import { assertEquals } from "@std/assert/assert-equals";
*
* const stream = ReadableStream.from([
* `{"foo":"bar"}`,
* `{"baz":100}`,
* ]).pipeThrough(new ConcatenatedJsonParseStream());
*
* assertEquals(await Array.fromAsync(stream), [
* { foo: "bar" },
* { baz: 100 },
* ]);
* ```
*/
readonly readable: ReadableStream<JsonValue>;

/** Constructs a new instance. */
/**
* Constructs a new instance.
*
* @example Usage
* ```ts
* import { ConcatenatedJsonParseStream } from "@std/json/concatenated-json-parse-stream";
* import { assertEquals } from "@std/assert/assert-equals";
*
* const stream = ReadableStream.from([
* `{"foo":"bar"}`,
* `{"baz":100}`,
* ]).pipeThrough(new ConcatenatedJsonParseStream());
*
* assertEquals(await Array.fromAsync(stream), [
* { foo: "bar" },
* { baz: 100 },
* ]);
* ```
*/
constructor({ writableStrategy, readableStrategy }: ParseStreamOptions = {}) {
const { writable, readable } = toTransformStream(
this.#concatenatedJSONIterator,
92 changes: 64 additions & 28 deletions json/json_parse_stream.ts
@@ -17,48 +17,84 @@ function isBrankString(str: string)
* {@link https://www.rfc-editor.org/rfc/rfc7464.html | JSON Text Sequences}.
* Chunks consisting of spaces, tab characters, or newline characters will be ignored.
*
* @example
* parse JSON lines or NDJSON
* @example Basic usage
*
* ```ts
* import { TextLineStream } from "@std/streams/text-line-stream";
* import { JsonParseStream } from "@std/json/json-parse-stream";
* import { assertEquals } from "@std/assert/assert-equals";
*
* const url = "@std/json/testdata/test.jsonl";
* const { body } = await fetch(url);
*
* const readable = body!
* .pipeThrough(new TextDecoderStream()) // convert Uint8Array to string
* .pipeThrough(new TextLineStream()) // transform into a stream where each chunk is divided by a newline
* .pipeThrough(new JsonParseStream()); // parse each chunk as JSON
* const stream = ReadableStream.from([
* `{"foo":"bar"}\n`,
* `{"baz":100}\n`
* ]).pipeThrough(new JsonParseStream());
*
* for await (const data of readable) {
* console.log(data);
* }
* assertEquals(await Array.fromAsync(stream), [
* { foo: "bar" },
* { baz: 100 }
* ]);
* ```
*
* @example
* parse JSON Text Sequences
* @example parse JSON lines or NDJSON from a file
* ```ts
* import { TextDelimiterStream } from "@std/streams/text-delimiter-stream";
* import { TextLineStream } from "@std/streams/text-line-stream";
* import { JsonParseStream } from "@std/json/json-parse-stream";
* import { assertEquals } from "@std/assert/assert-equals";
*
* const url =
* "@std/json/testdata/test.json-seq";
* const { body } = await fetch(url);
* const file = await Deno.open("json/testdata/test.jsonl");
*
* const delimiter = "\x1E";
* const readable = body!
* .pipeThrough(new TextDecoderStream())
* .pipeThrough(new TextDelimiterStream(delimiter)) // transform into a stream where each chunk is divided by a delimiter
* .pipeThrough(new JsonParseStream());
* const readable = file.readable
* .pipeThrough(new TextDecoderStream()) // convert Uint8Array to string
* .pipeThrough(new TextLineStream()) // transform into a stream where each chunk is divided by a newline
* .pipeThrough(new JsonParseStream()); // parse each chunk as JSON
*
* for await (const data of readable) {
* console.log(data);
* }
* assertEquals(await Array.fromAsync(readable), [
* {"hello": "world"},
* ["👋", "👋", "👋"],
* {"deno": "🦕"},
* ]);
* ```
*/
export class JsonParseStream extends TransformStream<string, JsonValue> {
/** Constructs new instance. */
/**
 * Constructs a new instance.
*
* @example Basic usage
*
* ```ts
* import { JsonParseStream } from "@std/json/json-parse-stream";
* import { assertEquals } from "@std/assert/assert-equals";
*
* const stream = ReadableStream.from([
* `{"foo":"bar"}`,
* `{"baz":100}`,
* ]).pipeThrough(new JsonParseStream());
*
* assertEquals(await Array.fromAsync(stream), [
* { foo: "bar" },
* { baz: 100 },
* ]);
* ```
*
* @example parse JSON lines or NDJSON from a file
* ```ts
* import { TextLineStream } from "@std/streams/text-line-stream";
* import { JsonParseStream } from "@std/json/json-parse-stream";
* import { assertEquals } from "@std/assert/assert-equals";
*
* const file = await Deno.open("json/testdata/test.jsonl");
*
* const readable = file.readable
* .pipeThrough(new TextDecoderStream()) // convert Uint8Array to string
* .pipeThrough(new TextLineStream()) // transform into a stream where each chunk is divided by a newline
* .pipeThrough(new JsonParseStream()); // parse each chunk as JSON
*
* assertEquals(await Array.fromAsync(readable), [
* {"hello": "world"},
* ["👋", "👋", "👋"],
* {"deno": "🦕"},
* ]);
* ```
*/
constructor({ writableStrategy, readableStrategy }: ParseStreamOptions = {}) {
super(
{
67 changes: 44 additions & 23 deletions json/json_stringify_stream.ts
@@ -39,45 +39,50 @@ export interface StringifyStreamOptions {
*
* You can optionally specify a prefix and suffix for each chunk. The default prefix is `""` and the default suffix is `"\n"`.
*
* @example
* @example Basic usage
*
* ```ts
* import { JsonStringifyStream } from "@std/json/json-stringify-stream";
* import { assertEquals } from "@std/assert/assert-equals";
*
* const file = await Deno.open("./tmp.jsonl", { create: true, write: true });
* const stream = ReadableStream.from([{ foo: "bar" }, { baz: 100 }])
* .pipeThrough(new JsonStringifyStream());
*
* ReadableStream.from([{ foo: "bar" }, { baz: 100 }])
* .pipeThrough(new JsonStringifyStream()) // convert to JSON lines (ndjson)
* .pipeThrough(new TextEncoderStream()) // convert a string to a Uint8Array
* .pipeTo(file.writable)
* .then(() => console.log("write success"));
* assertEquals(await Array.fromAsync(stream), [
* `{"foo":"bar"}\n`,
* `{"baz":100}\n`
* ]);
* ```
*
* @example
* To convert to [JSON Text Sequences](https://www.rfc-editor.org/rfc/rfc7464.html), set the
* prefix to the delimiter "\x1E" as options.
* @example Stringify stream of JSON text sequences
*
* Set `options.prefix` to `\x1E` to stringify
* {@linkcode https://www.rfc-editor.org/rfc/rfc7464.html | JSON Text Sequences}.
*
* ```ts
* import { JsonStringifyStream } from "@std/json/json-stringify-stream";
* import { assertEquals } from "@std/assert/assert-equals";
*
* const file = await Deno.open("./tmp.jsonl", { create: true, write: true });
* const stream = ReadableStream.from([{ foo: "bar" }, { baz: 100 }])
* .pipeThrough(new JsonStringifyStream({ prefix: "\x1E", suffix: "\n" }));
*
* ReadableStream.from([{ foo: "bar" }, { baz: 100 }])
* .pipeThrough(new JsonStringifyStream({ prefix: "\x1E", suffix: "\n" })) // convert to JSON Text Sequences
* .pipeThrough(new TextEncoderStream())
* .pipeTo(file.writable)
* .then(() => console.log("write success"));
* assertEquals(await Array.fromAsync(stream), [
* `\x1E{"foo":"bar"}\n`,
* `\x1E{"baz":100}\n`
* ]);
* ```
*
* @example
* If you want to stream [JSON lines](https://jsonlines.org/) from the server:
* ```ts
* @example Stringify JSON lines from a server
*
* ```ts no-eval no-assert
* import { JsonStringifyStream } from "@std/json/json-stringify-stream";
*
* // A server that streams one line of JSON every second
* Deno.serve(() => {
* let intervalId: number | undefined;
* const readable = new ReadableStream({
* start(controller) {
* // enqueue data once per second
* // Enqueue data once per second
* intervalId = setInterval(() => {
* controller.enqueue({ now: new Date() });
* }, 1000);
@@ -88,15 +93,31 @@ export interface StringifyStreamOptions {
* });
*
* const body = readable
* .pipeThrough(new JsonStringifyStream()) // convert data to JSON lines
* .pipeThrough(new TextEncoderStream()); // convert a string to a Uint8Array
* .pipeThrough(new JsonStringifyStream()) // Convert data to JSON lines
* .pipeThrough(new TextEncoderStream()); // Convert a string to a Uint8Array
*
* return new Response(body);
* });
* ```
*/
export class JsonStringifyStream extends TransformStream<unknown, string> {
/** Constructs new instance. */
/**
 * Constructs a new instance.
*
* @example Usage
* ```ts
* import { JsonStringifyStream } from "@std/json/json-stringify-stream";
* import { assertEquals } from "@std/assert/assert-equals";
*
* const stream = ReadableStream.from([{ foo: "bar" }, { baz: 100 }])
* .pipeThrough(new JsonStringifyStream());
*
* assertEquals(await Array.fromAsync(stream), [
* `{"foo":"bar"}\n`,
* `{"baz":100}\n`
* ]);
* ```
*/
constructor({
prefix = "",
suffix = "\n",
16 changes: 15 additions & 1 deletion json/mod.ts
@@ -1,6 +1,20 @@
// Copyright 2018-2024 the Deno authors. All rights reserved. MIT license.

/** Utilities for parsing streaming JSON data.
/**
* Utilities for parsing streaming JSON data.
*
* ```ts
* import { JsonStringifyStream } from "@std/json";
* import { assertEquals } from "@std/assert/assert-equals";
*
* const stream = ReadableStream.from([{ foo: "bar" }, { baz: 100 }])
* .pipeThrough(new JsonStringifyStream());
*
* assertEquals(await Array.fromAsync(stream), [
* `{"foo":"bar"}\n`,
* `{"baz":100}\n`
* ]);
* ```
*
* @module
*/