How to use pipeline inside an infinite while loop #45861
|
@mscdex, thank you for the quick response; your solution works like a charm. Thank you once again 👏🏻 I have a follow-up question here:

```js
import fetch from 'node-fetch';
import { Readable, pipeline, Transform } from 'stream';
import exceljs from 'exceljs';
import util from 'util';

const cpipeline = util.promisify(pipeline);

class ExcelStream extends Transform {
  constructor(options = {}) {
    super({ ...options, objectMode: true });
    this.i = 0;
    const o = {
      filename: "./test-file.xlsx",
      useStyles: true,
      useSharedStrings: true,
    };
    this.workbook = new exceljs.stream.xlsx.WorkbookWriter(o);
    this.workbook.creator = "ABCDEF";
    this.workbook.created = new Date();
    this.sheet = this.workbook.addWorksheet("Discussion Event");
    this.sheet.columns = [
      {
        header: "testColumn",
        width: 10,
        style: {
          font: {
            bold: true,
          },
        },
      },
    ];
  }
  async _transform(chunk, encoding, done) {
    try {
      this.i += 1;
      this.sheet.addRow([chunk]).commit();
      done(); // the callback must be invoked to accept the next chunk
    } catch (error) {
      console.log("Error while transforming data - commit failed");
      done(error);
    }
  }
  async _flush(cb) {
    try {
      this.sheet.commit();
      await this.workbook.commit();
      cb();
    } catch (error) {
      console.log("Error cannot commit workbook");
      cb(error);
    }
  }
}

const ss = new ExcelStream();
let i = 0;
while (true) {
  const result = await (
    await fetch(URL, {
      method: "POST",
      body: JSON.stringify(data),
      headers: {
        "Content-Type": "application/json",
        Connection: "keep-Alive",
      },
    })
  ).json();
  await cpipeline(
    Readable.from(result.data),
    async function* transform(src) {
      for await (const chunk of src) {
        yield JSON.stringify(chunk);
      }
    },
    ss
  );
  i += 1;
  if (i == 500) break;
}
```

I get the same warning as mentioned in the original issue. How to fix it? |
AFAIK there is currently no way to prevent `pipeline()` from ending the destination stream. |
After removing the `pipeline()` call and piping with `{ end: false }` instead:

```js
import fetch from 'node-fetch';
import { Readable, pipeline, Transform } from 'stream';
import exceljs from 'exceljs';
import util from 'util';

const cpipeline = util.promisify(pipeline);

class ExcelStream extends Transform {
  constructor(options = {}) {
    super({ ...options, objectMode: true });
    this.i = 0;
    const o = {
      filename: "./test-file.xlsx",
      useStyles: true,
      useSharedStrings: true,
    };
    this.workbook = new exceljs.stream.xlsx.WorkbookWriter(o);
    this.workbook.creator = "ABCDEF";
    this.workbook.created = new Date();
    this.sheet = this.workbook.addWorksheet("Discussion Event");
    this.sheet.columns = [
      {
        header: "testColumn",
        width: 10,
        style: {
          font: {
            bold: true,
          },
        },
      },
    ];
  }
  async _transform(chunk, encoding, done) {
    try {
      this.i += 1;
      this.sheet.addRow([chunk]).commit();
      done();
    } catch (error) {
      console.log("Error while transforming data - commit failed");
      done(error);
    }
  }
  async _flush(cb) {
    try {
      this.sheet.commit();
      await this.workbook.commit();
      cb();
    } catch (error) {
      console.log("Error cannot commit workbook");
      cb(error);
    }
  }
}

const ss = new ExcelStream();
let i = 0;
while (true) {
  const result = await (
    await fetch(URL, {
      method: "POST",
      body: JSON.stringify(data),
      headers: {
        "Content-Type": "application/json",
        Connection: "keep-Alive",
      },
    })
  ).json();
  const r = Readable.from(result.data);
  r.pipe(ss, { end: false });
  if (i === 5) {
    r.destroy(); // why destroy? why can't r.emit('close')
    break;
  }
  i += 1;
}
ss.end();
```
|
You can use:

```js
import { pipeline } from 'node:stream/promises';
await pipeline(stream1, stream2, { end: false });
```
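For instance, when looping many sources into one destination (a sketch; `sources` and `dest` are stand-ins for the per-iteration readables and the shared writable):

```js
import { pipeline } from 'node:stream/promises';

for (const src of sources) {
  // { end: false } keeps `dest` open after each pipeline completes
  await pipeline(src, dest, { end: false });
}
dest.end(); // close the destination yourself once all sources are done
```
|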
@aduh95 It seems that option is undocumented? |
Hey @aduh95, thank you for the reply. Would it be possible to answer my follow-up query in the comment above? If you're suggesting this:

```js
import { pipeline } from 'node:stream/promises';
await pipeline(stream1, stream2, { end: false });
```

then I'm getting an error in the second iteration of the while loop, and the program terminates:

```
Error [ERR_STREAM_WRITE_AFTER_END]: write after end
    at new NodeError (node:internal/errors:371:5)
    at _write (node:internal/streams/writable:319:11)
    at ExcelStream.Writable.write (node:internal/streams/writable:334:10)
    at pump (node:internal/streams/pipeline:150:21)
    at processTicksAndRejections (node:internal/process/task_queues:96:5) {
  code: 'ERR_STREAM_WRITE_AFTER_END'
}
```
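For what it's worth, one quick check (a debugging sketch, not a fix; `ss` is the destination from the snippet above):

```js
// writableEnded turns true once end() has been called on a Writable;
// logging it before each iteration shows whether `ss` was closed early.
console.log(ss.writableEnded);
```
|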
Could you provide a repository to reproduce the problem? Also include the server. Something you could also do:

```js
import fetch from "node-fetch";
import fs from "fs";
import { Readable } from "stream";
import { once } from "events";

// createWriteStream is a factory function, not a constructor
const ss = fs.createWriteStream("output.txt");

let i = 0;
while (true) {
  const result = await (
    await fetch(URL, {
      method: "POST",
      body: JSON.stringify(data),
      headers: {
        "Content-Type": "application/json",
        Connection: "keep-Alive",
      },
    })
  ).json();
  for await (const chunk of Readable.from(result.data)) {
    const toWrite = JSON.stringify(chunk);
    if (!ss.write(toWrite)) {
      // respect backpressure: wait for 'drain' before writing more
      await once(ss, 'drain');
    }
  }
  i += 1;
  if (i == 500) break;
}
```
|
Hey @mcollina, thank you for the reply! Here is the repo link and please go through the |
Here is my solution to this problem:

```js
import fetch from 'node-fetch';
import { pipeline, Writable } from 'stream';
import exceljs from 'exceljs';
import util from 'util';

const URL = 'https://jsonplaceholder.typicode.com/posts/';
const cpipeline = util.promisify(pipeline);

class ExcelStream extends Writable {
  constructor(options = {}) {
    super({ ...options, objectMode: true });
    this.i = 0;
    const o = {
      filename: "./test-file.xlsx",
      useStyles: true,
      useSharedStrings: true,
    };
    this.workbook = new exceljs.stream.xlsx.WorkbookWriter(o);
    this.workbook.creator = "ABCDEF";
    this.workbook.created = new Date();
    this.sheet = this.workbook.addWorksheet("Discussion Event");
    this.sheet.columns = [
      {
        header: "testColumn",
        width: 10,
        style: {
          font: {
            bold: true,
          },
        },
      },
    ];
  }
  _write(chunk, encoding, done) {
    console.log(chunk);
    try {
      this.i += 1;
      this.sheet.addRow([chunk]).commit();
    } catch (error) {
      console.log("Error while transforming data - commit failed");
      done(error);
      return;
    }
    done();
  }
  // Writable calls _final (not _flush) before emitting 'finish'
  _final(cb) {
    try {
      this.sheet.commit();
      this.workbook.commit().then(() => cb(), cb);
    } catch (error) {
      console.log("Error cannot commit workbook");
      cb(error);
    }
  }
}

async function run() {
  let i = 1;
  await cpipeline(async function* () {
    while (true) {
      let u = URL + i;
      const response = await fetch(u);
      const result = await response.json();
      if (result) {
        yield result;
      } else {
        break;
      }
      i++;
      if (i == 20) break;
    }
  }, new ExcelStream());
}

run();
```
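In essence: a single pipeline whose source is an async generator that does the looping, so the destination is ended exactly once. A minimal sketch of the same shape with `node:stream/promises` (`fetchPage` is a hypothetical stand-in for the per-page request):

```js
import { pipeline } from 'node:stream/promises';

async function* pages() {
  for (let i = 1; i < 20; i++) {
    yield await fetchPage(i); // hypothetical paginated fetch
  }
}

// One pipeline, one destination, ended once when the generator finishes.
await pipeline(pages, new ExcelStream());
```
|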
A note for future readers: fetch's response body is a readable stream. To pipe several responses into the same write stream declared outside the loop's block scope, you can do:

```js
import fs from "node:fs";
import { pipeline } from "node:stream/promises";

const writeStream = fs.createWriteStream("output.txt");
while (true) {
  const response = await fetch(url, { /* ... */ });
  await pipeline(response.body, writeStream, { end: false });
}
```
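Since `{ end: false }` leaves the write stream open, remember to close it yourself once the loop is done, e.g. (a sketch; the loop bound is illustrative):

```js
import fs from "node:fs";
import { pipeline } from "node:stream/promises";

const writeStream = fs.createWriteStream("output.txt");
for (let page = 1; page <= 500; page++) {
  const response = await fetch(url + page);
  await pipeline(response.body, writeStream, { end: false });
}
writeStream.end(); // nothing else ends the stream when { end: false } is used
```
|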
Hi there, I'm new to the stream API. I'm finding it hard to understand what is wrong with my code and what the exact fix for my problem could be. Suppose there is an API with 500 pages. I need to stream data from that API (all 500 pages) to a destination. Below is a code example:
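(A condensed sketch of the pattern; `fetchPage` stands in for the paginated API call, and `ExcelStream` is the objectMode writable shown in the comments above.)

```js
import { Readable, pipeline } from 'stream';
import util from 'util';
const cpipeline = util.promisify(pipeline);

const ss = new ExcelStream(); // one shared destination for every page
for (let i = 0; i < 500; i++) {
  const result = await fetchPage(i); // hypothetical per-page request
  // A brand-new pipeline each iteration, all ending in the same `ss`:
  await cpipeline(Readable.from(result.data), ss);
}
```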
When I run the above code, I get the following warning:

```
(node:9561) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 close listeners added to [Stream]. Use emitter.setMaxListeners() to increase limit
```
As per my understanding, `ss` is ended after the first stream of data, so in the second iteration, when i = 1, `ss` is already closed, and it throws an error. Not sure how to keep it open, or how to open `ss` again for writing.