stream.pipeline destroys writable stream when an error occurs #26311
The whole point of pipeline is to call destroy(). If you don’t want this behavior you can use plain old source.pipe().
hm. I would like to use Promise and Stream seamlessly, and stream.pipeline seems to be helpful for my situation, per the API doc:

```js
const pipeline = util.promisify(stream.pipeline);

async function run() {
  await pipeline(
    fs.createReadStream('archive.tar'),
    zlib.createGzip(),
    fs.createWriteStream('archive.tar.gz')
  );
  console.log('Pipeline succeeded.');
}
```

Anyway, I understood that the destroy function does not receive the error object. I have tried to create a custom writable stream:

```js
const fs = require('fs')
const http = require('http')
const { pipeline, Writable } = require('stream')

class ResponseWriter extends Writable {
  constructor(options) {
    super(options);
    this.res = options.res;
  }
  _write(chunk, encoding, callback) {
    this.res.write(chunk);
    this.data += chunk;
    callback();
  }
  _final(callback) {
    this.data += this.res.end();
    callback();
  }
  _destroy(err, callback) {
    console.log("destroy", err); // does not show error.
    this.res.end("ERROR!! " + err);
    this.res.destroy(err);
    callback(err);
  }
}

const server = http.createServer((req, res) => {
  const r = fs.createReadStream('./foo2.txt')
  const w = new ResponseWriter({
    res,
  });
  pipeline(r, w, (err) => {
    console.log(err) // No such file
  });
})
server.listen(9000)
```

But this solution does not work well, because in the Node.js core:

```js
if (typeof stream.destroy === 'function') return stream.destroy(); // destroy(err) ??
```

Is this expected behavior? If not, I will send a PR.
This is expected behavior. I think the source of these problems is not pipeline or pipe, but rather the implementation of the underlying streams.
I agree with that, but my opinion is different.
https://nodejs.org/api/stream.html#stream_stream_finished_stream_options_callback
I know about stream.finished.
I'm willing to discuss adding new features/options to pipeline.
@yosuke-furukawa
@mcollina what do you think? Does it make any sense to add this trick (with PassThrough) to the documentation? This problem appears quite often.
I don't understand how the above trick helps.
a changed code sample from @yosuke-furukawa
and we run the server and call:
the server output:
So, I achieved the expected behavior in a simple way without changing the interface. Do you see my point now?
I still don't understand why adding a passthrough in the middle is going to fix this problem. The correct fix for this issue is to wait until the file is correctly opened, and then start piping.
Yep, but in the case of an HTTP response you don't want to destroy it.
I just changed the code from @yosuke-furukawa
It's not about this particular case. This case is just simple enough to show the concept. There are a lot of cases where you need to pipe some read stream through a transformation stream and into the response (e.g., check my image resizing sample). And you need a way to handle ALL possible errors that can appear. It was probably the reason why the original pump from @mafintosh returned the last stream in the end.
I am in favour of adding options to pipeline.
We'll be happy to evaluate that change, send it over. |
What about:

```js
const r = fs.createReadStream('./foo2.txt')

await finished(pipeline(r, transform, (err) => {})
  .on('error', err => {
    console.log(err) // No such file
    res.end("error!!!"); // write error response but this message is not shown.
  })
  .pipe(res)
)
```
Got bit by this today as it killed some connections eagerly rather than allowing a 500 to be returned.
With the resolution of #34133, is this issue still a thing?
No, this can be closed.
There were some disagreements on this feature and no consensus; it looks like that was the reason #34133 was closed. Otherwise, I can re-create that PR with some adjustments.
It seems that PR was closed due to:
Having a fresh PR would help; I think we can get this landed. Wdyt @ronag?
I agree.
If there are any good ideas, maybe I can reopen this PR.
There has been no activity on this feature request for 5 months and it is unlikely to be implemented. It will be closed 6 months after the last non-automated comment. For more information on how the project manages feature requests, please consult the feature request management document.
Closed by #41796
For others like me looking for a workaround for this:

```ts
const makeIndestructible = (stream: NodeJS.WritableStream) => {
  return new Writable({ write: stream.write.bind(stream) })
}

const source = createReadStream(filePath)
await pipeline(source, makeIndestructible(sink))
```

Although the wrapping
Version: v11.10.0
Platform: Mac OS Darwin 16.7.0
Subsystem: stream, http

stream.pipeline is helpful for handling errors and is interoperable with Promises.
However, I found a behavior that is not suitable for my use case.
I am creating a web server with stream.pipeline.
If my readable stream emits an error like "file not found", I would like to send an error response to my clients. A code example is as follows.
I have investigated the Node.js core: stream.pipeline destroys the writable stream when an error occurs.
https://github.com/nodejs/node/blob/master/lib/internal/streams/pipeline.js#L42
So the above code cannot send an error response.
Question
Is this expected behaviour?
In my understanding, the writable stream should be closed anyway, but in this case we would like to close the writable stream manually. In this situation, do we need to create a custom writable stream?
I could send a PR to pass an option like { destroyOnError: false } to avoid destroying automatically.