
pipeline causes script to silently crash #38539

Closed
webdevlocalhost opened this issue May 4, 2021 · 9 comments
Labels
duplicate Issues and PRs that are duplicates of other issues or PRs. stream Issues and PRs related to the stream subsystem.

Comments

@webdevlocalhost

Version: 16
Platform: Linux webdev-g7 5.4.0-72-generic #80-Ubuntu SMP Mon Apr 12 17:35:00 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux

const fs = require('fs')
const { pipeline:pipelineSync } = require('stream')
const { promisify } = require('util')
const pipeline = promisify(pipelineSync)

// I'm targeting v14.14 which doesn't have stream/promises
// const { pipeline } = require('stream/promises')

// The goal with this script was _testing_ whether an append file descriptor
// could be reused in pipeline to append one file and then another, just to see...
// However, the script fails silently with no error.
// Running with --inspect-brk, it looks like it gets stuck in some active hooks loop,
// but I'm not sure... I tested with versions 14.14, 15 and 16.

async function main() {
  debugger
  await fs.promises.writeFile('tmp/a', Buffer.alloc(1024, 0x0d))
  await fs.promises.writeFile('tmp/b', Buffer.alloc(1024, 0x0a))

  try {
    const fh = await fs.promises.open('tmp/c', 'a')
    // Create write stream that doesn't close from "append" file handler
    const output = fs.createWriteStream(null, { fd: fh.fd, autoClose: false })
    // append contents from tmp/a
    const readA = fs.createReadStream('tmp/a')
    await pipeline(readA, output)
    // append contents from tmp/b but script silently fails here
    // console.log('done.') never fires
    // no exceptions get thrown
    const readB = fs.createReadStream('tmp/b')
    await pipeline(readB, output)
    await fh.close()
  } catch (e) {
    console.log(e)
  }

  console.log('done.')
}

main()

How often does it reproduce? Is there a required condition?

Every time the script is run

What is the expected behavior?

I see "done." after it appends the two files.

What do you see instead?

Nothing; it silently crashes.

Additional information

Here is a workaround with notes https://gist.github.com/webdevlocalhost/32822034a003d93f0452dff4d48e73c8

@Linkgoron
Member

Linkgoron commented May 4, 2021

Pipeline internally calls Readable.pipe to pipe the Readable into the Writable, and pipe calls stream.end() after it finishes. You can see this by adding console.log(output.writableFinished); after your first pipeline. I assume that the crash comes from trying to pipe into a closed fd.

In general, pipeline states that it closes all streams given to it (if they are not closed by themselves), and the documentation officially discourages reusing streams that have been used in pipeline.

https://nodejs.org/api/stream.html#stream_stream_pipeline_streams_callback
stream.pipeline() leaves dangling event listeners on the streams after the callback has been invoked. In the case of reuse of streams after failure, this can cause event listener leaks and swallowed errors.

@webdevlocalhost
Author

I understand now that pipeline ends the stream, even though its state is closed: false, destroyed: false, ended: true, finished: true, writing: false

Is a "swallowed error" when a script abruptly ends with no errors?

In my script, an error triggers if output.write() is called after the first pipeline; the error is about writing to a stream that has already ended.

However pipeline by design will "swallow the error" and silently end the script?

@Linkgoron
Member

Linkgoron commented May 4, 2021

It's unrelated to pipeline: you're writing to a closed fd, which causes undefined behaviour and, I assume, errors when printing to the console.

@webdevlocalhost
Author

const fs = require('fs')
const { pipeline:pipelineSync } = require('stream')
const { promisify } = require('util')
const pipeline = promisify(pipelineSync)

async function main() {
  try {
    await fs.promises.writeFile('tmp/a', Buffer.alloc(1024, 0x0d))
    await fs.promises.writeFile('tmp/b', Buffer.alloc(1024, 0x0a))
    const fh = await fs.promises.open('tmp/c', 'a')
    let output = fs.createWriteStream(null, { fd: fh.fd, autoClose: false })
    const readA = fs.createReadStream('tmp/a')
    await pipeline(readA, output)
    await fh.write('foobar')
    // uncomment below for swallowed error?
    // const readB = fs.createReadStream('tmp/b')
    // await pipeline(readB, output)
    await fh.close()
  } catch (e) {
    console.log(e)
  }

  console.log('done.')
}

main()

So with the above script, when the two lines are un-commented, do you get output in your console?

@webdevlocalhost
Author

const fs = require('fs')
const { pipeline:pipelineSync } = require('stream')
const { promisify } = require('util')
const pipeline = promisify(pipelineSync)

// Linux cleanup
// const { execSync } = require('child_process')
// execSync('rm -rf tmp')
// execSync('mkdir tmp')

async function main() {
  try {
    await fs.promises.writeFile('tmp/a', Buffer.alloc(1024, 0x0d))
    await fs.promises.writeFile('tmp/b', Buffer.alloc(1024, 0x0a))
    const fh = await fs.promises.open('tmp/c', 'a')
    let output = fs.createWriteStream(null, { fd: fh.fd, autoClose: false })
    const readA = fs.createReadStream('tmp/a')
    await pipeline(readA, output)
    const readB = fs.createReadStream('tmp/b')
    await fh.write('foobar')

    // Stream .write() error example:
    // Calling .write() on the WriteStream after the 'finish' event fires
    // throws ERR_STREAM_WRITE_AFTER_END, which makes perfect sense:
    // the fd is still open but the Writable's internal state is done.
    // output.write('test')

    // Recreating the WriteStream using the same fd works just fine:
    // same unclosed fd, but a new write stream where 'finish' hasn't fired yet.
    // This makes sense to me and the code works fine with it.
    // output = fs.createWriteStream(null, { fd: fh.fd, autoClose: false })

    // Leaving the above lines commented silently ends the script with no
    // errors or exceptions.

    // The docs talk about errors getting swallowed with pipeline, but the
    // writing doesn't include examples and wasn't clear to me.
    // Not being able to throw an error when there's a write stream that's
    // not closed but isn't writable seems off.

    // Would it be possible to have pipeline behave like .write(),
    // where an ERR_STREAM_WRITE_AFTER_END gets thrown?

    await pipeline(readB, output)
    await fh.close()
  } catch (e) {
    console.log(e)
  }

  console.log('done.')
}

main()

@webdevlocalhost
Author

In previous versions of Node, the second pipeline call fails as expected.

In Node versions 14 and up, with the above code the error gets swallowed and the script exits abruptly at the second pipeline call.

Interestingly, in Node version 13 the error changes from write-after-end to bad file descriptor.

EXAMPLE USING NVM:

nvm install 10 && nvm use 10 && node temp.js
v10.24.1 is already installed.
Now using node v10.24.1 (npm v6.14.12)
Now using node v10.24.1 (npm v6.14.12)
Error [ERR_STREAM_WRITE_AFTER_END]: write after end
at writeAfterEnd (_stream_writable.js:248:12)
at WriteStream.Writable.write (_stream_writable.js:296:5)
at ReadStream.ondata (_stream_readable.js:710:20)
at ReadStream.emit (events.js:198:13)
at addChunk (_stream_readable.js:288:12)
at readableAddChunk (_stream_readable.js:269:11)
at ReadStream.Readable.push (_stream_readable.js:224:10)
at lazyFs.read (internal/fs/streams.js:181:12)
at FSReqWrap.wrapper [as oncomplete] (fs.js:467:17)
done.

nvm install 11 && nvm use 11 && node temp.js
v11.15.0 is already installed.
Now using node v11.15.0 (npm v6.7.0)
Now using node v11.15.0 (npm v6.7.0)
Error [ERR_STREAM_WRITE_AFTER_END]: write after end
at writeAfterEnd (_stream_writable.js:248:14)
at WriteStream.Writable.write (_stream_writable.js:296:5)
at ReadStream.ondata (_stream_readable.js:705:22)
at ReadStream.emit (events.js:193:13)
at addChunk (_stream_readable.js:295:12)
at readableAddChunk (_stream_readable.js:276:11)
at ReadStream.Readable.push (_stream_readable.js:231:10)
at fs.read (internal/fs/streams.js:183:12)
at FSReqCallback.wrapper [as oncomplete] (fs.js:478:5)
done.

nvm install 12 && nvm use 12 && node temp.js
v12.22.1 is already installed.
Now using node v12.22.1 (npm v6.14.12)
Now using node v12.22.1 (npm v6.14.12)
Error [ERR_STREAM_WRITE_AFTER_END]: write after end
at writeAfterEnd (_stream_writable.js:266:14)
at WriteStream.Writable.write (_stream_writable.js:315:5)
at ReadStream.ondata (_stream_readable.js:718:22)
at ReadStream.emit (events.js:314:20)
at addChunk (_stream_readable.js:297:12)
at readableAddChunk (_stream_readable.js:272:9)
at ReadStream.Readable.push (_stream_readable.js:213:10)
at internal/fs/streams.js:215:14
at FSReqCallback.wrapper [as oncomplete] (fs.js:520:5) {
  code: 'ERR_STREAM_WRITE_AFTER_END'
}
done.

nvm install 13 && nvm use 13 && node temp.js
v13.14.0 is already installed.
Now using node v13.14.0 (npm v6.14.4)
Now using node v13.14.0 (npm v6.14.4)
[Error: EBADF: bad file descriptor, write] {
  errno: -9,
  code: 'EBADF',
  syscall: 'write'
}
done.

@Linkgoron
Member

Linkgoron commented May 5, 2021

Since Node 14, destroyed streams no longer emit errors (#29197); that's why you saw errors up to Node 13.

What you're seeing is a bug in pipeline; it is essentially a duplicate of #36674 and will be fixed when #36791 is merged (hopefully soon).

Regarding Node shutting down: I retract my statement about closed FDs; it was totally wrong. It looks like there really is no crash. The event loop is empty and Node just exits, similar to the following:

async function func() {
  // this promise never settles, so the await never resumes
  const promise = new Promise(() => {});
  await promise;
  console.log('bye'); // never reached; the event loop drains and Node exits
}

func();

@Linkgoron Linkgoron added duplicate Issues and PRs that are duplicates of other issues or PRs. stream Issues and PRs related to the stream subsystem. labels May 6, 2021
@benjamingr
Member

Should be fixed in master 🎉

@etiennebatise

I arrive here after the battle. I encountered a very similar issue in early 2021, and while refactoring some code today I found this problem again. Following @benjamingr's last comment, I upgraded to Node 18.2 to run the code, and I can still trigger the Node exit. Any chance the bug has not been fixed?
