writeFileListToStream might hang when the writable stream becomes full (asar does not work between Node 15.2.0 and 15.4.0) #210

Closed
weedz opened this issue Dec 15, 2020 · 9 comments

weedz commented Dec 15, 2020

This problem occurs when using node@15.2.0+ under Linux (Pop!_OS 20.10). Also tested on macOS 11.1 (Node 15.1, 15.2 and 15.4) but no problem there. EDIT: I was able to reproduce this on the Mac setup after all. It might just be that the MacBook is too slow to trigger the problem reliably...

This might be the relevant change in Node: nodejs/node#35348.

This is reproducible (at least on my machine) in my repository https://github.com/weedz/git-good. Running the command

$ npx electron-forge package

finishes when using node@15.1.0, but it hangs when using node@15.2.0 (also tested with version 15.4).

I then changed writeFileListToStream to wait for out.writableNeedDrain to become false. Now electron-forge package finishes when using node@15.2.0, but this could probably be solved in a better way.

+ function sleep(ms) {
+     return new Promise((resolve) => setTimeout(resolve, ms));
+ }

const writeFileListToStream = async function (dest, filesystem, out, list, metadata) {
    for (const file of list) {
+       // Poll until the writable stream's internal buffer has drained
+       while (out.writableNeedDrain) {
+           await sleep(10);
+       }
        if (file.unpack) { // the file should not be packed into archive
            const filename = path.relative(filesystem.src, file.filename)
            await copyFile(`${dest}.unpacked`, filesystem.src, filename)
        } else {
            await streamTransformedFile(file.filename, out, metadata[file.filename].transformed)
        }
    }
    return out.end()
}

weedz commented Dec 16, 2020

Might be a bug in Node?

Documentation for readable.pipe() reads:

... The flow of data will be automatically managed so that the destination Writable stream is not overwhelmed by a faster Readable stream.

Is this still the case when piping multiple Readable streams to the same Writable stream?
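
For context, here is a minimal sketch (my own reduction, not asar's actual code) of the pattern in question: each file is piped into the same Writable with { end: false }, one after another, so the destination stays open between files.

const fs = require('fs')

// Pipe one file into `out` and resolve once the source has been fully read.
function pipeOneFile(filename, out) {
    return new Promise((resolve, reject) => {
        const src = fs.createReadStream(filename)
        src.on('error', reject)
        src.on('end', resolve)
        src.pipe(out, { end: false })
    })
}

// Pipe many files sequentially into the same writable stream.
async function pipeAllFiles(filenames, out) {
    for (const filename of filenames) {
        // pipe() is supposed to handle backpressure; the question is whether
        // that still holds when `out` is already full from the previous file.
        await pipeOneFile(filename, out)
    }
    out.end()
}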

malept commented Dec 16, 2020

@weedz do you mind following up in the Node.js project?

weedz commented Dec 16, 2020

@malept Sure

HomerSp commented Dec 18, 2020

This seems to be happening for me as well when attempting to deploy https://github.com/yakyak/yakyak against Node v15.4.0. It's not exactly the same, I think: at a random point during asar generation it just stops and fails rather than hanging; however, it also won't continue processing files in the for loop.
The workaround fixes it for me, but as you've said, this really should be fixed properly in Node.js. If I run it under strace it also works, most likely because that slows the process down quite a lot, so it definitely seems like the output buffer needs some time to settle.

I've added some log output to asar, and here are the relevant last lines when attempting to deploy yakyak:
writeFileListToStream {
  filename: '/tmp/electron-packager/linux-x64/yakyak-linux-x64/resources/app/node_modules/esprima/dist/esprima.js',
  unpack: false
}
streamTransformedFile /tmp/electron-packager/linux-x64/yakyak-linux-x64/resources/app/node_modules/esprima/dist/esprima.js
streamTransformedFile done
stream.close
stream.data
stream.data
stream.data
stream.data
stream.data
stream.end
writeFileListToStream done
writeFileListToStream {
  filename: '/tmp/electron-packager/linux-x64/yakyak-linux-x64/resources/app/node_modules/esprima/LICENSE.BSD',
  unpack: false
}
streamTransformedFile /tmp/electron-packager/linux-x64/yakyak-linux-x64/resources/app/node_modules/esprima/LICENSE.BSD
streamTransformedFile done
stream.close
[02:04:08] The following tasks did not complete: deploy:linux-x64:nodep
[02:04:08] Did you forget to signal async completion?

weedz commented Dec 18, 2020

@HomerSp I looked into why electron-forge package seems to hang rather than just exit: the CLI runs a spinner in the terminal window with packagerSpinner = ora('Packaging Application').start();. If you comment out this line, the program exits (with code 0 and without any errors) instead of waiting for packagerSpinner.succeed().
I looked at your deploy script, and it seems you call packager programmatically, which of course does not run the spinner and thus just exits when the problem occurs.
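
To illustrate why the spinner makes the difference (a toy example, not electron-forge's actual code): ora's spinner runs on an interval timer, which keeps the Node.js event loop alive even though the packaging promise never settles; without it there is nothing left to keep the loop running once the stream stalls, which would explain why the process simply exits with code 0 even though the await never completed.

const ora = require('ora') // ora v5 (CommonJS)

async function main() {
    // Comment this line out and the process exits with code 0 as soon as the
    // event loop is empty, even though `neverSettles` is still pending.
    const spinner = ora('Packaging Application').start()

    // Stand-in for a packaging step whose promise never resolves (as happens
    // here when the asar write stream stalls).
    const neverSettles = new Promise(() => {})
    await neverSettles

    spinner.succeed() // never reached
}

main()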

I also ran strace on electron-forge package: https://gist.github.com/weedz/c856712a21881d00edfdd33043d89b8d; I think this is the relevant output. I can't see any difference, but I don't really know what to look for.

EDIT: Ran strace on a script which runs packager programmatically, but this worked properly just as you observed; it was a lot slower, so the stream probably had time to drain properly.

HomerSp commented Dec 18, 2020

@weedz I've done some more testing, and I've found that ultimately the problem happens because of this change from the PR above: https://github.com/nodejs/node/pull/35348/files/3dd02b3c3c675d86c6be17b6fdb8e8994a1d65ed#diff-040c1f5a53844e600d40b33c4624f1fe39fcf2f8d62c76ca3fc5ea5442231469
Basically, when the output buffer is full the pipe starts out in a paused state but is never resumed, which causes the problem we're seeing. If I revert just that block of code, it works as expected again.
Another way to get it to work, apart from your while-loop solution above, is to call stream.resume() after the pipe call.
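
For reference, a rough sketch of that second workaround (the function and names here are illustrative, not asar's exact streamTransformedFile):

const fs = require('fs')

function streamFile(filename, out) {
    return new Promise((resolve, reject) => {
        const stream = fs.createReadStream(filename)
        stream.on('error', reject)
        stream.on('end', resolve)
        stream.pipe(out, { end: false })
        // With the regression from nodejs/node#35348, pipe() leaves the source
        // paused when `out` is already full and never resumes it, so force the
        // source back into flowing mode explicitly.
        stream.resume()
    })
}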

Hopefully they can revert that part of the pr for the next release so it'll work as it should again.

HomerSp commented Dec 20, 2020

This issue has now been fixed in nodejs/node#36563

weedz commented Dec 20, 2020

Awesome!

Closing this issue since the bug has been resolved in Node 👍

@weedz weedz closed this as completed Dec 20, 2020
@malept malept changed the title writeFileListToStream might hang when the writable stream becomes full writeFileListToStream might hang when the writable stream becomes full (asar does not work between Node 15.2.0 and 15.4.0) Dec 22, 2020

malept commented Dec 22, 2020

For the record, the affected versions of Node.js were 15.2.0 through 15.4.0 inclusive. The fix mentioned was released in Node.js 15.5.0.
