@@asyncIterator doesn't seem to work on large directories #97
Comments
What about a simple stream? Also try #92.
I've manually applied #92, yet the problem persists. The simple stream works as expected, though.
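For reference, a minimal sketch of the stream-based usage discussed above, assuming readdirp v3's object-mode stream interface; the snippet is illustrative rather than taken from the thread:

```js
const readdirp = require('readdirp');

// Consume entries through the readable-stream interface instead of for-await.
let count = 0;
readdirp('.')
  .on('data', entry => console.log(count++, entry.path))
  .on('error', err => console.error(err))
  .on('end', () => console.log('done, total entries:', count));
```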
Try
No luck either.
Hmm... I'm also having the same error. The promise and stream methods work correctly and return around 100k entries, but the async iterator yields just 134. With:

```js
const readdirp = require('readdirp');

(async () => {
  process.on('beforeExit', () => console.log('beforeExit'));
  process.on('exit', () => console.log('exit'));
  let i = 0;
  for await (const entry of readdirp('.')) {
    console.log(i++, entry.path);
  }
})()
  .then(() => console.log('end'))
  .catch(err => console.error(err));
```

Environment: Node.js v10.13.0 and v12.6.0, without any transpilation, on Debian 9.4.

Update: I think it's really a bug in Node.js, because the following snippet causes similar behavior with an unexpected exit:

```js
(async () => {
  const iterator = require('fs').createReadStream('package-lock.json').setEncoding('utf-8')[Symbol.asyncIterator]();
  let counter = 0;
  // Pull from the same async iterator 200 times, up to 10 calls in flight at once.
  const content = await require('p-map')(Array.from({ length: 200 }), async () => {
    await iterator.next();
    console.log('iteration', counter + 1);
    return counter++;
  }, { concurrency: 10 });
})().catch(err => console.error(err));
```

Yes, that's a slightly different situation with parallel iterator execution, but the effect looks alike.
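For contrast, here is a minimal sketch (not from the original comment) that consumes the same stream's async iterator sequentially with `for await`, the intended single-consumer pattern; it assumes a `package-lock.json` exists in the working directory:

```js
const fs = require('fs');

(async () => {
  // Sequential consumption: only one pending next() call at a time.
  let bytes = 0;
  for await (const chunk of fs.createReadStream('package-lock.json', { encoding: 'utf-8' })) {
    bytes += chunk.length;
  }
  console.log('read', bytes, 'characters');
})().catch(err => console.error(err));
```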
The first 471 entries are equal to the ones returned by `readdirp.promise`. No unhandled rejections, no exceptions, exit code 0, not out of memory. I am running on macOS; here is the dir.
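For context, a minimal sketch of the comparison described above: collecting the full listing with `readdirp.promise` and counting what the async iterator yields before the process exits. This assumes readdirp v3's API and is illustrative, not taken from the report:

```js
const readdirp = require('readdirp');

(async () => {
  // Full listing via the promise API.
  const all = await readdirp.promise('.');

  // Count what the async iterator actually yields before the process exits.
  let iterated = 0;
  for await (const entry of readdirp('.')) {
    iterated++;
  }

  console.log('promise API:', all.length, 'entries; async iterator:', iterated, 'entries');
})().catch(err => console.error(err));
```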