@@asyncIterator doesn't seem to work on large directories #97

Open
shvaikalesh opened this issue Jun 21, 2019 · 5 comments

@shvaikalesh (Contributor) commented Jun 21, 2019

"use strict"
const readdirp = require("readdirp")
const dir = "LayoutTests"

;(async () => {
  let promise = await readdirp.promise(dir)
  console.log(promise.length) // 196246

  let count = 0
  for await (let entry of readdirp(dir)) {
    console.log(entry.path, ++count) // goes up to 471 (weird, right?)
  }

  console.log("unreachable") // process exits before this line
})()

The first 471 entries are equal to the ones returned by readdirp.promise.
No unhandled rejections, no exceptions, exit code 0, and not out of memory.
I am running on macOS; here is the dir.

@paulmillr (Owner)

What about a simple stream? Also try #92.
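For reference, a minimal stream-based check of the same directory could look like the sketch below (readdirp returns a Readable stream, so the standard 'data'/'error'/'end' events apply; the counting logic is illustrative only):

const readdirp = require('readdirp');

let count = 0;
readdirp('LayoutTests')
  .on('data', () => count++)                     // one event per discovered entry
  .on('error', err => console.error('fatal', err))
  .on('end', () => console.log('entries seen by the stream:', count));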

@shvaikalesh (Contributor, Author)

I've manually applied #92, yet the problem persists. A simple stream works as expected, though.

@paulmillr (Owner)

Try alwaysStat: true.
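For example, a sketch of the suggested option, reusing the loop from the original report (the counting is illustrative; alwaysStat: true makes readdirp stat each entry so entry.stats is populated):

const readdirp = require('readdirp');

(async () => {
  let count = 0;
  for await (const entry of readdirp('LayoutTests', { alwaysStat: true })) {
    count++; // entry.stats should be available here with alwaysStat enabled
  }
  console.log('entries seen with alwaysStat:', count);
})();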

@shvaikalesh (Contributor, Author)

No luck either.

@CheerlessCloud commented Jul 22, 2019

Hmm... I'm also hitting the same error. The promise and stream methods work correctly and pass around 100k entries, but the async iterator passes just 134. With alwaysStat: true it passes 127 entries! And for some reason, console.log('end') is never called; only the exit and beforeExit handlers fire. Yep, that's very strange 🤷‍♂️

const readdirp = require('readdirp');

(async () => {
    process.on('beforeExit', () => console.log('beforeExit'));
    process.on('exit', () => console.log('exit'));

    let i = 0;
    for await (const entry of readdirp('.')) {
        console.log(i++, entry.path);
    }
})()
    .then(() => console.log('end'))
    .catch(err => console.error(err));

Environment: NodeJS v10.13.0 and v12.6.0 without any transpilation on Debian 9.4.
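For comparison, the promise-based count mentioned above can be reproduced with a minimal sketch using the readdirp.promise API from the original report (the directory and log message are illustrative):

const readdirp = require('readdirp');

(async () => {
  const entries = await readdirp.promise('.');
  console.log('entries via promise API:', entries.length); // around 100k here
})();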

UPD: I think it's really a bug in Node.js, because the next code snippet causes similar behavior with an unexpected exit:

const pMap = require('p-map');

(async () => {
  const iterator = require('fs').createReadStream('package-lock.json').setEncoding('utf-8')[Symbol.asyncIterator]();
  let counter = 0;
  // 200 concurrent next() calls against the same async iterator
  const content = await pMap(Array.from({ length: 200 }), async () => {
    await iterator.next();
    console.log('iteration', counter + 1);
    return counter++;
  }, { concurrency: 10 });
})();

Yes, that's a slightly different situation, with parallel execution of the iterator, but the effect looks similar.
